Click on the name of the event source, and then click **Associate with event bus** and follow the prompts to associate the event source with an event bus. After the event source is associated with an event bus, Snyk can immediately start sending events, which you can use for any actions supported by EventBridge.
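If you prefer scripting to the console, the same association can be done with the AWS CLI. This is an illustrative sketch; replace the placeholder event source name with the exact name shown in your Snyk integration settings:

```bash
# List pending partner event sources shared with your account
aws events list-event-sources --name-prefix "aws.partner/snyk.io"

# Associate the source by creating an event bus with the matching name
aws events create-event-bus \
  --name "aws.partner/snyk.io/<org-id>/<integration-name>" \
  --event-source-name "aws.partner/snyk.io/<org-id>/<integration-name>"
```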
## Managing and deleting an EventBridge integration
Navigate to the [EventBridge integration settings page](https://app.snyk.io/manage/integrations/aws-eventbridge) in the Snyk dashboard and click on the name of the integration you want to manage.
Select Amazon EventBridge integration
Clicking on the name of the integration opens the integration settings page, which displays configuration information for the integration.
{% hint style="info" %}
Because EventBridge integrations create an external resource that depends on the configured AWS Account ID, Region, and event type, it is not possible to edit these configuration fields. If you need to change one of these fields, delete the integration and create a new one. This deletes the existing **partner event source** in AWS and creates a new one, which you will need to associate with an **event bus** as described above.
{% endhint %}
To delete an integration, scroll to the bottom of the page and click the **Remove integration** button, then confirm the deletion.
Remove integration
This deletes the integration configuration on the Snyk side and the **Partner Event Source** associated with this integration in AWS. You can verify that the event source has been deleted in the EventBridge console.
## Understanding event data
### Snyk issue events
This event type includes core data about Snyk issues, including:
* Vulnerability type and CVE identifiers
* Issue severity
* Whether a remediation is available
Events are JSON formatted using the [Open Cybersecurity Schema Framework *finding*](https://schema.ocsf.io/1.0.0-rc.2/classes/security_finding?extensions=) schema.
{% hint style="info" %}
Not all Snyk issue data is included in these events, though Snyk is continually working to provide more complete event data.
{% endhint %}
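As an illustration, a downstream consumer (for example, a Lambda function attached to the event bus) might extract those fields from the event body. The payload below is a hypothetical, heavily abbreviated finding; real events follow the full OCSF schema and carry many more fields:

```python
import json

# Hypothetical, abbreviated OCSF-style "security_finding" body;
# real Snyk events contain many more fields.
event = json.loads("""
{
  "class_name": "Security Finding",
  "severity": "High",
  "severity_id": 4,
  "vulnerabilities": [
    {"cve": {"uid": "CVE-2023-12345"}, "is_fixable": true}
  ]
}
""")

# Pull out the fields called out above: severity, CVE ids, fixability.
severity = event["severity"]
cves = [v["cve"]["uid"] for v in event.get("vulnerabilities", [])]
fixable = any(v.get("is_fixable") for v in event.get("vulnerabilities", []))
print(severity, cves, fixable)
```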
### Snyk audit events
This event type is available with Snyk Enterprise plans. See [Plans and pricing](https://snyk.io/plans/) for details.
---
# Source: https://docs.snyk.io/integrations/snyk-studio-agentic-integrations/quickstart-guides-for-snyk-studio/amazon-q-guide.md
# Amazon Q guide
You can access Snyk Studio, including Snyk's MCP server, in Amazon Q to secure code generated with agentic workflows through an LLM. This can be achieved in several ways. When you use it for the first time, the MCP server will ask for trust and trigger authentication if necessary.
## Prerequisites
* [Install the code assistant extension](#install-amazon-q)
* [Install the Snyk CLI](https://docs.snyk.io/developer-tools/snyk-cli/install-or-update-the-snyk-cli)
* [Install the Snyk MCP](#install-the-snyk-mcp-server-in-the-amazon-q-ide-extension)
### Install Amazon Q
Add the Amazon Q extension to your IDE. For more details, see the official [Installing the Amazon Q Developer extension or plugin in your IDE](https://docs.aws.amazon.com/amazonq/latest/qdeveloper-ug/q-in-IDE-setup.html).
### Install the Snyk MCP Server in the Amazon Q IDE extension
You can configure the Snyk MCP Server in VS Code and JetBrains for Amazon Q.
Using the AmazonQ extension UI in your chosen IDE:
1. Add a new MCP Server.
2. Provide the following values in the specified fields:
   1. Command: `npx`
   2. Arguments: `-y snyk@latest mcp -t stdio -o=ostemp`
   3. Timeout: `0`
The `-o` option instructs the MCP server to write the scan results to a file. To direct the results to a specific folder, provide an absolute path instead of `ostemp`, for example, `-o=/absolute/path/to/folder`.
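Depending on the IDE, you can also declare the server directly in Amazon Q's MCP configuration file instead of using the UI. The entry below is a sketch; the file location and exact schema can vary by IDE and version, so verify against the Amazon Q MCP documentation:

{% code title="mcp.json" overflow="wrap" %}
```json
{
  "mcpServers": {
    "Snyk": {
      "command": "npx",
      "args": ["-y", "snyk@latest", "mcp", "-t", "stdio", "-o=ostemp"],
      "timeout": 0
    }
  }
}
```
{% endcode %}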
{% hint style="info" %}
For additional MCP configuration options on Amazon Q and troubleshooting, visit the official [Amazon Q MCP server documentation](https://docs.aws.amazon.com/amazonq/latest/qdeveloper-ug/qdev-mcp.html).
{% endhint %}
## Setting up the Snyk MCP Server
As a one-time setup, you may need to authenticate with Snyk and trust the current Project directory; in most cases, the agentic workflow handles this for you.
These workflows are executed largely automatically by the underlying model and the agentic code assistant, and you approve them through a browser confirmation dialog. The process should look similar to this:
If you need to authenticate and trust the current directory, then proceed and complete the process.
## Examples
### Scanning for security vulnerabilities
In the chat, you can tag specific files or functions and ask Amazon Q to scan your code before you deploy to production:
{% code title="prompt" overflow="wrap" %}
```
Scan this directory for code security & dependency vulnerabilities and security issues
```
{% endcode %}
Amazon Q indicates that this request is related to security vulnerability scanning and calls the Snyk MCP Server for various scans.
Amazon Q then gathers all the security scanning results, both code (SAST) and dependencies (SCA). Based on the recommendations and contextual information provided by the Snyk MCP Server, it can propose code changes that address some or all of the security vulnerabilities that were found.
### "Secure at inception" rules
To practice [Secure at Inception](https://docs.snyk.io/discover-snyk/getting-started/glossary#secure-at-inception), Snyk recommends adopting rules that align the LLM for secure code generation and workflows.
The following is a suggested instruction for creating project rules for use with [Amazon Q Developer chat](https://docs.aws.amazon.com/amazonq/latest/qdeveloper-ug/context-project-rules.html). Append the following contents to the rules file at `.amazonq/rules/amazonq-rules.md`:
{% code title=".amazonq/rules/amazonq-rules.md" overflow="wrap" %}
```
- Always run **snyk_code_scan** tool for new first party code that is generated in a Snyk-supported language.
- If any security issues are found based on newly introduced or modified code or dependencies, attempt to fix the issues using the results context from **Snyk**.
- Rescan the code after fixing the issues to ensure that the issues were fixed and that there are no newly introduced issues.
- Repeat this process until no new issues are found.
```
{% endcode %}
---
# Source: https://docs.snyk.io/developer-tools/snyk-ide-plugins-and-extensions/visual-studio-code-extension/view-analysis-results-from-visual-studio-code-extension/analysis-results-snyk-code.md
# Analysis results: Snyk Code
Snyk Code analysis shows security vulnerabilities and quality issues in your code with every scan.
{% hint style="info" %}
Effective beginning on June 24, 2025, Snyk Code Quality issues will no longer be provided.
{% endhint %}
## Snyk Code vulnerability window
Snyk Code vulnerability window
The Snyk suggestion panel on the right of the results screen shows the Snyk Code Vulnerability name, the line it was found in, a suggestion for a fix, and an option to ignore, either in the entire file or a specific line.
On the **Problems** tab of the Visual Studio Code results screen, you can see all Code issues found in your Project.
Visual Studio Code Problems tab
Snyk also includes a feedback mechanism (bottom left of the suggestion panel) for reporting false positives so others do not see the same issue.
## Snyk Code editor window
The vulnerabilities are visible within the editor, with the detailed information available on hover.
Snyk Code editor window
Choose **Quick Fix** to open the details panel for an issue using Code Action.
You can also choose to ignore a suggestion, either a particular one or a recurring one in the current file, using **Quick Fix**.
Quick Fix menu
Ignore options with issue detail
---
# Source: https://docs.snyk.io/developer-tools/snyk-ide-plugins-and-extensions/visual-studio-code-extension/view-analysis-results-from-visual-studio-code-extension/analysis-results-snyk-iac-configuration.md
# Analysis results: Snyk IaC configuration
Snyk IaC configuration analysis shows issues in your Terraform, Kubernetes, AWS CloudFormation, and Azure Resource Manager (ARM) code with every scan. Based on the Snyk CLI, the scan is fast and friendly for local development. The scan runs in the background and is enabled by default.
## Snyk IaC configuration issues window
The configuration issues window shows information about issues. By clicking on an issue, you can learn more about it:
Snyk IaC configuration issues window
The following information is shown:
* Issue description
* Issue impact
* Issue path
* Remediation details
* Links to references
In the **Problems** tab of the Visual Studio Code configuration issues screen, you can see all configuration issues found in your Project.
Problems tab
## Snyk IaC configuration editor window
The issues are visible within the editor, with the detailed information available on hover.
Snyk IaC configuration issue
Choose **Quick Fix** to open the details panel for an issue using Code Action.
Quick Fix
The details panel shows the issue name with the **Description**, **Impact** statement, **Path** by which the issue was introduced, and suggested **Remediation**.
Details panel for a Snyk IaC configuration issue
---
# Source: https://docs.snyk.io/developer-tools/snyk-ide-plugins-and-extensions/visual-studio-code-extension/view-analysis-results-from-visual-studio-code-extension/analysis-results-snyk-open-source.md
# Analysis results: Snyk Open Source
Snyk Open Source analysis shows vulnerabilities in your code with every scan. The scan runs in the background and is enabled by default.
In the **Problems** tab of the Open Source editor window, you can see all vulnerabilities found in your Project.
## Snyk Open Source editor window
The editor window shows security vulnerabilities in open-source modules while you code in JavaScript, TypeScript, and HTML. Receive feedback in line with your code, such as how many vulnerabilities a module that you are importing contains. The editor exposes only top-level dependency vulnerabilities; for the full list of vulnerabilities, refer to the side panel.
You can find security vulnerabilities in the npm packages you import and see the known vulnerabilities in your imported npm packages as soon as you require the information:
Vulnerabilities in npm package
Code inline vulnerability counts are also shown in your `package.json` file:
Results screen showing the vulnerability count
You can find security vulnerabilities in your JavaScript packages from well-known Content Delivery Networks (CDNs). The extension scans any HTML files in your Projects and displays vulnerability information about the modules you include from your favorite CDN.
The following CDNs are supported:
* unpkg.com
* ajax.googleapis.com
* cdn.jsdelivr.net
* cdnjs.cloudflare.com
* code.jquery.com
* maxcdn.bootstrapcdn.com
* yastatic.net
* ajax.aspnetcdn.com
Vulnerability from a CDN
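For example, a pinned library include from one of the supported CDNs is enough for the extension to identify the package and version (the version shown here is only illustrative):

```html
<!-- The extension reads the package name and version from the CDN URL -->
<script src="https://code.jquery.com/jquery-3.1.0.min.js"></script>
```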
You can navigate to the most severe vulnerability by triggering the provided code actions. This opens a vulnerability window to show more details:
Code actions
## Snyk Open Source vulnerability window
The Open Source Security (OSS) vulnerability window shows information about the vulnerable module, including:
* Links to external resources (CVE, CWE, Snyk Vulnerability DB) that explain the vulnerability in more detail
* CVSS score and exploit maturity
* The detailed path by which the vulnerability is introduced to the system
* A summary of the vulnerability, with remediation advice to fix it
Snyk Open Source vulnerability window
---
# Source: https://docs.snyk.io/manage-risk/analytics.md
# Analytics
{% hint style="info" %}
Legacy Reports (in the Snyk Web UI) and the Reporting API v1 have been deprecated. They will be removed from the product on April 27, 2026. This extended timeline is designed to give your teams ample time to assess your usage and migrate to the new solution. View the API Migration guide for help transitioning to the newer [Export API](https://docs.snyk.io/snyk-api/using-specific-snyk-apis/export-api-specifications-columns-and-filters).
{% endhint %}
Analytics provides executives, as well as Application Security (AppSec) leaders and practitioners, a view into the performance of their AppSec program. Snyk customers can understand at a glance the strengths and weaknesses of their program, identify where their practices are succeeding, and uncover the largest opportunities for improvement that warrant investment. Analytics is available at the tenant level.
{% hint style="info" %}
To access Analytics, you need to have one of the following [tenant roles](https://docs.snyk.io/snyk-platform-administration/groups-and-organizations/tenant/manage-users-in-a-tenant): Tenant Admin, Tenant Viewer.
{% endhint %}
The Analytics view is structured as follows:
* [Issues Analytics](https://docs.snyk.io/manage-risk/analytics/issues-analytics) - provides the exposure and performance details of Snyk issues in Groups and Organizations while focusing on the issue introduction method (baseline, preventable, or non-preventable).
* [Application Analytics](https://docs.snyk.io/manage-risk/analytics/application-analytics) - provides data analytics for reviewing and comparing assets and issues metrics at the level of asset classes, applications, or code owners.
The following table presents an overview of the features available for both Issues Analytics and Application Analytics.
| Issues Analytics | Application Analytics |
| ---------------- | --------------------- |
| Data filtered by default on critical and high-severity issues. Drill down to see how issues were introduced. Issues framework: categorized based on Exposure, Manage, Prevention, and Coverage. | Data filtered based on assets, applications, and code owners (teams). Helps you identify and act on risk, coverage gaps, and association gaps. Features include: asset class view, application and owner view, surface coverage gaps, comparison and prioritization. |
{% hint style="info" %}
The specific features and availability of both products may vary as they continue to evolve. For the latest information, refer to the respective product documentation.
{% endhint %}
---
# Source: https://docs.snyk.io/scan-with-snyk/snyk-container/use-snyk-container/analyze-and-fix-container-images.md
# Analyze and fix container images
You can import container Projects into Snyk using the CLI command [`snyk container monitor`](https://docs.snyk.io/developer-tools/snyk-cli/commands/container-monitor). Alternatively, you can import Projects directly from a supported container registry using the Snyk Web UI.
Snyk imports a snapshot of your container image and then scans the snapshot for vulnerabilities. Based on your configuration, daily or weekly, Snyk rescans the dependencies captured in that original import, which is identified by its image tag. Based on your configuration, Snyk sends you an update by email or Slack when any new vulnerabilities are identified.
If the original tag is later used for a different image, upon rescanning (daily or weekly) Snyk detects changes to the Linux package dependencies and creates a new snapshot of that Project, but it does not detect changes in the application dependencies and thus does not update the snapshot for application vulnerabilities.
This means that if you frequently reuse a tag to refer to a different image, you must reimport the other image so that Snyk can update the application dependencies.
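For example, after pushing a different image under an already-monitored tag, you can reimport it with the Snyk CLI (the image name here is illustrative):

```bash
# Re-run monitor so Snyk snapshots the image currently behind the tag;
# --file lets Snyk include Dockerfile-based fix advice
snyk container monitor myregistry/myapp:latest --file=Dockerfile
```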
## Grouping of Container Projects
Depending on how you import images (Snyk CLI, container registry integration, or Kubernetes integration), Projects are grouped differently in the **Projects** tab.
### Project grouping when importing images with Snyk CLI
Snyk groups images and the applications found in the image. However, Snyk CLI does not use image tags for grouping, so Snyk does not sub-group by image tag. Thus, images from the same repository with different image tags are all grouped together.
### Project grouping when importing images with container registry integration
If you import images into Snyk with container registry integration, in the **Projects** list Snyk performs sub-grouping per image tag for each image name.
Images with different image tags grouped in sub-groups
### Project grouping when importing images with the Kubernetes integration
If you import images into Snyk using the Kubernetes integration, the top clickable item represents the workload in the cluster. Snyk performs grouping based on the image in the workload, without sub-grouping per image tag.
## View image vulnerabilities
If the Project is imported from a registry integration, on the **Projects** page, it is marked with the relevant registry icon. If the Project is imported from the CLI, it is marked with a CLI icon. You can also filter to display all container Projects.
When you open a container Project, the analysis and fix advice appear for that Project.
Analysis and fix advice for a container Project
The following information is displayed:
* Project summary: general Project details, including unique details:
  * **Image ID** - derived from the container image digest
  * **Image tag**
  * **Base Image**
* Total dependencies with known vulnerabilities and the total number of vulnerabilities
* Fix advice: if you included your Dockerfile for monitoring, available and actionable fix advice is displayed. To view all fix advice, click **Show more upgrade types**.
The **Issues** tab provides a list of vulnerabilities, including origins and paths, as well as an overview of the vulnerabilities.
In the issues list, you can use the filters available for all supported Project types, as well as the following filters:
* **OS BINARIES** - to filter by the specific binaries or OS packages that contain issues.
* **IMAGE LAYER** - to see Dockerfile instructions. If you attach a Dockerfile, you can filter to view issues associated only with the base image, issues with Dockerfile-related advice (user instruction), or both.
Filters for OS binaries and image layers
{% hint style="info" %}
The **OS BINARIES** filter does not appear if there is only one category of issues in your container, for example, only Node binary vulnerabilities or only OS packages.\
The **IMAGE LAYER** filter does not appear if there is no Dockerfile attached.
{% endhint %}
The **Dependencies** tab provides a tree view of the package hierarchy inside the image.
## Fix image vulnerabilities
When providing public base image recommendations, Snyk bases its logic on the origin repo, flavor, and version of the base image it detects.
The Snyk recommendations for upgrading the base image cover:
* **Minor upgrades**: the safest and best minor upgrade available
* **Major upgrades**: a major upgrade that removes more vulnerabilities but carries greater risk
* **Alternative upgrades**: viable replacement base images that introduce the fewest vulnerabilities
* A recommendation to rebuild your base image if it is outdated
Recommendations for upgrading the base image include:
* The name of the recommended base image version
* The number of vulnerabilities that exist in the recommended upgrade
* A summary of the vulnerability severities.
Recommendations for upgrading the base image
---
# Source: https://docs.snyk.io/scan-with-snyk/pull-requests/pull-request-checks/analyze-pr-checks-results.md
# Analyze PR checks results
## PR checks results
After you [submit a pull request to fix vulnerabilities](https://docs.snyk.io/scan-with-snyk/snyk-open-source/manage-vulnerabilities/fix-your-vulnerabilities), PR Checks detects issues with a severity level that meets or exceeds your configured threshold and provides a report. Examine the report status and result to decide whether to merge the pull request.
You can change the default severity threshold either at the [Integration level](https://docs.snyk.io/scan-with-snyk/pull-requests/configure-pull-request-checks#configure-pr-checks-at-the-integration-level), or at the [Project level](https://docs.snyk.io/scan-with-snyk/pull-requests/configure-pull-request-checks#configure-pr-checks-at-the-project-level).
## Result status
Check the status of the PR Checks results in the integrated SCM to identify security issues that need to be addressed before merging a pull request.
The following status indicators can appear for your Snyk PR checks in the integrated SCM:
| Result status | Description |
| ------------- | ----------- |
| **Success/Passed** | No issues were discovered, and the manifest file was not changed. |
| **Pending** | The PR Checks are still running. |
| **Failed/Issues found** | Security issues were identified in the pull request. In this scenario, you need to manually set the result status to **Passed**. |
| **Error** | Out-of-sync `package.json` and `package.lock` files, or failure to find or read the manifest file. |
| **Canceled** | The test limit has been reached. |
{% hint style="info" %}
For false positive or false negative results, see [Troubleshooting PR Checks](https://docs.snyk.io/scan-with-snyk/pull-requests/pull-request-checks/troubleshoot-pr-checks).
{% endhint %}
## Example: fix dependency issues with PR checks
Consider the following end-to-end scenario, including specific actions such as triggering a Fix PR and marking a **Failed** result as **Passed**. You can take these actions in relation to the information provided by the PR Checks. This example shows taking the steps for a [GitHub integration](https://docs.snyk.io/developer-tools/scm-integrations/organization-level-integrations/github) as follows:
1. [Trigger a fix for an individual dependency](#trigger-a-fix-for-an-individual-dependency) to remediate that version's vulnerabilities.
2. [Open a Fix PR](#open-a-fix-pr) to open a pull request in GitHub.
3. [Analyze PR Checks results and set status](#analyze-pr-checks-result-and-set-status) to merge the pull request.
{% hint style="info" %}
Before you begin, check the [Prerequisites for automated PR Checks](https://docs.snyk.io/scan-with-snyk/pull-requests/configure-pull-request-checks#prerequisites-for-automated-pr-checks) to make sure you have Snyk configured and the role defined.
{% endhint %}
### Trigger a fix for an individual dependency
1. Log in to the Snyk Web UI.
2. Navigate to **Projects**.
3. Expand the target containing your Project.
4. Click a Project name to open it and select **package.json** to check for open-source and licensing issues.
5. In the **Issues** tab, find the dependency or specific vulnerability and, if a fix is available, click the **Upgrade to X.X.X** button at the bottom of the card and select **Fix this vulnerability**. For example, the jsonwebtoken package can be upgraded from version 0.4.0 to version 5.0.0, fixing a number of vulnerabilities.
Updating a dependency to remediate the Authentication Bypass issue and others found in version 0.4.0.
6. (Optional) Select **Fix these vulnerabilities** at the top of the page to fix all dependency vulnerabilities with one pull request.
### Open a Fix PR
Confirm your selected issue and click **Open a Fix PR** to open a pull request in the GitHub integration.
Triggering a Fix PR for an individual issue in the dependencies project
### Analyze PR checks result and set status
1. (Optional) Examine the pull request generated by [Snyk Bot](https://docs.snyk.io/developer-tools/scm-integrations/organization-level-integrations/github#commit-signing) in the Conversation tab in GitHub.
2. Find the conversation card showing the PR Checks results. For this example, the result is set to **Failed** and is manually changed to **Passed**.
{% hint style="info" %}
Issues that have previously been ignored via the Snyk Web UI in the associated Open Source or code analysis Project are not flagged in these checks. This reflects [ignored issues](https://docs.snyk.io/manage-risk/prioritize-issues-for-fixing/ignore-issues) across feature branch PRs.
{% endhint %}
PR Checks card in the Conversations tab, GitHub
3. Expand the list of files that have been checked for this issue.
4. (Optional) Click **View test page** to examine the issue details.\
\
You can get a complete picture of the vulnerability by clicking **Show more detail** for technical security information and remediation options.\
\
To return to the main issue page, click **Project**.
Overview of PR Checks result
5. Click **Mark as successful in SCM** to change the result status and merge the pull request despite the failed security checks.
Marking PR Checks result as successful
{% hint style="warning" %}
Marking a check as successful does not ignore the issue; it only allows the security checks for the PR to pass on the current branch. If the issue is not fixed, it shows up in future commits and PR Checks after you merge into the target branch.
{% endhint %}
The issue is marked as **Passed** and shows up as **Skipped** in the PR Checks card in GitHub.
## SCM integrations
### GitLab
Snyk sets the status on a merge request's latest pipeline based on scan results and the project's CI/CD configuration for merged results, merge requests, and branch pipelines. This feature blocks merge requests with security issues when the "Pipelines must succeed" [setting](https://docs.gitlab.com/user/project/merge_requests/auto_merge/#require-a-successful-pipeline-for-merge) is enabled.
## Troubleshooting PR checks
[Troubleshooting PR Checks](https://docs.snyk.io/scan-with-snyk/pull-requests/pull-request-checks/troubleshoot-pr-checks) has more information on how to troubleshoot PR checks or how to restart them.
---
# Source: https://docs.snyk.io/implementation-and-setup/enterprise-implementation-guide/phase-5-initial-rollout-to-team/announcement-templates-for-initial-rollout.md
# Announcement templates for initial rollout
You can use these templates to communicate the Snyk rollout to the rest of the developers. Update the text in brackets with your own details, then send the message to the developers.
## Email template
| |
| --- |
| To: Developers<br>Subject: Launching Snyk at \[Company name]<br><br>Hi all,<br><br>I’m excited to announce that we’re implementing Snyk at \[Company name].<br><br>\[optional: add personalized video, if desired]<br><br>Snyk will help us \[enter your goal(s)].<br><br>As part of the launch process, we’ll invite you to a short “Intro to Snyk” and Q\&A session to learn more about Snyk and the products we’re implementing. You’ll also have the opportunity to attend a developer training session and get access to Snyk Learn for self-paced tutorials to help you get started.<br><br>We’re looking forward to building secure applications together, with less frustration and interruption to your workflows for addressing security issues.<br><br>More info can be found at \[hyperlink to your internal resource page/wiki with more info].<br><br>Regards,<br>\_\_\_\_\_ \[Sender] |
## Instant messaging template
| |
| ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- |
| Snyk Dev-First Security Initiative: You’re invited to a brief “Intro to Snyk” and Q\&A session on \[insert date, time, Zoom info]. You’ll also have the opportunity to attend a developer training session on \[insert date/registration detail link] and get access to Snyk Learn \[hyperlink] for self-paced tutorials to help you get started. |
---
# Source: https://docs.snyk.io/implementation-and-setup/enterprise-implementation-guide/phase-6-rolling-out-the-prevention-stage/announcement-templates-for-prevention.md
# Announcement templates for prevention
This page provides example email and Slack message templates that you can use to introduce prevention tools to your teams.
## Introduce prevention features to your developers
It is important that your development teams understand what changes are being made that may affect their day-to-day work. Ensure they understand how the prevention tests work so they are not surprised by issues that could affect their deadlines.
{% hint style="info" %}
These examples are written based on the Snyk tests on the PR Checks feature, with the configuration set to fail only on High or Critical severity issues. If you are adding Snyk tests to your CI/CD pipelines, ensure that you tweak the messages.
{% endhint %}
Use the following template to communicate the Snyk rollout to the rest of the developers. Update the text in brackets with your details, and then send the message to the developers.
## Email template
| |
| --- |
| To: Developers<br>Subject: Introducing Snyk tests to PRs \[Company name]<br><br>Hi all,<br><br>As part of our ongoing aim to improve our application security at \[Company name], we are preparing to start running Snyk tests against all new pull requests for any repository that has been imported into Snyk.<br><br>\[optional: add personalized video, if desired]<br><br>These checks will identify any new High or Critical severity issues that are part of the PR, with the aim of preventing any new significant issues from entering our repositories. At first, these checks will be optional, meaning you are not blocked from merging a PR if one of these vulnerabilities is detected.<br><br>In the future, this will be changing to a blocking check, so we recommend you start remediating any new High or Critical issues that are detected in your PRs, so that you aren’t affected when the test is no longer optional.<br><br>This change will make a huge difference in improving our application security, and by gradually introducing this feature, we hope to avoid any interruptions to your workflow.<br><br>More info can be found at \[hyperlink to your internal resource page/wiki with more info].<br><br>Regards,<br>\_\_\_\_\_ \[Sender] |
## Slack message template
> *Snyk Tests being introduced to our PRs: From \[date] we’ll be enabling a feature in Snyk so that all new PRs on repositories that have been imported to Snyk will be tested for new vulnerabilities. You’ll see the test will fail if any new High or Critical severity issues are found. Please fix these before merging if possible! For now, the tests are optional, so you can merge the PR even if the test fails, but in the future, we’ll be setting this to be a required check. Get in touch if you have any questions!*
---
# Source: https://docs.snyk.io/integrations/snyk-studio-agentic-integrations/quickstart-guides-for-snyk-studio/antigravity-guide.md
# Antigravity guide
Add Snyk Studio to Google Antigravity to secure code generated with agentic workflows through a Large Language Model (LLM). This can be achieved in several ways. When you use it for the first time, Snyk Studio will ask for trust and trigger authentication if necessary.
### Install Antigravity
Visit the [Google Antigravity](https://antigravity.google/) website to download the correct version of the IDE.
### Install using the Snyk Studio plugin
{% hint style="warning" %}
The Snyk MCP server cannot be manually installed. Use the Snyk Security plugin in Google Antigravity.
{% endhint %}
* Click [this link](antigravity:extension/snyk-security.snyk-vulnerability-scanner) to open the Snyk Security plugin in Google Antigravity directly.
* Click **Install**.
If asked to trust the publisher, select **Trust Publisher & Install.**
Popup in Snyk Security plugin asking for trust verification
## Set up Snyk Studio
{% hint style="info" %}
As a one-time setup, you may need to authenticate and trust the current project directory. When required, the agentic workflow will typically handle this automatically.
{% endhint %}
### Enable "Secure At Inception"
When installation completes, a modal prompts you to opt in to Snyk Studio's "[Secure at inception](https://docs.snyk.io/discover-snyk/getting-started/glossary#secure-at-inception)." This action automatically configures the necessary rules to scan any new AI-generated code. Additional options are available on the **Settings** page for the plugin.
Modal prompting you to opt in to Secure at inception
### Authenticate
After you make a selection regarding Secure at inception, you will be asked to authenticate. You can authenticate at two points in this process:
* Immediately after plugin install
* Before your first Snyk code scan
As part of the authentication flow, a browser window will open, and you will be asked to either sign up or sign in on the Snyk website.
For new users, select the preferred sign up method and agree to the terms on the next screen. On successful authentication, you will be instructed to return to your IDE.
{% hint style="info" %}
To use Snyk Studio, specifically Snyk's SAST scanning capabilities, you need to enable [Snyk Code](https://docs.snyk.io/scan-with-snyk/snyk-code). Snyk Code analyzes your code for vulnerabilities and temporarily clones the repository and/or uploads your code. Cloned or uploaded code is cached according to our [data retention policy](https://docs.snyk.io/snyk-data-and-governance/how-snyk-handles-your-data). With the Snyk Free Plan, Snyk Code offers unlimited scans for open source projects and limited tests for first-party code. For more details, visit [Plans and Pricing](https://snyk.io/plans/).
{% endhint %}
For existing users, select the login method associated with your account. If you do not have access to Snyk Code, your LLM will prompt you to enable it prior to your first scan. You can also [enable it directly in Snyk's Settings](https://docs.snyk.io/implementation-and-setup/enterprise-implementation-guide/phase-2-configure-account/set-visibility-and-configure-an-organization-template/enable-snyk-code).
{% hint style="info" %}
If you are enabling Snyk Code for the first time, you must import or re-import existing projects in order to properly scan them.
{% endhint %}
### Run Snyk Studio
On authentication, Snyk Studio will be triggered whenever new code is generated by the LLM. If Snyk Studio is not enabled, restart your IDE and try generating code again.
{% hint style="info" %}
Free users are limited to a set number of scans. If you reach the allotment, we recommend [reaching out to sales](https://snyk.io/contact-us/) to unlock additional thresholds.
{% endhint %}
## Examples
### Scanning for security vulnerabilities
In the chat, you can tag specific files or functions and ask \[add MCP guide name] to scan your code so that it can be safely deployed to production:
{% code title="prompt" overflow="wrap" %}
```
Scan this directory for code security & dependency vulnerabilities and security issues
```
{% endcode %}
Google Antigravity indicates that this request is related to security vulnerability scanning and calls the Snyk MCP Server for various scans.
Google Antigravity is then able to gather all the security scanning results, both code (SAST) and dependencies (SCA), and based on recommendations and contextual information provided by the Snyk MCP Server, it is now able to provide code changes that address some or all of the security vulnerabilities that were found.
### "Secure at inception" rules
To practice [Secure at inception](https://docs.snyk.io/discover-snyk/getting-started/glossary#secure-at-inception), Snyk recommends adopting rules that align the LLM for secure code generation and workflows.
The following is a suggested instruction for Google Antigravity's rules if you choose to implement them on your own instead of controlling them through Snyk's Secure at inception setting. Add these to your environment's global rules file; otherwise, you must add them again in every project.
{% code title="Antigravity rule" overflow="wrap" %}
```
Always run Snyk Code scanning tool for new first party code generated.
Always run Snyk SCA scanning tool for new dependencies or dependency updates.
If any security issues are found based on newly introduced or modified code or dependencies, attempt to fix the issues using the results context from Snyk.
Rescan the code after fixing the issues to ensure that the issues were fixed and that there are no newly introduced issues.
Repeat this process until no issues are found.
```
{% endcode %}
---
# Source: https://docs.snyk.io/scan-with-snyk/snyk-code/snyk-code-security-rules/apex-rules.md
# Apex rules
Each rule includes the following information.
* **Rule Name**: The Snyk name of the rule.
* **CWE(s):** The [CWE numbers](https://cwe.mitre.org/) that are covered by this rule.
* **Security Categories**: The [OWASP Top 10](https://owasp.org/Top10/) (2021 edition) category to which the rule belongs, if any, and whether it is included in [SANS 25](https://www.sans.org/top25-software-errors/).
* **Autofixable**: Security rules that are autofixable by Snyk Agent Fix. This information is included only for the supported programming languages.
| Rule Name | CWE(s) | Security Categories | Autofixable |
| ------------------------------------------------------------ | ---------------- | ---------------------- | ----------- |
| Access Violation | CWE-284, CWE-285 | OWASP:A01 | Yes |
| Clear Text Sensitive Storage | CWE-200, CWE-312 | OWASP:A01, OWASP:A04 | No |
| Command Injection | CWE-78 | Sans Top 25, OWASP:A03 | No |
| Improper Access Control: Email Content Injection | CWE-284 | OWASP:A01 | No |
| Use of Hardcoded Credentials | CWE-798, CWE-259 | Sans Top 25, OWASP:A07 | No |
| Use of Hardcoded Passwords | CWE-798, CWE-259 | Sans Top 25, OWASP:A07 | No |
| Hardcoded Secret | CWE-547 | OWASP:A05 | No |
| Use of Password Hash With Insufficient Computational Effort | CWE-916 | OWASP:A02 | Yes |
| Insecure Data Transmission | CWE-319 | OWASP:A02 | No |
| Open Redirect | CWE-601 | OWASP:A01 | No |
| Cross-site Scripting (XSS) | CWE-79 | Sans Top 25, OWASP:A03 | No |
| Regular expression injection | CWE-400, CWE-730 | None | No |
| SOQL Injection | CWE-89 | Sans Top 25, OWASP:A03 | No |
| SOSL Injection | CWE-89 | Sans Top 25, OWASP:A03 | No |
| Server-Side Request Forgery (SSRF) | CWE-918 | Sans Top 25, OWASP:A10 | No |
| Unverified Password Change | CWE-620 | OWASP:A07 | No |
| Unsafe SOQL Concatenation | CWE-89 | Sans Top 25, OWASP:A03 | No |
| Unsafe SOSL Concatenation | CWE-89 | Sans Top 25, OWASP:A03 | No |
| Sensitive Cookie in HTTPS Session Without 'Secure' Attribute | CWE-614 | OWASP:A05 | Yes |
| XML Injection | CWE-91 | OWASP:A03 | No |
---
# Source: https://docs.snyk.io/supported-languages/supported-languages-list/apex.md
# Apex
## Applicability and integration
{% hint style="info" %}
Apex is supported only for Snyk Code.
{% endhint %}
Available integrations:
* SCM import
* CLI and IDE: test or monitor your app. For more information, see [Snyk CLI for Snyk Code](https://docs.snyk.io/developer-tools/snyk-cli/scan-and-maintain-projects-using-the-cli/snyk-cli-for-snyk-code).
## Technical specifications
### Supported file formats
The Apex Standard Library is fully supported. Snyk supports the following file formats for Snyk Code: `.cls`, `.trigger`, and `.tgr`.
### Supported features
Snyk supports the following features for Apex:
* Interfile analysis
* Custom rules
* Reports
{% hint style="info" %}
The **Snyk fix PR** feature is not available for Apex. This means that you will not be notified if the PR checks fail when the following conditions are met:
* The **PR checks** feature is enabled and configured to **Only fail when the issues found have a fix available.**
* "**Fixed in" available** is set to **Yes.**
{% endhint %}
---
# Source: https://docs.snyk.io/snyk-api/api-end-of-life-eol-process-and-migration-guides.md
# API End of Life (EOL) process and migration guides
This page explains the process, key dates, and milestones associated with the end-of-life (EOL) cycle for all API endpoints. In this documentation, you will also find detailed information about [key dates](https://docs.snyk.io/snyk-api/api-end-of-life-eol-process-and-migration-guides/api-eol-endpoints-and-key-dates) and [migration guides](https://docs.snyk.io/snyk-api/api-end-of-life-eol-process-and-migration-guides/guides-to-migration) for API endpoints that are in the end-of-life process.
## API End of Life (EOL) process
Snyk GA REST APIs are an evolution of the Snyk V1 APIs, offering the following improvements:
* Consistent versioning
* Pagination and caching
* Improved performance
* Specifications for client generation
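Two of these improvements, consistent versioning and pagination, can be sketched concretely. REST responses are JSON:API documents whose `links.next` field points to the next page of results; the following is a minimal, hedged sketch of walking every page, where `fetch_page` stands in for whichever HTTP client you use:

```python
def iter_pages(fetch_page, first_path):
    """Follow JSON:API `links.next` pointers until pages run out.

    `fetch_page` takes a request path and returns the decoded JSON
    body; it is a placeholder for your HTTP client of choice.
    """
    path = first_path
    while path is not None:
        body = fetch_page(path)
        # Each page carries its records under `data`.
        yield from body.get("data", [])
        # Absent or null `links.next` means this was the last page.
        path = body.get("links", {}).get("next")
```

Because the loop is driven entirely by the `links.next` values the server returns, the client never has to compute page numbers or offsets itself.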
Retiring V1 APIs and replacing them with GA REST APIs brings these improvements to the equivalent endpoints. As Snyk delivers more GA REST APIs, experimental and beta versions of the REST API will also reach end-of-life.
Migrating from V1 API to GA REST can be a time-consuming process, and Snyk wants to ensure that you have enough time to factor in and execute migrations so that you can have the best API experience as soon as possible. The process for taking an endpoint (V1, experimental, or beta API) to EOL for seamless migration to GA REST is as follows:
1. Batches of endpoints will be part of an EOL cycle that begins twice a year: one batch in January and one batch in July.
2. API endpoints can be included for EOL only if they have:
* A GA REST equivalent or equivalents (except in the rare case where a V1 API does not have or need a GA REST equivalent)
* Functionality parity between V1 and GA REST (unless explicitly stated otherwise in the migration guide)
* A migration guide by our field specialists for ease of migration
3. Snyk will [publicly announce](http://updates.snyk.io/) which endpoints will be part of an EOL cycle one month before the cycle begins.
4. On the date the EOL begins, the endpoints are deemed **deprecated**. At that point, the documentation of each endpoint will either be **removed** or have a statement added that the endpoint is deprecated. In addition, no new customers will be able to integrate with the endpoint. The endpoint will remain functional for existing customers until the end-of-life date. You can find all of the endpoints reaching end of life and the associated timelines on the [API EOL endpoints and key dates](https://docs.snyk.io/snyk-api/api-end-of-life-eol-process-and-migration-guides/api-eol-endpoints-and-key-dates) page.
5. Each month during the EOL cycle, Snyk will temporarily halt functionality for the nominated endpoints, with each halt increasing in duration over the course of the EOL.
6. When we reach the EOL date, the endpoint will stop working, and you will receive an error.
## Types of API EOL
The following types of EOL will take place during each cycle:
1. V1 API: When a GA REST equivalent or equivalents are released, Snyk will aim to include the V1 API in an end-of-life cycle as soon as possible. Users will have a six-month window to migrate off the endpoint, beginning at the public announcement in January and July.
2. Experimental and beta: When Snyk upgrades a REST endpoint to GA that has earlier experimental or beta endpoints or both, Snyk will aim to include them in the EOL cycle as soon as possible. Users will have a three-month window to migrate off the endpoint, beginning at the public announcement in January and July.
In exceptional circumstances, Snyk may have to announce an EOL for an endpoint outside of the two announcements each year in January and July. Users will receive a one-month notice of the EOL cycle. The time window to migrate off the endpoint will follow the same windows identified for each type of EOL: a V1 API or an experimental or beta endpoint.
---
# Source: https://docs.snyk.io/snyk-api/api-endpoints-index-and-tips.md
# API endpoints index and tips
{% hint style="info" %}
**How to find your `org_id`**\
Log in to Snyk, navigate to your **Organization**, and then to your **Settings** > **General**. The **Organization ID** is on the General settings page, and you can copy it.
{% endhint %}
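With the Organization ID in hand, every REST request embeds it in the URL path, carries a required `version` query parameter, and authenticates with a token header. The following is a minimal sketch of assembling such a request; the `projects` endpoint in the usage comment is illustrative only:

```python
from urllib.parse import urlencode

API_BASE = "https://api.snyk.io/rest"

def build_request(org_id: str, endpoint: str, version: str, token: str):
    """Assemble the URL and headers for a Snyk REST GET request.

    Performing the actual HTTP call is left to whichever client you
    prefer; this helper only shows how the pieces fit together.
    """
    url = f"{API_BASE}/orgs/{org_id}/{endpoint}?" + urlencode({"version": version})
    headers = {"Authorization": f"token {token}"}
    return url, headers

# Illustrative usage:
# url, headers = build_request("<org_id>", "projects", "2024-10-15", "<API_TOKEN>")
```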
This index and notes section of the documentation provides, in addition to this index, [solutions for specific use cases](https://docs.snyk.io/snyk-api/api-endpoints-index-and-tips/solutions-for-specific-use-cases), [scenarios for using Snyk APIs](https://docs.snyk.io/snyk-api/api-endpoints-index-and-tips/scenarios-for-using-the-snyk-api), and pages with detailed information about using Snyk API endpoints:
* [Organization and Group identification for Projects using the API](https://docs.snyk.io/snyk-api/api-endpoints-index-and-tips/organization-and-group-identification-for-projects-using-the-api)
* [Project issue paths V1 API endpoints](https://docs.snyk.io/snyk-api/api-endpoints-index-and-tips/project-issue-paths-api-endpoints)
* [Project type responses from the API](https://docs.snyk.io/snyk-api/api-endpoints-index-and-tips/project-type-responses-from-the-api)
See also the following sections on specific APIs:
* [How to use Snyk Apps APIs](https://docs.snyk.io/snyk-api/using-specific-snyk-apis/snyk-apps-apis)
* [How to use Snyk SBOM and List issues APIs](https://docs.snyk.io/snyk-api/using-specific-snyk-apis/sbom-apis)
* [How to use Snyk webhooks APIs](https://docs.snyk.io/snyk-api/using-specific-snyk-apis/webhooks-apis)
For additional information, see the [API support articles](https://support.snyk.io/s/topic/0TOPU00000BgWMv4AN/snyk-api).
This index includes the categories and names of REST GA, REST beta, and V1 API endpoints, with the URL in the reference docs for each endpoint and links to related information where available. REST is the default, and GA is the status unless beta is noted. V1 API is specified where applicable. This index is a work in progress; additional information is being added continually.
## AccessRequests (beta)
### [Get access requests](https://apidocs.snyk.io/?beta=\&version=2024-10-15#get-/self/access_requests)
## Apps
**More information:** [Snyk Apps](https://docs.snyk.io/snyk-api/using-specific-snyk-apis/snyk-apps-apis)
### [Get a list of apps that act on your behalf](https://docs.snyk.io/reference/apps#self-apps)
### [Revoke an app](https://docs.snyk.io/reference/apps#self-apps-app_id)
### [Get a list of active OAuth sessions for the app](https://docs.snyk.io/reference/apps#self-apps-app_id-sessions)
### [Revoke an active user app session](https://docs.snyk.io/reference/apps#self-apps-app_id-sessions-session_id)
### [Get a list of apps installed for a user](https://docs.snyk.io/reference/apps#self-apps-installs)
### [Revoke access for an app by install ID](https://docs.snyk.io/reference/apps#self-apps-installs-install_id)
**Replaces:** DEPRECATED Revoke app bot authorization
### DEPRECATED [Create a new app for an organization](https://docs.snyk.io/reference/apps#orgs-org_id-apps)
**Replaced by:** Create a new Snyk App for an organization
**More information:** [Create a Snyk App using the Snyk API](https://docs.snyk.io/snyk-api/using-specific-snyk-apis/snyk-apps-apis/create-a-snyk-app-using-the-snyk-api)
### [Get a list of apps created by an organization](https://docs.snyk.io/reference/apps#orgs-org_id-apps-1)
**Replaces:** DEPRECATED Get a list of apps created by an organization
**More information:** [Manage App details](https://docs.snyk.io/snyk-api/using-specific-snyk-apis/snyk-apps-apis/manage-app-details)
### DEPRECATED [Update app attributes that are name, redirect URIs, and access token time to live](https://docs.snyk.io/reference/apps#orgs-org_id-apps-client_id)
**Replaced by:** Update app creation attributes such as name, redirect URIs, and access token time to live using the App ID
### DEPRECATED [Get an app by client id](https://docs.snyk.io/reference/apps#orgs-org_id-apps-client_id-1)
**Replaced by:** Get a Snyk App by its App ID
### DEPRECATED [Delete an app](https://docs.snyk.io/reference/apps#orgs-org_id-apps-client_id-2)
**Replaced by:** Delete a Snyk App by its App ID
### DEPRECATED [Manage client secrets for an app](https://docs.snyk.io/reference/apps#orgs-org_id-apps-client_id-secrets)
**Replaced by:** Manage client secret for non-interactive Snyk App installations
### [Install a Snyk App to this organization](https://docs.snyk.io/reference/apps#orgs-org_id-apps-installs)
### [Get a list of apps installed for an organization](https://docs.snyk.io/reference/apps#orgs-org_id-apps-installs-1)
**Replaces:** DEPRECATED Get a list of app bots authorized to an organization
**More information:** [Slack app (Jira integration)](https://docs.snyk.io/integrations/jira-and-slack-integrations/slack-app) (Find the Slack App Bot ID)
### [Revoke app authorization for a Snyk organization](https://docs.snyk.io/reference/apps#orgs-org_id-apps-installs-install_id)
**See also:** Revoke app authorization for a Snyk Group with install ID
### [Manage client secret for non-interactive Snyk App installations](https://docs.snyk.io/reference/apps#orgs-org_id-apps-installs-install_id-secrets)
**More information:** [Manage App details](https://docs.snyk.io/snyk-api/using-specific-snyk-apis/snyk-apps-apis/manage-app-details)
### [Create a new Snyk App for an organization](https://docs.snyk.io/reference/apps#orgs-org_id-apps-creations)
**Replaces:** DEPRECATED Create a new app for an organization
**More information:** [Create a Snyk App using the Snyk API](https://docs.snyk.io/snyk-api/using-specific-snyk-apis/snyk-apps-apis/create-a-snyk-app-using-the-snyk-api)
### DEPRECATED [Get a list of apps created by an organization](https://docs.snyk.io/reference/apps#orgs-org_id-apps-creations-1)
**Replaced by:** Get a list of apps created by an organization
### [Update app creation attributes such as name, redirect URIs, and access token time to live using the App ID](https://docs.snyk.io/reference/apps#orgs-org_id-apps-creations-app_id)
**Replaces:** DEPRECATED Update App attributes that are name, redirect URIs, and access token time to live
**More information:** [Manage App details](https://docs.snyk.io/snyk-api/using-specific-snyk-apis/snyk-apps-apis/manage-app-details)
### [Get a Snyk App by its App ID](https://docs.snyk.io/reference/apps#orgs-org_id-apps-creations-app_id)
**Replaces:** DEPRECATED Get an app by client id
### [Delete an app by its App ID](https://docs.snyk.io/reference/apps#orgs-org_id-apps-creations-app_id-2)
**Replaces:** DEPRECATED Delete an app
**More information:** [Manage App details](https://docs.snyk.io/snyk-api/using-specific-snyk-apis/snyk-apps-apis/manage-app-details)
### [Manage client secret for the Snyk App](https://docs.snyk.io/reference/apps#orgs-org_id-apps-creations-app_id-secrets)
**More information:** [Manage App details](https://docs.snyk.io/snyk-api/using-specific-snyk-apis/snyk-apps-apis/manage-app-details)
### DEPRECATED [Get a list of app bots authorized to an organization](https://docs.snyk.io/reference/apps#orgs-org_id-app_bots)
**Replaced by:** [Get a list of apps installed for an organization](https://apidocs.snyk.io/?#get-/orgs/-org_id-/apps/installs)
**More information:** [Slack app](https://docs.snyk.io/integrations/jira-and-slack-integrations/slack-app) (for Jira integration)
### DEPRECATED [Revoke app bot authorization](https://docs.snyk.io/reference/apps#orgs-org_id-app_bots-bot_id)
**Replaced by:** Revoke app authorization for a Snyk Group with install ID
**See also:** [Revoke access for an app by install](https://apidocs.snyk.io/?#delete-/self/apps/installs/-install_id-)
### [Install a Snyk App to this group](https://docs.snyk.io/reference/apps#groups-group_id-apps-installs)
### [Get a list of apps installed for a group](https://docs.snyk.io/reference/apps#groups-group_id-apps-installs-1)
### [Revoke app authorization for a Snyk Group with install ID](https://docs.snyk.io/reference/apps#groups-group_id-apps-installs-install_id)
### [Manage client secret for non-interactive Snyk App installations](https://docs.snyk.io/reference/apps#groups-group_id-apps-installs-install_id-secrets)
**Replaces:** DEPRECATED Manage client secrets for an app
## Audit Logs
**More information**: [Retrieve audit logs of user-initiated activity by API for an Org or Group](https://docs.snyk.io/snyk-platform-administration/user-management-with-the-api/retrieve-audit-logs-of-user-initiated-activity-by-api-for-an-org-or-group);\
[AWS CloudTrail Lake](https://docs.snyk.io/integrations/event-forwarding/aws-cloudtrail-lake)
### [Search Organization audit logs](https://docs.snyk.io/reference/audit-logs#orgs-org_id-audit_logs-search)
**More information:** [Retrieve audit logs of user-initiated activity by API for an Org or Group](https://docs.snyk.io/snyk-platform-administration/user-management-with-the-api/retrieve-audit-logs-of-user-initiated-activity-by-api-for-an-org-or-group), [AWS CloudTrail Lake](https://docs.snyk.io/integrations/event-forwarding/aws-cloudtrail-lake)
### [Search Group audit logs](https://docs.snyk.io/reference/audit-logs#groups-group_id-audit_logs-search)
**More information:** [Filter through your audit logs more efficiently with the new GA REST version of the audit logs API](https://updates.snyk.io/filter-through-your-audit-logs-more-efficiently-with-the-new-ga-rest-version-of-the-audit-logs-api-and-api-access-is-now-opt-in-291850) (product update); [Retrieve audit logs of user-initiated activity by API for an Org or Group](https://docs.snyk.io/snyk-platform-administration/user-management-with-the-api/retrieve-audit-logs-of-user-initiated-activity-by-api-for-an-org-or-group)
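Like other REST endpoints, the audit log search calls for both scopes take the mandatory `version` query parameter, with any filters passed alongside it. A hedged sketch of building the search URL (filter parameter names beyond `version` should be checked against the reference docs):

```python
from urllib.parse import urlencode

def audit_log_search_url(scope: str, scope_id: str, version: str, **filters) -> str:
    """Build the audit-log search URL for an Org ("orgs") or a
    Group ("groups"); extra keyword arguments become query
    parameters appended after the mandatory `version`."""
    params = {"version": version, **filters}
    return (
        f"https://api.snyk.io/rest/{scope}/{scope_id}/audit_logs/search?"
        + urlencode(params)
    )
```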
## Audit logs (v1)
### Group level audit logs
Use [Search Group audit logs](https://docs.snyk.io/reference/audit-logs#groups-group_id-audit_logs-search)
### Organization level audit logs
Use [Search Organization audit logs](https://docs.snyk.io/reference/audit-logs#orgs-org_id-audit_logs-search)
## Cloud (beta)
### [List Environments](https://apidocs.snyk.io/?beta=\&version=2024-10-15#get-/orgs/-org_id-/cloud/environments)
### [Create New Environment](https://apidocs.snyk.io/?beta=\&version=2024-10-15#post-/orgs/-org_id-/cloud/environments)
### [Delete Environment](https://apidocs.snyk.io/?beta=\&version=2024-10-15#delete-/orgs/-org_id-/cloud/environments/-environment_id-)
### [Update Environment](https://apidocs.snyk.io/?beta=\&version=2024-10-15#patch-/orgs/-org_id-/cloud/environments/-environment_id-)
### [Generate Cloud Provider Permissions](https://apidocs.snyk.io/?beta=\&version=2024-10-15#post-/orgs/-org_id-/cloud/permissions)
### [List Resources](https://apidocs.snyk.io/?beta=\&version=2024-10-15#get-/orgs/-org_id-/cloud/resources)
**More information:** [Snyk IaC](https://docs.snyk.io/scan-with-snyk/snyk-iac) (use to view an inventory of IaC and cloud resources generated from your IaC files)
### [List Scans](https://apidocs.snyk.io/?beta=\&version=2024-10-15#get-/orgs/-org_id-/cloud/scans)
### [Create Scan](https://apidocs.snyk.io/?beta=\&version=2024-10-15#post-/orgs/-org_id-/cloud/scans)
### [Get scan](https://apidocs.snyk.io/?beta=\&version=2024-10-15#get-/orgs/-org_id-/cloud/scans/-scan_id-)
## Collection
The View Project History permission is needed to use this API.
**More information:** [Project collections groupings](https://docs.snyk.io/snyk-platform-administration/snyk-projects/project-collections-groupings)
### [Create a collection](https://docs.snyk.io/reference/collection#orgs-org_id-collections)
### [Get collections](https://docs.snyk.io/reference/collection#orgs-org_id-collections-1)
### [Edit a collection](https://docs.snyk.io/reference/collection#orgs-org_id-collections-collection_id)
### [Get a collection](https://docs.snyk.io/reference/collection#orgs-org_id-collections-collection_id-1)
### [Delete a collection](https://docs.snyk.io/reference/collection#orgs-org_id-collections-collection_id-2)
### [Add projects to a collection](https://docs.snyk.io/reference/collection#orgs-org_id-collections-collection_id-relationships-projects)
### [Get projects from the specified collection](https://docs.snyk.io/reference/collection#orgs-org_id-collections-collection_id-relationships-projects-1)
### [Remove projects from a collection](https://docs.snyk.io/reference/collection#orgs-org_id-collections-collection_id-relationships-projects-2)
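Endpoints that create or modify resources take JSON:API request bodies. The following is a hedged sketch of a body for creating a collection; the `type` value and `name` attribute follow general JSON:API conventions but are assumptions here, so verify the exact shape against the reference docs:

```python
import json

def collection_create_body(name: str) -> str:
    """Serialize a JSON:API-style body for creating a collection.

    The attribute set shown is illustrative; consult the Collection
    reference docs for the authoritative schema.
    """
    payload = {"data": {"type": "collection", "attributes": {"name": name}}}
    return json.dumps(payload)
```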
## ContainerImage
### [List instances of container image](https://docs.snyk.io/reference/containerimage#orgs-org_id-container_images)
### [Get instance of container image](https://docs.snyk.io/reference/containerimage#orgs-org_id-container_images-image_id)
### [List instances of image target references for a container image](https://docs.snyk.io/reference/containerimage#orgs-org_id-container_images-image_id-relationships-image_target_refs)
## Custom Base Images
**More information:** [Use Custom Base Image Recommendations](https://docs.snyk.io/scan-with-snyk/snyk-container/use-snyk-container/use-custom-base-image-recommendations)
### [Create a Custom Base Image from an existing container project](https://docs.snyk.io/reference/custom-base-images#custom_base_images)
**More information:** [Use Custom Base Image Recommendations](https://docs.snyk.io/scan-with-snyk/snyk-container/use-snyk-container/use-custom-base-image-recommendations), section [Mark the created Project as a custom base image](https://docs.snyk.io/scan-with-snyk/snyk-container/use-snyk-container/use-custom-base-image-recommendations#mark-the-created-project-as-a-custom-base-image);\
[Versioning schema for custom base images](https://docs.snyk.io/scan-with-snyk/snyk-container/use-snyk-container/use-custom-base-image-recommendations/versioning-schema-for-custom-base-images)
### [Get a custom base image collection](https://docs.snyk.io/reference/custom-base-images#custom_base_images-1)
### [Update a custom base image](https://docs.snyk.io/reference/custom-base-images#custom_base_images-custombaseimage_id)
### [Get a custom base image](https://docs.snyk.io/reference/custom-base-images#custom_base_images-custombaseimage_id-1)
### [Delete a custom base image](https://docs.snyk.io/reference/custom-base-images#custom_base_images-custombaseimage_id-2)
## Dependencies (v1)
### [List all dependencies](https://docs.snyk.io/snyk-api/reference/dependencies-v1)
## Entitlements (v1)
### [List all entitlements](https://docs.snyk.io/reference/entitlements-v1#org-orgid-entitlements)
### [Get an organization's entitlement value](https://docs.snyk.io/reference/entitlements-v1#org-orgid-entitlement-entitlementkey)
## Groups (beta)
### [Get all groups](https://apidocs.snyk.io/?version=2024-10-15#get-/groups)
### [Get a group](https://apidocs.snyk.io/?version=2024-10-15#get-/groups/-group_id-)
**More information:** [Organization and Group identification for Projects using the API](https://docs.snyk.io/snyk-api/api-endpoints-index-and-tips/organization-and-group-identification-for-projects-using-the-api)
### [Get all SSO connections for a group](https://apidocs.snyk.io/?version=2024-10-15#get-/groups/-group_id-/sso_connections)
### [Get all users using a given SSO connection](https://apidocs.snyk.io/?version=2024-10-15#get-/groups/-group_id-/sso_connections/-sso_id-/users)
### [Delete a user from a Group SSO connection](https://apidocs.snyk.io/?version=2024-10-15#delete-/groups/-group_id-/sso_connections/-sso_id-/users/-user_id-)
**More information:** [Remove members from Groups and Orgs using the API](https://docs.snyk.io/snyk-platform-administration/user-management-with-the-api/remove-members-from-groups-and-orgs-using-the-api); [Retrieve audit logs of user-initiated activity by API for an Org or Group](https://docs.snyk.io/snyk-platform-administration/user-management-with-the-api/retrieve-audit-logs-of-user-initiated-activity-by-api-for-an-org-or-group)
## Groups (v1)
### [List all tags in a group](https://docs.snyk.io/reference/groups-v1#group-groupid-tags)
**More information**: [Project tags](https://docs.snyk.io/snyk-platform-administration/snyk-projects/project-tags)
### [Delete tag from group](https://docs.snyk.io/reference/groups-v1#group-groupid-tags-delete)
**More information:** [Project tags](https://docs.snyk.io/snyk-platform-administration/snyk-projects/project-tags)
### [Update group settings](https://docs.snyk.io/reference/groups-v1#group-groupid-settings)
### [View group settings](https://docs.snyk.io/reference/groups-v1#group-groupid-settings-1)
### [List all roles in a group](https://docs.snyk.io/reference/groups-v1#group-groupid-roles)
**More information:** [Update member roles using the API](https://docs.snyk.io/snyk-platform-administration/user-management-with-the-api/update-member-roles-using-the-api);\
[Manage service accounts using the Snyk API](https://docs.snyk.io/implementation-and-setup/enterprise-setup/service-accounts/manage-service-accounts-using-the-snyk-api)
### [List all organizations in a group](https://docs.snyk.io/reference/groups-v1#group-groupid-orgs)
**More information:** [Org and group identification for Projects](https://docs.snyk.io/snyk-api/api-endpoints-index-and-tips/organization-and-group-identification-for-projects-using-the-api);\
[Legacy custom mapping](https://docs.snyk.io/implementation-and-setup/enterprise-setup/single-sign-on-sso-for-authentication-to-snyk/custom-mapping/legacy-custom-mapping);\
[api-import Creating import targets data for import](https://docs.snyk.io/scan-with-snyk/snyk-tools/tool-snyk-api-import/creating-import-targets-data-for-import-command);\
[Scenario: Retrieve a Project snapshot for every Project in a given Group](https://docs.snyk.io/snyk-api/scenarios-for-using-the-snyk-api#retrieve-a-project-snapshot-for-every-project-in-a-given-group);\
[Scenario: Find all Projects affected by a vulnerability](https://docs.snyk.io/snyk-api/scenarios-for-using-the-snyk-api#find-all-projects-affected-by-a-vulnerability)
### [Add a member to an organization within a group](https://docs.snyk.io/reference/groups-v1#group-groupid-org-orgid-members)
### [List all members in a group](https://docs.snyk.io/reference/groups-v1#group-groupid-members)
**More information:** [Remove members from Groups and Orgs using the API](https://docs.snyk.io/snyk-platform-administration/user-management-with-the-api/remove-members-from-groups-and-orgs-using-the-api);\
[Scenario: Assign all users in a given list to all the Organizations a company has (all Organizations in a Group)](https://docs.snyk.io/snyk-api/scenarios-for-using-the-snyk-api#assign-all-users-in-a-given-list-to-all-the-organizations-a-company-has-all-organizations-in-a-group)
## Groups
### [Get a list of org memberships of a group user](https://docs.snyk.io/reference/groups#groups-group_id-org_memberships)
### [Create a group membership for a user with role](https://docs.snyk.io/reference/groups#groups-group_id-memberships)
### [Get all memberships of the group](https://docs.snyk.io/reference/groups#groups-group_id-memberships-1)
### [Update a role from a group membership](https://docs.snyk.io/reference/groups#groups-group_id-memberships-membership_id)
### [Delete a membership from a group](https://docs.snyk.io/reference/groups#groups-group_id-memberships-membership_id-1)
## IacSettings
### [Update the Infrastructure as Code Settings for an org](https://docs.snyk.io/reference/iacsettings#orgs-org_id-settings-iac)
**More information:** [Use a remote IaC custom rules bundle](https://docs.snyk.io/scan-with-snyk/snyk-iac/current-iac-custom-rules/use-iac-custom-rules-with-cli/use-a-remote-iac-custom-rules-bundle)
### [Get the Infrastructure as Code Settings for an org](https://docs.snyk.io/reference/iacsettings#orgs-org_id-settings-iac-1)
### [Update the Infrastructure as Code Settings for a group](https://docs.snyk.io/reference/iacsettings#groups-group_id-settings-iac)
**More information:** [Use a remote IaC custom rules bundle](https://docs.snyk.io/scan-with-snyk/snyk-iac/current-iac-custom-rules/use-iac-custom-rules-with-cli/use-a-remote-iac-custom-rules-bundle); [IaC custom rules within a pipeline](https://docs.snyk.io/scan-with-snyk/snyk-iac/current-iac-custom-rules/iac-custom-rules-within-a-pipeline)
### [Get the Infrastructure as Code Settings for a group](https://docs.snyk.io/reference/iacsettings#groups-group_id-settings-iac-1)
## Ignores (v1)
**More information:** [Snyk test and snyk monitor in CI/CD integration](https://docs.snyk.io/developer-tools/snyk-ci-cd-integrations/snyk-ci-cd-integration-deployment-and-strategies/snyk-test-and-snyk-monitor-in-ci-cd-integration)
### [List all ignores](https://docs.snyk.io/reference/ignores-v1#org-orgid-project-projectid-ignores)
### [Replace ignores](https://docs.snyk.io/reference/ignores-v1#org-orgid-project-projectid-ignore-issueid)
### [Add ignore](https://docs.snyk.io/reference/ignores-v1#org-orgid-project-projectid-ignore-issueid-1)
### [Retrieve ignore](https://docs.snyk.io/reference/ignores-v1#org-orgid-project-projectid-ignore-issueid-2)
**More information:** [Scenario: List all issues including Snyk Code issues in all the Projects in an Organization](https://docs.snyk.io/snyk-api/scenarios-for-using-the-snyk-api#list-all-issues-including-snyk-code-issues-in-all-the-projects-in-an-organization)
### [Delete ignores](https://docs.snyk.io/reference/ignores-v1#org-orgid-project-projectid-ignore-issueid-3)
## Import Projects (v1)
Projects can be Git repositories, Docker images, containers, configuration files, and much more. For more information, see [Snyk Projects](https://docs.snyk.io/snyk-platform-administration/snyk-projects); the page includes the [Targets definition](https://docs.snyk.io/snyk-platform-administration/snyk-projects#target).
A typical import starts with using the endpoint [Import targets](https://docs.snyk.io/reference/import-projects-v1#org-orgid-integrations-integrationid-import) to request that a target be processed. Then use the endpoint [Get import job details](https://docs.snyk.io/reference/import-projects-v1#org-orgid-integrations-integrationid-import-jobid) to poll the Import Job API for further details on completion and the resulting Snyk Projects.
### [Import targets](https://docs.snyk.io/reference/import-projects-v1#org-orgid-integrations-integrationid-import) and [Get import job details](https://docs.snyk.io/reference/import-projects-v1#org-orgid-integrations-integrationid-import-jobid)
Note that the `target.owner` is case-sensitive.
For information on when and how you can use Import targets, see [Git integration on the Import Projects](https://docs.snyk.io/implementation-and-setup/enterprise-implementation-guide/phase-3-gain-visibility/import-projects#git-integration) page in the Enterprise implementation guide.
If a call to the Import targets endpoint fails, use [Get import job details](https://docs.snyk.io/reference/import-projects-v1#org-orgid-integrations-integrationid-import-jobid) to help determine why. There are two types of failures:
* The repository was rejected for processing, that is, HTTP status code 201 was not returned. This happens if there is an issue Snyk can detect quickly, for example:
* The repository does not exist.
* The repository is unreachable by Snyk because the token is invalid or does not have sufficient permissions.
* The repository has no default branch.
* The repository was accepted for processing, that is, the user got back HTTP status code 201 and a URL to poll, but no projects were detected or some failed. This may occur because:
* There are no Snyk-supported manifests in this repository.
* The repository is archived and the Snyk API calls to fetch files fail.
* The individual project or manifest had issues during processing. In this case, Snyk returns `success: false` with a message in the log.
The poll results return a message per manifest processed, either `success: true` or `success: false`.
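As a sketch of handling those poll results, the helper below summarizes per-manifest log entries from a decoded poll response. The `logs`, `success`, and `message` field names are assumptions based on the messages described above, not a documented contract, so adjust them to match the actual payload.

```python
def summarize_import_logs(poll_response: dict) -> dict:
    """Count successes and failures in an import-job poll response.

    Assumes the poll response carries a list of per-manifest log
    entries, each with a boolean `success` field and, on failure,
    a `message` string (field names are assumptions).
    """
    logs = poll_response.get("logs", [])
    failed = [entry for entry in logs if not entry.get("success")]
    return {
        "total": len(logs),
        "succeeded": len(logs) - len(failed),
        "failed": len(failed),
        "failure_messages": [entry.get("message", "") for entry in failed],
    }
```

A caller would poll Get import job details until the job completes, then pass the decoded JSON body to this helper to decide whether any manifests need attention.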
**More information:** [api-import Creating import targets data for import](https://docs.snyk.io/scan-with-snyk/snyk-tools/tool-snyk-api-import/creating-import-targets-data-for-import-command);\
[api-import Kicking off an import](https://docs.snyk.io/scan-with-snyk/snyk-tools/tool-snyk-api-import/kicking-off-an-import)
**More information Import targets:**\
[Configure integrations](https://docs.snyk.io/implementation-and-setup/team-implementation-guide/phase-2-configure-your-organization/configure-integrations) (Enterprise implementation guide, Phase 2);\
[Import Projects](https://docs.snyk.io/implementation-and-setup/team-implementation-guide/phase-3-gain-visibility/import-projects) (Enterprise implementation guide, Phase 3);\
[Tool: snyk-api-import](https://docs.snyk.io/scan-with-snyk/snyk-tools/tool-snyk-api-import)\
[api-import Creating import targets data for import](https://docs.snyk.io/scan-with-snyk/snyk-tools/tool-snyk-api-import/creating-import-targets-data-for-import-command)\
[api-import Kicking off an import](https://docs.snyk.io/scan-with-snyk/snyk-tools/tool-snyk-api-import/kicking-off-an-import)\
[Scenario: Identify and import new repositories only](https://docs.snyk.io/snyk-api/scenarios-for-using-the-snyk-api#identify-and-import-new-repositories-only)\
[Scenario: Detect and import new Projects in a repository into a target](https://docs.snyk.io/snyk-api/scenarios-for-using-the-snyk-api#detect-new-projects-files-in-repositories-and-import-them-into-a-target-in-snyk-on-a-regular-basis)\
[Scenario: Detect new Projects (files) in repositories and import them into a Target in Snyk on a regular basis](https://docs.snyk.io/snyk-api/scenarios-for-using-the-snyk-api#detect-new-projects-files-in-repositories-and-import-them-into-a-target-in-snyk-on-a-regular-basis)\
[Import fresh container images](https://docs.snyk.io/snyk-api/scenarios-for-using-the-snyk-api#import-fresh-container-images)\
[Manage code vulnerabilities](https://docs.snyk.io/scan-with-snyk/snyk-code/manage-code-vulnerabilities) (Use: Automate importing multiple repositories)
**More information Get import job details:** [Scenario: Import fresh container images](https://docs.snyk.io/snyk-api/scenarios-for-using-the-snyk-api#import-fresh-container-images);\
[Tool: snyk-api-import](https://docs.snyk.io/scan-with-snyk/snyk-tools/tool-snyk-api-import)\
[api-import Creating import targets data for import](https://docs.snyk.io/scan-with-snyk/snyk-tools/tool-snyk-api-import/creating-import-targets-data-for-import-command)\
[api-import Kicking off an import](https://docs.snyk.io/scan-with-snyk/snyk-tools/tool-snyk-api-import/kicking-off-an-import)
## Integrations (v1)
### [Add new integration](https://docs.snyk.io/reference/integrations-v1#org-orgid-integrations)
**More information:** [Scenario: Rotate or change your Broker token for any reason](https://docs.snyk.io/snyk-api/scenarios-for-using-the-snyk-api#rotate-or-change-your-broker-token-for-any-reason)
### [List](https://docs.snyk.io/reference/integrations-v1#org-orgid-integrations-1)
**More information:** [Scenario: For a specific event or time, disable all interactions (pull requests, tests) from Snyk to the code base (source control management)](https://docs.snyk.io/snyk-api/scenarios-for-using-the-snyk-api#for-a-specific-event-or-time-disable-all-interactions-pull-requests-tests-from-snyk-to-the-code-base);\
[api-import Creating import targets data for import](https://docs.snyk.io/scan-with-snyk/snyk-tools/tool-snyk-api-import/creating-import-targets-data-for-import-command)
### [Get existing integration by type](https://docs.snyk.io/reference/integrations-v1#org-orgid-integrations-type)
### [Update existing integration](https://docs.snyk.io/reference/integrations-v1#org-orgid-integrations-integrationid)
**More information:** [Obtain the required tokens for setup](https://docs.snyk.io/implementation-and-setup/enterprise-setup/snyk-broker/classic-broker/prepare-snyk-broker-for-deployment/obtain-the-tokens-required-to-set-up-snyk-broker);\
[Scenario: For a specific event or time, disable all interactions (pull requests, tests) from Snyk to the code base (source control management](https://docs.snyk.io/snyk-api/scenarios-for-using-the-snyk-api#for-a-specific-event-or-time-disable-all-interactions-pull-requests-tests-from-snyk-to-the-code-base); [Examples for the Update existing integration endpoint](https://docs.snyk.io/snyk-api/api-endpoints-index-and-tips/examples-for-the-update-existing-integration-endpoint)
### [Update](https://docs.snyk.io/reference/integrations-v1#org-orgid-integrations-integrationid-settings)
### [Retrieve](https://docs.snyk.io/reference/integrations-v1#org-orgid-integrations-integrationid-settings-1)
### [Clone an integration (with settings and credentials)](https://docs.snyk.io/reference/integrations-v1#org-orgid-integrations-integrationid-clone)
**More information:** [Prepare Snyk Broker for deployment](https://docs.snyk.io/implementation-and-setup/enterprise-setup/snyk-broker/classic-broker/prepare-snyk-broker-for-deployment);\
[Obtain the required tokens for setup](https://docs.snyk.io/implementation-and-setup/enterprise-setup/snyk-broker/classic-broker/prepare-snyk-broker-for-deployment/obtain-the-tokens-required-to-set-up-snyk-broker);\
Scenario: [Create multiple new Organizations that all have the same settings in a given Group](https://docs.snyk.io/snyk-api/scenarios-for-using-the-snyk-api#create-multiple-new-organizations-that-all-have-the-same-settings-in-a-given-group)
### [Delete credentials](https://docs.snyk.io/reference/integrations-v1#org-orgid-integrations-integrationid-authentication)
### [Switch between broker tokens](https://docs.snyk.io/reference/integrations-v1#org-orgid-integrations-integrationid-authentication-switch-token)
### [Provision new broker token](https://docs.snyk.io/reference/integrations-v1#org-orgid-integrations-integrationid-authentication-provision-token)
**More information:** [Obtain the required tokens for setup](https://docs.snyk.io/implementation-and-setup/enterprise-setup/snyk-broker/classic-broker/prepare-snyk-broker-for-deployment/obtain-the-tokens-required-to-set-up-snyk-broker)
## Invites
See also [Invite users](https://docs.snyk.io/reference/organizations-v1#org-orgid-invite).
### [Invite a user to an organization](https://docs.snyk.io/reference/invites#orgs-org_id-invites)
### [List pending user invitations to an organization](https://docs.snyk.io/reference/invites#orgs-org_id-invites-1)
### [Cancel a pending user invitation to an organization](https://docs.snyk.io/reference/invites#orgs-org_id-invites-invite_id)
## Issues
### [List issues for a package](https://docs.snyk.io/reference/issues#orgs-org_id-packages-purl-issues)
**More information:** [Dart and Flutter](https://docs.snyk.io/supported-languages/supported-languages-list/dart-and-flutter);\
[Rust](https://docs.snyk.io/supported-languages/supported-languages-list/rust);\
[Guidance for Snyk for C++ page, Alternate testing options section](https://docs.snyk.io/supported-languages/supported-languages-list/c-c++/guidance-for-snyk-for-c-c++#alternate-testing-options);\
[Guidance for Java and Kotlin](https://docs.snyk.io/supported-languages/supported-languages-list/java-and-kotlin);\
[Guidance for JavaScript and Node.js](https://docs.snyk.io/supported-languages/supported-languages-list/javascript#unmanaged-javascript);\
[List issues for a package page](https://docs.snyk.io/snyk-api/using-specific-snyk-apis/issues-list-issues-for-a-package)
### [List issues for a given set of packages](https://docs.snyk.io/reference/issues#orgs-org_id-packages-issues) (not available to all customers)
### [Get issues by org ID](https://docs.snyk.io/reference/issues#orgs-org_id-issues)
As of April 2025, you can retrieve Snyk Code issues using this endpoint. The response includes the primary file path and primary region in the `source_location` data under `representations` in `coordinates` for an issue.
**More information:** [Scenario: Bulk ignore issues](https://docs.snyk.io/snyk-api/scenarios-for-using-the-snyk-api#bulk-ignore-issues);\
[List all issues including Snyk Code issues in all the Projects in an Organization](https://docs.snyk.io/snyk-api/scenarios-for-using-the-snyk-api#list-all-issues-including-snyk-code-issues-in-all-the-projects-in-an-organization)
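A sketch of pulling those locations out of a decoded issue payload follows, assuming the nesting described above (`coordinates` containing `representations` containing `source_location`) and that these fields sit under an `attributes` key; both the key placement and the snake_case field names are assumptions to verify against a real response.

```python
def primary_source_locations(issue: dict) -> list:
    """Collect `source_location` entries from a Snyk Code issue.

    Walks coordinates -> representations -> source_location, following
    the nesting described for this endpoint. The `attributes` wrapper
    and exact field names are assumptions, not a documented contract.
    """
    locations = []
    for coord in issue.get("attributes", {}).get("coordinates", []):
        for rep in coord.get("representations", []):
            loc = rep.get("source_location")
            if loc:
                locations.append(loc)
    return locations
```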
### [Get an issue](https://docs.snyk.io/reference/issues#orgs-org_id-issues-issue_id) (for an Organization)
### [Get issues by group ID](https://docs.snyk.io/reference/issues#groups-group_id-issues)
**Note:** Remedies are not included in the response.
As of April 2025, you can retrieve Snyk Code issues using this endpoint. The response includes the primary file path and primary region in the `source_location` data under `representations` in `coordinates` for an issue.
**More information:** [Reachability](https://docs.snyk.io/manage-risk/prioritize-issues-for-fixing/reachability-analysis)
### [Get an issue](https://docs.snyk.io/reference/issues#groups-group_id-issues-issue_id) (for a Group)
## Jira (v1)
### [List all jira issues](https://docs.snyk.io/reference/jira-v1#org-orgid-project-projectid-jira-issues)
**More information:** [Jira integration](https://docs.snyk.io/integrations/jira-and-slack-integrations/jira-integration); [Snyk test and snyk monitor in CI/CD integration](https://docs.snyk.io/developer-tools/snyk-ci-cd-integrations/snyk-ci-cd-integration-deployment-and-strategies/snyk-test-and-snyk-monitor-in-ci-cd-integration)
### [Create jira issue](https://docs.snyk.io/reference/jira-v1#org-orgid-project-projectid-issue-issueid-jira-issue)
**More information:** [Jira integration](https://docs.snyk.io/integrations/jira-and-slack-integrations/jira-integration);\
[Snyk test and snyk monitor in CI/CD integration](https://docs.snyk.io/developer-tools/snyk-ci-cd-integrations/snyk-ci-cd-integration-deployment-and-strategies/snyk-test-and-snyk-monitor-in-ci-cd-integration)
## Licenses (v1)
### [List all licenses](https://docs.snyk.io/snyk-api/reference/licenses-v1)
## Monitor (v1)
### [Monitor Dep Graph](https://docs.snyk.io/snyk-api/reference/monitor-v1)
**More information:** [Dep Graph API (Bazel)](https://docs.snyk.io/scan-with-snyk/snyk-open-source/snyk-for-bazel/dep-graph-api)
## Organizations (v1)
**More information:** [Webhook events and payloads](https://docs.snyk.io/snyk-api/using-specific-snyk-apis/webhooks-apis/webhooks)
### [List all the organizations a user belongs to](https://docs.snyk.io/reference/organizations-v1#orgs)
**More information:** [Organization and Group identification for Projects using the API](https://docs.snyk.io/snyk-api/api-endpoints-index-and-tips/organization-and-group-identification-for-projects-using-the-api);\
[Scenario: Rotate or change your Broker token for any reason](https://docs.snyk.io/snyk-api/scenarios-for-using-the-snyk-api#rotate-or-change-your-broker-token-for-any-reason)
### [Create a new organization](https://docs.snyk.io/reference/organizations-v1#org)
**More information:** [Set visibility and configure an Organization template](https://docs.snyk.io/implementation-and-setup/enterprise-implementation-guide/phase-2-configure-account/set-visibility-and-configure-an-organization-template) (Enterprise implementation guide Phase 2, Configure accounts);\
[api-import: Creating organizations in Snyk](https://docs.snyk.io/scan-with-snyk/snyk-tools/tool-snyk-api-import/creating-organizations-in-snyk);\
[Scenario: Create multiple new Organizations that all have the same settings in a given Group](https://docs.snyk.io/snyk-api/scenarios-for-using-the-snyk-api#create-multiple-new-organizations-that-all-have-the-same-settings-in-a-given-group)
### [Remove organization](https://docs.snyk.io/reference/organizations-v1#org-orgid)
### [Update organization settings](https://docs.snyk.io/reference/organizations-v1#org-orgid-settings)
The only editable attribute of Update organization settings is `requestAccess`.
**More information:** [Scenario: Create multiple new Organizations that all have the same settings in a given Group](https://docs.snyk.io/snyk-api/scenarios-for-using-the-snyk-api#create-multiple-new-organizations-that-all-have-the-same-settings-in-a-given-group)
### [View organization settings](https://docs.snyk.io/reference/organizations-v1#org-orgid-settings-1)
**More information:** [Scenario: Create multiple new Organizations that all have the same settings in a given Group](https://docs.snyk.io/snyk-api/scenarios-for-using-the-snyk-api#create-multiple-new-organizations-that-all-have-the-same-settings-in-a-given-group)
### [Provision a user to the organization](https://docs.snyk.io/reference/organizations-v1#org-orgid-provision)
**More information:** [Provision users to Organizations using the API](https://docs.snyk.io/snyk-platform-administration/user-management-with-the-api/provision-users-to-organizations-using-the-api);\
[Scenario: Add users to organizations at scale ahead of the first login](https://docs.snyk.io/snyk-api/scenarios-for-using-the-snyk-api#add-users-to-organizations-at-scale-ahead-of-the-first-login)
### [List pending user provisions](https://docs.snyk.io/reference/organizations-v1#org-orgid-provision-1)
**More information:** [Provision users to Organizations using the API](https://docs.snyk.io/snyk-platform-administration/user-management-with-the-api/provision-users-to-organizations-using-the-api)
### [Delete pending user provision](https://docs.snyk.io/reference/organizations-v1#org-orgid-provision-2)
**More information:** [Provision users to Organizations using the API](https://docs.snyk.io/snyk-platform-administration/user-management-with-the-api/provision-users-to-organizations-using-the-api)
### [Set notification settings](https://docs.snyk.io/reference/organizations-v1#org-orgid-notification-settings)
**More information:** [api-import Creating import targets data for import](https://docs.snyk.io/scan-with-snyk/snyk-tools/tool-snyk-api-import/creating-import-targets-data-for-import-command);\
[Tool: snyk-api-import](https://docs.snyk.io/scan-with-snyk/snyk-tools/tool-snyk-api-import)
### [Get organization notification settings](https://docs.snyk.io/reference/organizations-v1#org-orgid-notification-settings-1)
### [List members](https://docs.snyk.io/reference/organizations-v1#org-orgid-members)
**More information:** [Update member roles using the API](https://docs.snyk.io/snyk-platform-administration/user-management-with-the-api/update-member-roles-using-the-api); [Remove members from Groups and Orgs using the API](https://docs.snyk.io/snyk-platform-administration/user-management-with-the-api/remove-members-from-groups-and-orgs-using-the-api)
### [Update a member in the organization](https://docs.snyk.io/reference/organizations-v1#org-orgid-members-userid)
**More information:** [User role management](https://docs.snyk.io/snyk-platform-administration/user-roles/user-role-management)
### [Remove a member from the organization](https://docs.snyk.io/reference/organizations-v1#org-orgid-members-userid-1)
**More information:** [Remove members from Groups and Orgs using the API](https://docs.snyk.io/snyk-platform-administration/user-management-with-the-api/remove-members-from-groups-and-orgs-using-the-api);\
[User role management](https://docs.snyk.io/snyk-platform-administration/user-roles/user-role-management)
### [Update a member's role in the organization](https://docs.snyk.io/reference/organizations-v1#org-orgid-members-update-userid)
**More information:** [User role management](https://docs.snyk.io/snyk-platform-administration/user-roles/user-role-management); [Update member roles using the API](https://docs.snyk.io/snyk-platform-administration/user-management-with-the-api/update-member-roles-using-the-api)
### [Invite users](https://docs.snyk.io/reference/organizations-v1#org-orgid-invite)
**More information:** [Update member roles using the API](https://docs.snyk.io/snyk-platform-administration/user-management-with-the-api/update-member-roles-using-the-api);\
[Scenario: Assign all users in a given list to all the Organizations a company has (all Organizations in a Group)](https://docs.snyk.io/snyk-api/scenarios-for-using-the-snyk-api#assign-all-users-in-a-given-list-to-all-the-organizations-a-company-has-all-organizations-in-a-group)
## Orgs (GA and beta)
### [List accessible organizations](https://docs.snyk.io/reference/orgs#orgs)
**More information:** [Prerequisites for Snyk Apps](https://docs.snyk.io/snyk-api/using-specific-snyk-apis/snyk-apps-apis/prerequisites-for-snyk-apps);\
[Organization and Group identification for Projects using the API](https://docs.snyk.io/snyk-api/api-endpoints-index-and-tips/organization-and-group-identification-for-projects-using-the-api)
### [Update organization](https://docs.snyk.io/reference/orgs#orgs-org_id)
### [Create an org membership for a user with role](https://docs.snyk.io/reference/orgs#orgs-org_id-memberships)
### [Get all memberships of the org](https://docs.snyk.io/reference/orgs#orgs-org_id-memberships-1)
### [Update an org membership for a user with role](https://docs.snyk.io/reference/orgs#orgs-org_id-memberships-membership_id)
### [List all organizations in a group](https://docs.snyk.io/reference/orgs#groups-group_id-orgs)
### [Get an ORG](https://apidocs.snyk.io/?version=2024-10-15#get-/orgs/-org_id-) (beta)
**More information:** [Org and group identification for Projects](https://docs.snyk.io/snyk-api/api-endpoints-index-and-tips/organization-and-group-identification-for-projects-using-the-api)
## Projects (v1)
**More information:** [Project type responses from API](https://docs.snyk.io/snyk-api/api-endpoints-index-and-tips/project-type-responses-from-the-api);\
[Webhook events and payloads](https://docs.snyk.io/snyk-api/using-specific-snyk-apis/webhooks-apis/webhooks)
### [Update a project](https://docs.snyk.io/reference/projects-v1#org-orgid-project-projectid)
### [Retrieve a single project](https://docs.snyk.io/reference/projects-v1#org-orgid-project-projectid-1)
**More information:** [Project type responses from the API](https://docs.snyk.io/snyk-api/api-endpoints-index-and-tips/project-type-responses-from-the-api)
### [Delete a project](https://docs.snyk.io/reference/projects-v1#org-orgid-project-projectid-2)
**More information:** [Project type responses from the API](https://docs.snyk.io/snyk-api/api-endpoints-index-and-tips/project-type-responses-from-the-api); [Scenario: Import fresh container images](https://docs.snyk.io/snyk-api/scenarios-for-using-the-snyk-api#import-fresh-container-images)
### [Add a tag to a project](https://docs.snyk.io/reference/projects-v1#org-orgid-project-projectid-tags)
**More information:** [Project tags](https://docs.snyk.io/snyk-platform-administration/snyk-projects/project-tags); [Set up Insights: Associating Snyk Open Source, Code, and Container Projects](https://docs.snyk.io/manage-risk/prioritize-issues-for-fixing/set-up-insights/set-up-insights-associating-snyk-open-source-code-and-container-projects);\
[Scenario: Rotate or change your Broker token for any reason](https://docs.snyk.io/snyk-api/scenarios-for-using-the-snyk-api#rotate-or-change-your-broker-token-for-any-reason)
### [Remove a tag from a project](https://docs.snyk.io/reference/projects-v1#org-orgid-project-projectid-tags-remove)
**More information:** [Project tags](https://docs.snyk.io/snyk-platform-administration/snyk-projects/project-tags)
### [Update project settings](https://docs.snyk.io/reference/projects-v1#org-orgid-project-projectid-settings)
### [List project settings](https://docs.snyk.io/reference/projects-v1#org-orgid-project-projectid-settings-1)
### [Delete project settings](https://docs.snyk.io/reference/projects-v1#org-orgid-project-projectid-settings-2)
### [Move project to a different organization](https://docs.snyk.io/reference/projects-v1#org-orgid-project-projectid-move)
**More information:** [Scenario: Move projects from one organization to another](https://docs.snyk.io/snyk-api/scenarios-for-using-the-snyk-api#move-projects-from-one-organization-to-another)
### [List all project issue paths](https://docs.snyk.io/reference/projects-v1#org-orgid-project-projectid-issue-issueid-paths)
**More information:** [Project issue paths API endpoints](https://docs.snyk.io/snyk-api/api-endpoints-index-and-tips/project-issue-paths-api-endpoints)
### [Get Project dependency graph](https://docs.snyk.io/reference/projects-v1#org-orgid-project-projectid-dep-graph)
### [Deactivate](https://docs.snyk.io/reference/projects-v1#org-orgid-project-projectid-deactivate) (a project)
### [Applying (project) attributes](https://docs.snyk.io/reference/projects-v1#org-orgid-project-projectid-attributes)
By using the API endpoint Applying attributes, you can set attributes for Snyk Projects, including business criticality, lifecycle stage, and environment, after the project has been created. To do so:
* Import the project using the API endpoint [Import targets](https://docs.snyk.io/reference/import-projects-v1#org-orgid-integrations-integrationid-import).
* Get the status API ID from [Import targets](https://docs.snyk.io/reference/import-projects-v1#org-orgid-integrations-integrationid-import).
* Poll using the endpoint [Import job details](https://docs.snyk.io/reference/import-projects-v1#org-orgid-integrations-integrationid-import-jobid) until all imports have completed.
* Parse the project IDs from the `projectURL` field.
* Use the endpoint [Applying attributes](https://docs.snyk.io/reference/projects-v1#org-orgid-project-projectid-attributes) to set the project attributes.
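The project IDs in the steps above have to be parsed out of the `projectUrl` values returned while polling. A minimal sketch, assuming each URL ends with the project ID as its last path segment (an assumption about the URL shape, not a documented contract):

```python
from urllib.parse import urlparse


def project_id_from_url(project_url: str) -> str:
    """Extract the trailing project ID from a `projectUrl` value.

    Assumes the URL ends with the project ID, for example
    https://app.snyk.io/org/my-org/project/<project-id>.
    """
    path = urlparse(project_url).path.rstrip("/")
    return path.rsplit("/", 1)[-1]
```

Each extracted ID can then be passed to the Applying attributes endpoint to set business criticality, lifecycle, and environment.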
**More information:** [Project attributes](https://docs.snyk.io/snyk-platform-administration/snyk-projects/project-attributes)
### [List all Aggregated (Project) issues](https://docs.snyk.io/reference/projects-v1#org-orgid-project-projectid-aggregated-issues)
The Snyk V1 API endpoint [List all aggregated issues](https://docs.snyk.io/reference/projects-v1#org-orgid-project-projectid-aggregated-issues) returns an array of `ignoreReasons` for each vulnerability. Ignores implemented using the CLI and API are path-based, so different paths can have different `ignoreReasons`. Because List all aggregated issues returns only one issue for all paths, the entire set of reasons is returned. Snyk groups issues together by their identifier, so one response from this endpoint can correspond to the same issue across multiple paths; the `ignoreReasons` are aggregated across all of those issues and apply to the single grouped issue.
**More information:** [Scenario: List all issues including Snyk Code issues in all the Projects in an Organization](https://docs.snyk.io/snyk-api/scenarios-for-using-the-snyk-api#list-all-issues-including-snyk-code-issues-in-all-the-projects-in-an-organization)
### [Activate](https://docs.snyk.io/reference/projects-v1#org-orgid-project-projectid-activate) (a project)
## Projects
### [List all Projects for an Org with the given Org ID](https://docs.snyk.io/reference/projects#orgs-org_id-projects)
The query-string parameter for types is optional. The endpoint does not enforce specific project types and returns `no matching projects` if you enter a string that does not match a requested project type. See [Project type responses from the API](https://docs.snyk.io/snyk-api/api-endpoints-index-and-tips/project-type-responses-from-the-api) for a list of project types.
**More information:** [Slack app (for Jira integration)](https://docs.snyk.io/integrations/jira-and-slack-integrations/slack-app) (Use: Find your Project ID);\
[Snyk Projects](https://docs.snyk.io/snyk-platform-administration/snyk-projects);\
[Project information](https://docs.snyk.io/snyk-platform-administration/snyk-projects/project-information);\
[Project attributes](https://docs.snyk.io/snyk-platform-administration/snyk-projects/project-attributes);\
[Scenario: Find all Projects affected by a vulnerability](https://docs.snyk.io/snyk-api/scenarios-for-using-the-snyk-api#find-all-projects-affected-by-a-vulnerability);\
[Scenario: List all issues including Snyk Code issues in all the Projects in an Organization](https://docs.snyk.io/snyk-api/scenarios-for-using-the-snyk-api#list-all-issues-including-snyk-code-issues-in-all-the-projects-in-an-organization);\
[Scenario: Bulk ignore issues](https://docs.snyk.io/snyk-api/scenarios-for-using-the-snyk-api#bulk-ignore-issues);\
[Scenario: Tag all Projects in Snyk](https://docs.snyk.io/snyk-api/scenarios-for-using-the-snyk-api#tag-all-projects-in-snyk);\
[Scenario: Import fresh container images](https://docs.snyk.io/snyk-api/scenarios-for-using-the-snyk-api#import-fresh-container-images);\
[Scenario: Detect and import new Projects in a repository into a target](https://docs.snyk.io/snyk-api/scenarios-for-using-the-snyk-api#detect-new-projects-files-in-repositories-and-import-them-into-a-target-in-snyk-on-a-regular-basis)
### [Updates project by project ID](https://docs.snyk.io/reference/projects#orgs-org_id-projects-project_id)
**More information:** [View and edit Project settings](https://docs.snyk.io/snyk-platform-administration/snyk-projects/view-and-edit-project-settings);\
[Start scanning](https://docs.snyk.io/scan-with-snyk/start-scanning) (Use: Set test frequency)
### [Get project by project ID](https://docs.snyk.io/reference/projects#orgs-org_id-projects-project_id-1)
### [Delete project by project ID](https://docs.snyk.io/reference/projects#orgs-org_id-projects-project_id-2)
## Pull request templates
### [Create or update pull request template for group](https://docs.snyk.io/reference/pull-request-templates#groups-group_id-settings-pull_request_template)
**More information:** [Create and manage a custom PR template using the API](https://docs.snyk.io/scan-with-snyk/pull-requests/snyk-pull-or-merge-requests/customize-pr-templates/apply-a-custom-pr-template#create-and-manage-a-custom-pr-template-using-the-api)
### [Get pull request template for group](https://docs.snyk.io/reference/pull-request-templates#groups-group_id-settings-pull_request_template-1)
### [Delete pull request template for group](https://docs.snyk.io/reference/pull-request-templates#groups-group_id-settings-pull_request_template-2)
## Reporting API (v1)
The V1 Reporting endpoints support only Snyk legacy reporting, not the latest release. Thus, these endpoints are not available in single-tenant implementations or in the multi-tenant regions US-02, EU, and AU. In those regions, use the [Issues](https://docs.snyk.io/snyk-api/reference/issues) REST API.
The V1 Reporting API underlies Snyk legacy reporting. Using the V1 Reporting API, you can find answers to questions like how many issues your Organization has, or how many tests have been conducted in a given time period.
The rate limit is up to 70 requests per minute, per user. For all requests above the limit, the response will have the status code `429: Too many requests`, until requests stop for the duration of the rate-limiting interval (one minute). For more information see [Rate limiting for V1 API](https://docs.snyk.io/v1-api#rate-limiting).
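The `429` behavior described above can be handled client-side with a small retry helper that waits out the one-minute rate-limiting interval before trying again. This is an illustrative sketch, not part of any Snyk SDK; the injectable `do_request` and `sleep` parameters are ours, added so the behavior is easy to test:

```python
import time

def call_with_retry(do_request, max_retries=3, wait_seconds=60, sleep=time.sleep):
    """Call do_request(); on a 429 response, wait out the rate-limit
    interval (one minute by default, per the docs) and try again.

    do_request must return an object exposing a `status_code` attribute.
    """
    response = do_request()
    for _ in range(max_retries):
        if response.status_code != 429:
            break
        sleep(wait_seconds)
        response = do_request()
    return response
```

Because the dependencies are injected, the retry logic can be exercised without real network calls or real sleeps.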
**More information:** [Legacy reports](https://docs.snyk.io/manage-risk/reporting/legacy-reports);\
[Dependencies and licenses](https://docs.snyk.io/manage-risk/reporting/dependencies-and-licenses)
### [Get list of issues](https://docs.snyk.io/reference/reporting-api-v1#reporting-issues)
See notes for [Get list of latest issues](#get-list-of-latest-issues).
### [Get list of latest issues](https://docs.snyk.io/reference/reporting-api-v1#reporting-issues-latest)
To list all Projects that have a vulnerability linked to a CVE, use the capability to filter on strings with the reporting endpoints [Get list of latest issues](https://docs.snyk.io/reference/reporting-api-v1#reporting-issues-latest) and [Get List of issues](https://docs.snyk.io/reference/reporting-api-v1#reporting-issues). Filter by the identifier attribute.
To get a list of issues that have been fixed, use the endpoint [Get list of latest issues](https://docs.snyk.io/reference/reporting-api-v1#reporting-issues-latest) and filter by `"isFixed": true` in the request body. This endpoint also provides a [list of all IaC issues](https://docs.snyk.io/scan-with-snyk/snyk-iac/view-snyk-iac-issue-reports#api-access-to-iac-issues).
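The two filters just described can be combined into a request body like the following. This is a hedged sketch: the filter names (`identifier`, `isFixed`) come from this page, but the full request schema, including the `orgs` field used here, should be verified against the Snyk API reference before use:

```python
import json

def build_issue_filters(org_ids, identifier=None, is_fixed=None):
    """Assemble a filters payload for the v1 reporting issue endpoints.

    The filter names (`identifier`, `isFixed`) are taken from this page;
    verify the full request schema against the Snyk API reference.
    """
    filters = {"orgs": list(org_ids)}
    if identifier is not None:
        filters["identifier"] = identifier   # for example, a CVE string
    if is_fixed is not None:
        filters["isFixed"] = is_fixed        # True -> only fixed issues
    return {"filters": filters}

# Serialize for the POST body of "Get list of latest issues".
payload = json.dumps(build_issue_filters(["my-org-id"], is_fixed=True))
```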
**More information:** [Priority score](https://docs.snyk.io/manage-risk/prioritize-issues-for-fixing/priority-score);\
[View Snyk IaC issue reports](https://docs.snyk.io/scan-with-snyk/snyk-iac/view-snyk-iac-issue-reports);\
[Scenario: Retrieve a Project snapshot for every Project in a given Group](https://docs.snyk.io/snyk-api/scenarios-for-using-the-snyk-api#retrieve-a-project-snapshot-for-every-project-in-a-given-group);\
[Scenario: Bulk ignore issues](https://docs.snyk.io/snyk-api/scenarios-for-using-the-snyk-api#bulk-ignore-issues);\
[Scenario: Find all Projects affected by a vulnerability](https://docs.snyk.io/snyk-api/scenarios-for-using-the-snyk-api#find-all-projects-affected-by-a-vulnerability)
### [Get test counts](https://docs.snyk.io/reference/reporting-api-v1#reporting-counts-tests)
### [Get project counts](https://docs.snyk.io/reference/reporting-api-v1#reporting-counts-projects)
### [Get latest project counts](https://docs.snyk.io/reference/reporting-api-v1#reporting-counts-projects-latest)
### [Get issue counts](https://docs.snyk.io/reference/reporting-api-v1#reporting-counts-issues)
### [Get latest issue counts](https://docs.snyk.io/reference/reporting-api-v1#reporting-counts-issues-latest)
## SBOM (GA and beta)
**More information:** [Rust](https://docs.snyk.io/supported-languages/supported-languages-list/rust); [SBOM test endpoints](https://docs.snyk.io/snyk-api/using-specific-snyk-apis/sbom-apis/rest-api-endpoint-test-an-sbom-document-for-vulnerabilities)
### [Get a project’s SBOM document](https://docs.snyk.io/snyk-api/reference/sbom)
**More information:** [Get a project’s SBOM document](https://docs.snyk.io/snyk-api/using-specific-snyk-apis/sbom-apis/rest-api-get-a-projects-sbom-document)
### [Create an SBOM test run](https://apidocs.snyk.io/?version=2024-10-15#post-/orgs/-org_id-/sbom_tests) (beta)
**More information:** [Test an SBOM document for vulnerabilities](https://docs.snyk.io/snyk-api/using-specific-snyk-apis/sbom-apis/rest-api-endpoint-test-an-sbom-document-for-vulnerabilities)
### [Gets an SBOM test run status](https://apidocs.snyk.io/?version=2024-10-15#get-/orgs/-org_id-/sbom_tests/-job_id-) (beta)
### [Gets an SBOM test run result](https://apidocs.snyk.io/?version=2024-10-15#get-/orgs/-org_id-/sbom_tests/-job_id-/results) (beta)
**More information:** [Test an SBOM document for vulnerabilities](https://docs.snyk.io/snyk-api/using-specific-snyk-apis/sbom-apis/rest-api-endpoint-test-an-sbom-document-for-vulnerabilities)
## SastSettings
### [Enable/Disable the Snyk Code settings for an org](https://docs.snyk.io/reference/sastsettings#orgs-org_id-settings-sast)
**More information:** [Enable Snyk Code](https://docs.snyk.io/implementation-and-setup/enterprise-implementation-guide/phase-2-configure-account/set-visibility-and-configure-an-organization-template/enable-snyk-code) (Enterprise implementation guide, Phase 2)
### [Retrieves the SAST settings for an org](https://docs.snyk.io/reference/sastsettings#orgs-org_id-settings-sast-1)
## ServiceAccounts
**More information:** [Manage service accounts using the Snyk API](https://docs.snyk.io/implementation-and-setup/enterprise-setup/service-accounts/manage-service-accounts-using-the-snyk-api); [Choose a service account type to use with Snyk APIs](https://docs.snyk.io/implementation-and-setup/enterprise-setup/service-accounts/choose-a-service-account-type-to-use-with-snyk-apis)
### [Create a service account for an organization](https://docs.snyk.io/reference/serviceaccounts#orgs-org_id-service_accounts)
**More information:** [Service accounts using OAuth 2.0](https://docs.snyk.io/implementation-and-setup/enterprise-setup/service-accounts/service-accounts-using-oauth-2.0);\
[Manage service accounts using the Snyk API](https://docs.snyk.io/implementation-and-setup/enterprise-setup/service-accounts/manage-service-accounts-using-the-snyk-api)
### [Get a list of organization service accounts](https://docs.snyk.io/reference/serviceaccounts#orgs-org_id-service_accounts-1)
**More information:** [Manage service accounts using the Snyk API](https://docs.snyk.io/implementation-and-setup/enterprise-setup/service-accounts/manage-service-accounts-using-the-snyk-api)
### [Update an organization service account](https://docs.snyk.io/reference/serviceaccounts#orgs-org_id-service_accounts-serviceaccount_id)
**More information:** [Manage service accounts using the Snyk API](https://docs.snyk.io/implementation-and-setup/enterprise-setup/service-accounts/manage-service-accounts-using-the-snyk-api)
### [Get an organization service account](https://docs.snyk.io/reference/serviceaccounts#orgs-org_id-service_accounts-serviceaccount_id-1)
**More information:** [Manage service accounts using the Snyk API](https://docs.snyk.io/implementation-and-setup/enterprise-setup/service-accounts/manage-service-accounts-using-the-snyk-api)
### [Delete a service account in an organization](https://docs.snyk.io/reference/serviceaccounts#orgs-org_id-service_accounts-serviceaccount_id-2)
**More information:** [Manage service accounts using the Snyk API](https://docs.snyk.io/implementation-and-setup/enterprise-setup/service-accounts/manage-service-accounts-using-the-snyk-api)
### [Manage an organization service account’s client secret](https://docs.snyk.io/reference/serviceaccounts#orgs-org_id-service_accounts-serviceaccount_id-secrets)
**More information:** [Manage service accounts using the Snyk API](https://docs.snyk.io/implementation-and-setup/enterprise-setup/service-accounts/manage-service-accounts-using-the-snyk-api)
### [Create a service account for a group](https://docs.snyk.io/reference/serviceaccounts#groups-group_id-service_accounts)
**More information:** [Service accounts using OAuth 2.0](https://docs.snyk.io/implementation-and-setup/enterprise-setup/service-accounts/service-accounts-using-oauth-2.0);\
[Manage service accounts using the Snyk API](https://docs.snyk.io/implementation-and-setup/enterprise-setup/service-accounts/manage-service-accounts-using-the-snyk-api)
### [Get a list of group service accounts](https://docs.snyk.io/reference/serviceaccounts#groups-group_id-service_accounts-1)
**More information:** [Manage service accounts using the Snyk API](https://docs.snyk.io/implementation-and-setup/enterprise-setup/service-accounts/manage-service-accounts-using-the-snyk-api)
### [Update a group service account](https://docs.snyk.io/reference/serviceaccounts#groups-group_id-service_accounts-serviceaccount_id)
**More information:** [Manage service accounts using the Snyk API](https://docs.snyk.io/implementation-and-setup/enterprise-setup/service-accounts/manage-service-accounts-using-the-snyk-api)
### [Get a group service account](https://docs.snyk.io/reference/serviceaccounts#groups-group_id-service_accounts-serviceaccount_id-1)
**More information:** [Manage service accounts using the Snyk API](https://docs.snyk.io/implementation-and-setup/enterprise-setup/service-accounts/manage-service-accounts-using-the-snyk-api)
### [Delete a group service account](https://docs.snyk.io/reference/serviceaccounts#groups-group_id-service_accounts-serviceaccount_id)
**More information:** [Manage service accounts using the Snyk API](https://docs.snyk.io/implementation-and-setup/enterprise-setup/service-accounts/manage-service-accounts-using-the-snyk-api)
### [Manage a group service account’s client secret](https://docs.snyk.io/reference/serviceaccounts#groups-group_id-service_accounts-serviceaccount_id-secrets)
**More information:** [Manage service accounts using the Snyk API](https://docs.snyk.io/implementation-and-setup/enterprise-setup/service-accounts/manage-service-accounts-using-the-snyk-api)
## SlackSettings
**More information:** [Slack app (for Jira integration)](https://docs.snyk.io/integrations/jira-and-slack-integrations/slack-app)
### [Create new Slack notification default settings](https://docs.snyk.io/reference/slacksettings#orgs-org_id-slack_app-bot_id)
### [Get Slack integration default notification settings](https://docs.snyk.io/reference/slacksettings#orgs-org_id-slack_app-bot_id-1)
### [Remove the given Slack App integration](https://docs.snyk.io/reference/slacksettings#orgs-org_id-slack_app-bot_id-2)
### [Slack notification settings override for projects](https://docs.snyk.io/reference/slacksettings#orgs-org_id-slack_app-bot_id-projects)
**More information:** [Slack app (for Jira integration)](https://docs.snyk.io/integrations/jira-and-slack-integrations/slack-app) (Use: List all Slack notification customizations for a Project);\
[snyk-api-import: Creating organizations in Snyk](https://docs.snyk.io/scan-with-snyk/snyk-tools/tool-snyk-api-import/creating-organizations-in-snyk)
### [Create a new Slack settings override for a given project](https://docs.snyk.io/reference/slacksettings#orgs-org_id-slack_app-bot_id-projects-project_id)
**More information:** [Slack app (for Jira integration)](https://docs.snyk.io/integrations/jira-and-slack-integrations/slack-app) (Use: Create a Slack notification customization for a Project)
### [Update Slack notification settings for a project](https://docs.snyk.io/reference/slacksettings#orgs-org_id-slack_app-bot_id-projects-project_id-1)
**More information:** [Slack app (for Jira integration)](https://docs.snyk.io/integrations/jira-and-slack-integrations/slack-app) (Use: Update a Slack notification customization for a Project)
### [Remove Slack settings override for a project](https://docs.snyk.io/reference/slacksettings#orgs-org_id-slack_app-bot_id-projects-project_id-2)
**More information:** [Slack app (for Jira integration)](https://docs.snyk.io/integrations/jira-and-slack-integrations/slack-app) (Use: Delete a Slack notification customization for a Project)
## Slack
### [Get a list of Slack channels](https://docs.snyk.io/reference/slack#orgs-org_id-slack_app-tenant_id-channels)
### [Get Slack Channel name by Slack Channel ID](https://docs.snyk.io/reference/slack#orgs-org_id-slack_app-tenant_id-channels-channel_id)
## Snapshots (v1)
### [List all project snapshots](https://docs.snyk.io/reference/snapshots-v1#org-orgid-project-projectid-history)
### [List all project snapshot issue paths](https://docs.snyk.io/reference/snapshots-v1#org-orgid-project-projectid-history-snapshotid-issue-issueid-paths)
**More information:** [Project issue paths API endpoints](https://docs.snyk.io/snyk-api/api-endpoints-index-and-tips/project-issue-paths-api-endpoints)
### [List all project snapshot aggregated issues](https://docs.snyk.io/reference/snapshots-v1#org-orgid-project-projectid-history-snapshotid-aggregated-issues)
## Targets
### [Get targets by org ID](https://docs.snyk.io/reference/targets#orgs-org_id-targets)
**More information:** [Target definition on the Snyk Projects page](https://docs.snyk.io/snyk-platform-administration/snyk-projects#target);\
[Scenario: Identify and import new repositories only](https://docs.snyk.io/snyk-api/scenarios-for-using-the-snyk-api#identify-and-import-new-repositories-only);\
[Scenario: Detect new Projects (files) in repositories and import them into a Target in Snyk on a regular basis](https://docs.snyk.io/snyk-api/scenarios-for-using-the-snyk-api#detect-new-projects-files-in-repositories-and-import-them-into-a-target-in-snyk-on-a-regular-basis)
### [Get target by target ID](https://docs.snyk.io/reference/targets#orgs-org_id-targets-target_id)
This endpoint retrieves a single Target by its ID, which is useful if you want to delete a Target by target ID.
### [Delete target by target ID](https://docs.snyk.io/reference/targets#orgs-org_id-targets-target_id-1)
This endpoint deletes the specified Target and automatically deletes all the Projects in that Target.
## Test (v1)
**More information:** [Guidance for Java and Kotlin](https://docs.snyk.io/supported-languages/supported-languages-list/java-and-kotlin);\
[Start scanning](https://docs.snyk.io/scan-with-snyk/start-scanning);\
[Scan open-source libraries and licenses](https://docs.snyk.io/scan-with-snyk/snyk-open-source/scan-open-source-libraries-and-licenses)
### [Test package.json & yarn-lock file](https://docs.snyk.io/reference/test-v1#test-yarn)
### [Test sbt file](https://docs.snyk.io/reference/test-v1#test-sbt)
### [Test for issues in a public package by group id, artifact id and version](https://docs.snyk.io/reference/test-v1#test-sbt-groupid-artifactid-version) (sbt)
### [Test gemfile.lock file](https://docs.snyk.io/reference/test-v1#test-rubygems)
### [Test for issues in a public gem by name and version](https://docs.snyk.io/reference/test-v1#test-rubygems-gemname-version)
### [Test requirements.txt file](https://docs.snyk.io/reference/test-v1#test-pip) (pip)
### [Test for issues in a public package by name and version](https://docs.snyk.io/reference/test-v1#test-pip-packagename-version) (pip)
### [Test package.json & package-lock.json file](https://docs.snyk.io/reference/test-v1#test-npm)
### [Test for issues in a public package by name and version](https://docs.snyk.io/reference/test-v1#test-npm-packagename-version) (npm)
**More information:** [Guidance for JavaScript and Node.js](https://docs.snyk.io/supported-languages/supported-languages-list/javascript#unmanaged-javascript)
### [Test maven file](https://docs.snyk.io/reference/test-v1#test-maven)
### [Test for issues in a public package by group id, artifact id and version](https://docs.snyk.io/reference/test-v1#test-maven-groupid-artifactid-version) (Maven)
**More information:** [Guidance for Java and Kotlin](https://docs.snyk.io/supported-languages/supported-languages-list/java-and-kotlin)
### [Test gradle file](https://docs.snyk.io/reference/test-v1#test-gradle)
### [Test for issues in a public package by group, name and version](https://docs.snyk.io/reference/test-v1#test-gradle-group-name-version) (Gradle)
### [Test vendor.json file](https://docs.snyk.io/reference/test-v1#test-govendor)
### [Test Gopkg.toml & Gopkg.lock File](https://docs.snyk.io/reference/test-v1#test-golangdep)
### [Test Dep Graph](https://docs.snyk.io/reference/test-v1#test-dep-graph)
**More information:** [Dep Graph API](https://docs.snyk.io/scan-with-snyk/snyk-open-source/snyk-for-bazel/dep-graph-api) (Bazel);\
[Unmanaged JavaScript](https://docs.snyk.io/supported-languages/supported-languages-list/javascript#unmanaged-javascript);\
[Start scanning](https://docs.snyk.io/scan-with-snyk/start-scanning)
### [Test composer.json & composer.lock file](https://docs.snyk.io/reference/test-v1#test-composer)
## Users (v1)
### [Get user details](https://docs.snyk.io/reference/users-v1#user-userid)
### [Get My Details](https://docs.snyk.io/reference/users-v1#user-me)
### [Modify organization notification settings](https://docs.snyk.io/reference/users-v1#user-me-notification-settings-org-orgid)
### [Get organization notification settings](https://docs.snyk.io/reference/users-v1#user-me-notification-settings-org-orgid-1)
### [Modify project notification settings](https://docs.snyk.io/reference/users-v1#user-me-notification-settings-org-orgid-project-projectid)
### [Get project notification settings](https://docs.snyk.io/reference/users-v1#user-me-notification-settings-org-orgid-project-projectid-1)
## Users
### [My User Details](https://docs.snyk.io/snyk-api/reference/users)
### [Update a user’s role in a group](https://apidocs.snyk.io/?version=2024-10-15#patch-/groups/-group_id-/users/-id-) (beta)
Note: Use this endpoint to remove users from a group.
**More information:** [Remove members from Groups and Orgs using the API](https://docs.snyk.io/snyk-platform-administration/user-management-with-the-api/remove-members-from-groups-and-orgs-using-the-api)
### [Get user by ID](https://apidocs.snyk.io/?version=2024-10-15#get-/orgs/-org_id-/users/-id-) (beta)
## Webhooks (v1)
### [Create a webhook](https://docs.snyk.io/reference/webhooks-v1#org-orgid-webhooks)
**More information:** [Scenario: For a specific event or time, disable all interactions (pull requests, tests) from Snyk to the code base (source control management)](https://docs.snyk.io/snyk-api/scenarios-for-using-the-snyk-api#for-a-specific-event-or-time-disable-all-interactions-pull-requests-tests-from-snyk-to-the-code-base)
### [List webhooks](https://docs.snyk.io/reference/webhooks-v1#org-orgid-webhooks-1)
### [Retrieve a webhook](https://docs.snyk.io/reference/webhooks-v1#org-orgid-webhooks-webhookid)
### [Delete a webhook](https://docs.snyk.io/reference/webhooks-v1#org-orgid-webhooks-webhookid-1)
**More information:** [Scenario: For a specific event or time, disable all interactions (pull requests, tests) from Snyk to the code base (source control management)](https://docs.snyk.io/snyk-api/scenarios-for-using-the-snyk-api#for-a-specific-event-or-time-disable-all-interactions-pull-requests-tests-from-snyk-to-the-code-base)
### [Ping a webhook](https://docs.snyk.io/reference/webhooks-v1#org-orgid-webhooks-webhookid-ping)
---
# Source: https://docs.snyk.io/snyk-api/api-end-of-life-eol-process-and-migration-guides/api-eol-endpoints-and-key-dates.md
# API EOL endpoints and key dates
## APIs at EOL
Beginning July 22, 2024, the following endpoints will follow the EOL process:
| Endpoint | Endpoint type | EOL date | Migration guide |
| ----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- | ------------- | ---------------- | ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- |
| /v1/reporting | v1 | April 27, 2026 | [V1 Reporting APIs to Export API migration guide](https://docs.snyk.io/snyk-api/api-end-of-life-eol-process-and-migration-guides/guides-to-migration/v1-reporting-apis-to-export-api-migration-guide) |
| [Group and org level audit logs](https://docs.snyk.io/snyk-api/reference/audit-logs) | v1 | January 22, 2025 | [Search Audit Logs (Group and Org) v1 API to GA REST Audit logs API migration guide](https://docs.snyk.io/snyk-api/api-end-of-life-eol-process-and-migration-guides/guides-to-migration/search-audit-logs-group-and-org-v1-api-to-ga-rest-audit-logs-api-migration-guide) |
| Get all issues by [Org](https://apidocs.snyk.io/experimental?version=2023-03-10~experimental&_gl=1*d7o8is*_gcl_aw*R0NMLjE3MTIwNjc4NjcuQ2owS0NRancyYTZ3QmhDVkFSSXNBQlBlSDF0VG1UNmo0cnNrQTVPRmNLVU02cFMyNVc1Q3lpWWhLRFVqZGdfWDZTREJ6Z0NWSGZTZUtzY2FBb3lORUFMd193Y0I.*_gcl_au*MTU3NDc2MzU2LjE3MTI5Mzg4MzA.*_ga*MTE2NjY3NTQyNC4xNjQ3OTU0NjA1*_ga_X9SH3KP7B4*MTcxOTQwNzU4My4yNjguMS4xNzE5NDA3ODA1LjQ5LjAuMA..#get-/orgs/-org_id-/issues) and [Group](https://apidocs.snyk.io/experimental?version=2023-03-10~experimental&_gl=1*d7o8is*_gcl_aw*R0NMLjE3MTIwNjc4NjcuQ2owS0NRancyYTZ3QmhDVkFSSXNBQlBlSDF0VG1UNmo0cnNrQTVPRmNLVU02cFMyNVc1Q3lpWWhLRFVqZGdfWDZTREJ6Z0NWSGZTZUtzY2FBb3lORUFMd193Y0I.*_gcl_au*MTU3NDc2MzU2LjE3MTI5Mzg4MzA.*_ga*MTE2NjY3NTQyNC4xNjQ3OTU0NjA1*_ga_X9SH3KP7B4*MTcxOTQwNzU4My4yNjguMS4xNzE5NDA3ODA1LjQ5LjAuMA..#get-/groups/-group_id-/issues)\* | Non-GA REST | October 22, 2024 | [REST Issues experimental API to GA API migration guide](https://docs.snyk.io/snyk-api/api-end-of-life-eol-process-and-migration-guides/guides-to-migration/rest-issues-experimental-api-to-ga-api-migration-guide) |
{% hint style="info" %}
\*Experimental versions from 2023-03-10 inclusive up to 2023-09-29 exclusive.
{% endhint %}
## Brownouts
A brownout occurs when Snyk temporarily suspends an endpoint, returning a `410 Gone` response when a user calls it.
Snyk brownouts for APIs that are part of an end-of-life cycle will occur at 12:00 UTC. For the end-of-life cycle beginning July 22, 2024, the brownouts will occur on the following dates. Users will see a reminder two weeks before the brownout through an announcement on [updates.snyk.io](http://updates.snyk.io/):
| Endpoints | Brownout date | Duration |
| ----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- | ------------- | -------- |
| Get all issues by [Org](https://apidocs.snyk.io/experimental?version=2023-03-10~experimental&_gl=1*d7o8is*_gcl_aw*R0NMLjE3MTIwNjc4NjcuQ2owS0NRancyYTZ3QmhDVkFSSXNBQlBlSDF0VG1UNmo0cnNrQTVPRmNLVU02cFMyNVc1Q3lpWWhLRFVqZGdfWDZTREJ6Z0NWSGZTZUtzY2FBb3lORUFMd193Y0I.*_gcl_au*MTU3NDc2MzU2LjE3MTI5Mzg4MzA.*_ga*MTE2NjY3NTQyNC4xNjQ3OTU0NjA1*_ga_X9SH3KP7B4*MTcxOTQwNzU4My4yNjguMS4xNzE5NDA3ODA1LjQ5LjAuMA..#get-/orgs/-org_id-/issues) and [Group](https://apidocs.snyk.io/experimental?version=2023-03-10~experimental&_gl=1*d7o8is*_gcl_aw*R0NMLjE3MTIwNjc4NjcuQ2owS0NRancyYTZ3QmhDVkFSSXNBQlBlSDF0VG1UNmo0cnNrQTVPRmNLVU02cFMyNVc1Q3lpWWhLRFVqZGdfWDZTREJ6Z0NWSGZTZUtzY2FBb3lORUFMd193Y0I.*_gcl_au*MTU3NDc2MzU2LjE3MTI5Mzg4MzA.*_ga*MTE2NjY3NTQyNC4xNjQ3OTU0NjA1*_ga_X9SH3KP7B4*MTcxOTQwNzU4My4yNjguMS4xNzE5NDA3ODA1LjQ5LjAuMA..#get-/groups/-group_id-/issues)\* | September 12 | 1 hour |
| [Group and org level audit logs](https://docs.snyk.io/snyk-api/reference/audit-logs) | October 8 | 1 hour |
| [Group and org level audit logs](https://docs.snyk.io/snyk-api/reference/audit-logs) | November 12 | 2 hours |
| [Group and org level audit logs](https://docs.snyk.io/snyk-api/reference/audit-logs) | December 10 | 4 hours |
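Because a brownout returns the same `410` status an endpoint returns after final removal, a client can cross-check the current time against the published schedule before treating the endpoint as permanently gone. A minimal, illustrative sketch (not Snyk tooling; the schedule literal mirrors the audit-logs rows in the table above, with the year inferred from the EOL cycle start):

```python
from datetime import datetime, timezone

def in_brownout_window(now_utc, schedule):
    """Return True if now_utc falls inside a scheduled brownout.

    schedule maps (year, month, day) -> duration in hours; every
    brownout in this EOL cycle starts at 12:00 UTC.
    """
    for (year, month, day), hours in schedule.items():
        start = datetime(year, month, day, 12, 0, tzinfo=timezone.utc)
        elapsed = (now_utc - start).total_seconds()
        if 0 <= elapsed < hours * 3600:
            return True
    return False

# Audit-logs brownout dates from the table above (year inferred
# from the July 22, 2024 start of this EOL cycle).
AUDIT_LOG_BROWNOUTS = {(2024, 10, 8): 1, (2024, 11, 12): 2, (2024, 12, 10): 4}
```

A `410` received inside one of these windows is plausibly a brownout and worth retrying later; outside them, treat the endpoint as retired.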
---
# Source: https://docs.snyk.io/developer-tools/snyk-cli/scan-and-maintain-projects-using-the-cli/cli-tools/snyk-scm-contributors-count/api-rate-limit-control-for-scm-contributors-count.md
# API rate limit control for scm-contributors-count
## Azure DevOps
Azure DevOps has a unique way of limiting the API call rate with their own "TSTU" concept as described in this [guide](https://docs.microsoft.com/en-us/azure/devops/integrate/concepts/rate-limits?view=azure-devops).
The `snyk-scm-contributors-count` tool applies a strict limit of a maximum of two calls per second to deal with the rate limit.
## Bitbucket Cloud
On Bitbucket Cloud, the API rate limit is 1,000 calls per hour for authenticated users as described in this [guide](https://support.atlassian.com/bitbucket-cloud/docs/api-request-limits/).
The `snyk-scm-contributors-count` tool applies a strict limit of a maximum of 1,000 calls per hour to deal with the rate limit and an additional regulating mechanism to deal with 429 responses ("too many calls").
## Bitbucket Server
On Bitbucket Server, the system admin has full control of the API rate limiting as described in this [guide](https://confluence.atlassian.com/bitbucketserver/improving-instance-stability-with-rate-limiting-976171954.html).
The `snyk-scm-contributors-count` tool applies a moderate limit of a maximum of 1,000 calls per hour to deal with the rate limit and an additional regulating mechanism to deal with 429 responses ("too many calls").
## GitHub
On GitHub, the API rate limit is 5,000 calls per hour for authenticated users as described in this [guide](https://docs.github.com/en/developers/apps/building-github-apps/rate-limits-for-github-apps).
The `snyk-scm-contributors-count` tool applies a strict limit of a maximum of 4,500 calls per hour to deal with the rate limit and an additional regulating mechanism to deal with 429 responses ("too many calls").
## GitHub Enterprise
On GitHub Enterprise, the API rate limit is 5,000 calls per hour for authenticated users as described in this [guide](https://docs.github.com/en/developers/apps/building-github-apps/rate-limits-for-github-apps).
The `snyk-scm-contributors-count` tool applies a strict limit of a maximum of 3 calls per second (10,800 calls per hour) to deal with the rate limit and an additional regulating mechanism to deal with 429 responses ("too many calls").
## GitLab and GitLab Server
On GitLab, the API rate limit is 300 calls per minute for authenticated users as described in this [guide](https://docs.gitlab.com/ee/user/gitlab_com/index.html#gitlabcom-specific-rate-limits) and this [guide](https://docs.gitlab.com/ee/user/admin_area/settings/rate_limits_on_raw_endpoints.html) for GitLab Server.
The `snyk-scm-contributors-count` tool applies a strict limit of a maximum of 120 calls per minute to deal with the rate limit and an additional regulating mechanism to deal with 429 responses ("too many calls").
{% hint style="info" %}
On GitLab Server, the API rate control is configurable by the admin, as described in the [guide](https://docs.gitlab.com/ee/user/admin_area/settings/rate_limits_on_raw_endpoints.html).
{% endhint %}
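The per-second caps described above all follow the same pattern: space outgoing calls so they never exceed a fixed rate, and back off on 429 responses. A minimal sketch of the spacing half of that pattern, illustrative only and not the tool's actual implementation; the `clock` and `sleep` parameters are injectable so the behavior is testable:

```python
import time

class FixedRateThrottle:
    """Delay calls so they never exceed max_calls_per_second.

    Illustrative sketch of the throttling pattern described above,
    not the actual snyk-scm-contributors-count implementation.
    """

    def __init__(self, max_calls_per_second, clock=time.monotonic, sleep=time.sleep):
        self.min_interval = 1.0 / max_calls_per_second
        self.clock = clock
        self.sleep = sleep
        self.last_call = None

    def wait(self):
        """Block (if needed) until the next call is allowed, then record it."""
        now = self.clock()
        if self.last_call is not None:
            remaining = self.min_interval - (now - self.last_call)
            if remaining > 0:
                self.sleep(remaining)
                now += remaining
        self.last_call = now

# Azure DevOps, per the section above: at most two calls per second.
throttle = FixedRateThrottle(2)
```

Calling `throttle.wait()` before each API request enforces the cap; a production version would add the 429 back-off on top.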
---
# Source: https://docs.snyk.io/manage-risk/prioritize-issues-for-fixing/ignore-issues/consistent-ignores-for-snyk-code/api.md
# Consistent Ignores for Snyk Code API
You can manage ignores individually through the [Snyk Policies API (REST)](https://docs.snyk.io/snyk-api/reference/policies).
The SARIF output from Snyk CLI contains the `snyk/asset/finding/v1` identifier used to manage ignores at the start of the Early Access program.
This API leverages the `snyk/asset/finding/v1` identifier and not the `issueId` used by the legacy ignores API. Consider migrating any scripts or automation that rely on the legacy ignores API to the new policy API.
---
# Source: https://docs.snyk.io/manage-risk/analytics/application-analytics.md
# Application analytics
The Analytics menu is available at the tenant level, under the Application Analytics tab. Application Analytics is designed to highlight areas for improvement, emerging risks, and previously overlooked vulnerabilities to support managers and engineering teams.
The dashboard displays essential data such as the status and trends of open issues, control coverage, and repository metadata. It also shows the state of imported assets. It provides a comprehensive and at-a-glance review of information from different viewpoints, such as asset class, application, or team, with a global filter bar to enhance your experience.
## Overview
Application Analytics enables you to review and explore your program status and results from a top-down approach. You can start the exploration from a high, general level over applications, teams (owners), or asset classes, and then narrow it down to the asset level.\
You can enhance the security of your application by identifying areas for improvement, recognizing developing risks, and addressing blind spots. Application Analytics retrieves the data from all the Groups available for the tenant.
{% hint style="info" %}
If you are using Snyk Essentials, navigate to the asset dashboard page to learn more about your assets or remain on the [Analytics](https://docs.snyk.io/manage-risk/analytics/issues-analytics) page to explore the detected issues.
{% endhint %}
Harnessing Application Analytics provides answers to questions such as:
* Which sensitive assets are being publicly exposed and not tested according to the coverage policy?
* Which applications and code owners bear the most risk in terms of accumulated critical and high issues, and how do they compare to others?
* How many repositories exist without a clear association to an application or a code owner, and are new assets being associated as expected?
Application Analytics Overview
## Filters and views
You can customize your data by using the available filters, dimension views, and specific timeframes.
Filters are applied at the tenant level; after you customize them, they affect all the reports and statistics presented on the Application Analytics page.
You can refine the data even more by using the View by options. This focuses the data on specific dimensions: Asset Class, Application, or Owner.
### Filters
The filters are located at the top left of your Application Analytics page, and you can customize them based on your needs.
The following are the available filters:
| Filter | Description |
| ------ | ----------- |
| Groups | Provides a list of all the Groups that exist for the selected tenant. You can customize the selection and focus only on specific Groups. The default setting provides information for all available Groups. |
| Issue severity | Provides a list of all available issue severities. The default setting provides information about issues with Critical and High severity. |
| Add filter | Adds filters for a more customized data analysis:<br>Asset type - filter by the asset type (Container image, Repository)<br>Asset classes - filter by the asset class (A, B, C, D)<br>Assets application - filter by the application for which you want to see the assets<br>Assets owner - filter by the repository owner of the analyzed assets<br>Asset risk factors - filter by specific risk factors of the analyzed assets<br>Issues source - filter by the source of the analyzed issues |
| Reset filters | Resets the filters to the default state. |
### Views
Managing an AppSec program can be challenging, especially ensuring complete visibility of assets and issues. Identifying assets that require protection and monitoring them with all applicable ASTs can be difficult. This task becomes more complex with new assets and misconfigurations in ASTs, leading to incomplete coverage and critical visibility gaps.\
\
AppSec teams must maintain a comprehensive understanding of risks and vulnerabilities linked to applications and their respective owners.
Viewing metrics for an application or owner is much more meaningful and helps to:
* Clearly communicate the status and developing trends to all levels and groups across the enterprise.
* Identify who should be aware of the situation and who should take action.
* Make comparisons and conclude where more attention is needed.
Application analytics provides you with new levels of visibility over your important assets, applications, and code owners (teams) and helps you to identify and take action on risk and coverage.
Collaboration across Research and Development teams is necessary for achieving optimal visibility and requires attention from the AppSec team.
You can display the analytics view by:
* Asset Class
* Application
* Owner
By selecting a View by dimension, all exposed widgets will be affected, enabling you to compare data points based on the selected dimension.
{% hint style="info" %}
The widgets display the top five applications or code owners based on the context. For instance, in the "open issues by control" widget, the top five applications or owners are chosen based on the total number of issues after applying the display views.\
You can also compare specific applications or owners by adding the application or owner display views.
{% endhint %}
Assets and applications vary in importance and sensitivity. Some repositories are internal and used for testing only, while others are public-facing and used in key services.
The dashboard default view compares assets and issues metrics by asset class. Display the view by dimension to see a comparison between applications or code owners throughout the dashboard.
#### Asset Class view
[Asset class](https://docs.snyk.io/manage-assets/assets-inventory-components#class) reflects the business criticality of the asset from A (most critical) to D (least critical).\
By having this level of visibility, you can prioritize the most crucial assets in your inventory, applications, or code owners.\
To associate assets with asset class, you can either change the asset class manually in the inventory screen or, preferably, define a [classification policy](https://docs.snyk.io/manage-risk/policies/assets-policies/use-cases-for-policies/classification-policy) that will automatically assign an asset class to your assets.
#### Applications and Owner view
You can filter the data from your Application Analytics dashboard based on application or code owner. To proceed, it is necessary to have the appropriate metadata available for the repositories. The metadata can be pulled directly from the Snyk SCM integration. You can find details of how to set this up on the [Backstage catalog in Asset Inventory](https://docs.snyk.io/developer-tools/scm-integrations/application-context-for-scm-integrations#backstage-file-for-scm-integrations) page.\
To determine if this metadata is available in your repositories, check the completeness widget for repository metadata. Snyk recommends verifying that all class A assets are properly configured by using the asset class filter from the dashboard.
### Analytics timeframe
You can select a specific date range for the assets analyzed data by adding the **Asset Introduction Date** filter. Applying that filter will impact all non-trend widgets, narrowing them down from showing all available data to data for assets introduced in the selected date range. The trend widgets are configured to show a fixed timeframe of the last three (3) months.
{% hint style="info" %}
The data in Application Analytics is updated on a daily basis.
{% endhint %}
The following video presents an overview of the Application Analytics filters and views from the Snyk Web UI.
{% embed url="" %}
Application Analytics - filters and views
{% endembed %}
## Data categories
The Application Analytics dashboard focuses on three main data categories:
* Coverage - provides the coverage status and the trends for the analyzed assets.
* Issues - provides the status of the open issues.
* Assets - provides the coverage status of the repository metadata and the status and trends for the imported assets.
### Coverage
One of the leading missions of an AppSec team is ensuring appropriate scan coverage across the asset inventory. A [covered asset](https://docs.snyk.io/discover-snyk/getting-started/glossary#coverage-snyk-essentials) is simply an asset that has been scanned by a certain application security testing (AST) product. Uncovered assets expose the company to unknown risks, which is why it is essential to verify that business-critical assets (based on asset class or strategic applications) are being properly scanned.
In the Coverage section, you have information about asset coverage.
* Coverage overview - provides information, in percentages, about the scanned assets, distributed by scan category (SAST, SCA, Container, and Secrets).
* Coverage trend - allows you to review the coverage trend for the last three (3) months. A growing trend indicates that a larger portion of your asset inventory was scanned.
The Coverage section is based on the scan category and is not impacted by the selected view (Asset Class, Application, or Owner).
The Coverage Section
The following video presents an overview of the Application Analytics Coverage view from the Snyk Web UI.
{% embed url="" %}
Application Analytics Coverage view
{% endembed %}
### Issues
In the Issues section, you have information about the analyzed open issues.
* Open issues by category - This graphic provides a clear overview of the number of issues distributed by the issue source category (SAST, SCA, Container, and Secrets) and by the selected view, allowing you to compare asset classes, applications, and owners.
* Open issues breakdown - This graphic provides information about the backlog of your open issues. The desired trend is a negative one, especially for higher asset classes or strategic applications. The selected view allows you to compare asset classes, applications, and owners.
You can choose to view the issues based on Asset Class, Application, or Owner. The focus of the presented information is changed based on your View by selection. When viewing by application or owner, only the top five (5) applications or owners with the most issues are displayed.
You can see more details about each graphic by hovering over the presented data. Extra controls are available on the right side of each graphic, allowing you to download it as an image.
The following video presents an overview of the Application Analytics Issues view from the Snyk Web UI.
{% embed url="" %}
Application Analytics Issues view
{% endembed %}
### Assets
In the Assets section, you have information about the analyzed assets.
* Risk factors breakdown - a funnel that shows the progression of risk factors on code repositories and container images. Each layer is divided according to the selected view: asset class, application, or owner. When viewing by application or owner, only the top five (5) applications or owners will be displayed, according to the number of assets with risk factors.
* New assets introduced - allows tracking the inventory size over time. The trend only counts repositories and container images. When viewing by application or owner, only the top five (5) applications or owners with the most assets will be displayed.
You can choose to view the Assets section based on Asset Class, Application, or Owner. The focus of the presented information is changed based on your View by selection.
You can see more details about each graphic by hovering over the presented data. Extra controls are available on the right side of each graphic, allowing you to download it as an image.
The following video presents an overview of the Application Analytics Assets view from the Snyk Web UI.
{% embed url="" %}
Application Analytics Assets view
{% endembed %}
### Metadata completeness
The metadata completeness section provides information on the completeness of metadata from application context sources for your repositories.
* Repo metadata completeness - displays the availability of application context metadata across code repositories. For more information about context metadata, see [Application context for SCM Integrations](https://docs.snyk.io/developer-tools/scm-integrations/application-context-for-scm-integrations).
* Repository source distribution - provides information about the repositories distributed by the type of integration (SCM integrations, third-party integrations). When viewing by application or owner, only the top five (5) applications or owners with the most assets will be displayed.
The following video presents an overview of the Application Analytics Repository metadata completeness and source distribution view from the Snyk Web UI.
{% embed url="" %}
Analytics Repository metadata completeness and source distribution view
{% endembed %}
---
# Source: https://docs.snyk.io/developer-tools/scm-integrations/application-context-for-scm-integrations.md
# Application context for SCM integrations
## What is application context?
The application context for SCM integrations provides a comprehensive and interconnected overview of application assets. This context is crucial for assessing security risks and their potential implications, as it outlines the entire structure and components of the applications involved.
Use Application Context to integrate with Internal Developer Portals (IDPs) and service catalogs such as ServiceNow CMDB, Atlassian Compass, and others. These platforms allow Snyk to automate the collection of essential context, including asset type, ownership, and lifecycle.
The application context provides broader access to resources and services in an application. You can use it to:
* Prepare a comprehensive inventory of your application environment, ensuring it includes crucial metadata. This should encompass elements such as SCM topics and the associated developers.
* Gather relevant information to effectively assess and manage application security vulnerabilities and identify potential risks.
* Create a streamlined data flow by working cohesively with assets identified through Snyk Essentials SCM integrations
By leveraging Application context, you can achieve a deeper understanding of your application's security posture. After the integration is set, use the application context that can be leveraged across Snyk to classify repositories, set Asset policies, or filter reports.
These are the available integrations that you can set up for the application context:
* [Backstage file](#backstage-file-for-scm-integrations)
* [ServiceNow CMDB](#servicenow-cmdb-for-scm-integrations)
* [Atlassian Compass](#atlassian-compass)
* [Harness](#harness)
* [OpsLevel](#opslevel)
* [Datadog Service Catalog](#datadog-service-catalog)
{% hint style="info" %}
The Application Context integrations on this page work in conjunction with assets found through Snyk Essentials SCM integrations. If there is no Snyk Essentials SCM integration configured at the Group level on the Integrations page, then data will not populate from these integrations.
{% endhint %}
## Backstage file for SCM integrations
{% hint style="info" %}
**Release status**
The Backstage file integration is in Early Access and available with Snyk Enterprise plans.
{% endhint %}
Backstage is a service catalog that allows users to add metadata or annotations to their repositories, helping to organize and categorize the available resources for easier navigation and understanding. You can leverage your SCM integration to pull metadata associated with Backstage catalog files into Snyk Essentials.
You can use the Backstage catalog file with the GitHub, GitLab, Azure DevOps, Bitbucket Cloud, and Bitbucket on-prem SCM integrations.
### Required parameters for the backstage file
* A configured SCM integration.
* The `catalog-info.yaml` file from your Project.
### Integration setup for the backstage file
1. Open the **Integrations** page.
2. Select an SCM integration.
3. Click the **Settings** option of the SCM integration.
4. Enable the **Add Backstage Catalog** option.
5. Optional - if the Backstage catalog filename in your repository is not `catalog-info.yaml`, you can change the default value in the Backstage catalog filename field.
6. Select at least one attribute you want to add to Snyk Essentials.
{% hint style="info" %}
Snyk Essentials parses the fields of the detected file using the default field names unless an alternate field name is specified.
{% endhint %}
7. Click **Done**.
After you finish configuring the Backstage catalog, Snyk Essentials starts enriching your repository assets with the data found in the Backstage catalog YAML file.
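For reference, a minimal catalog file in the standard Backstage component format might look like the following. The component name, title, and owner shown here are illustrative placeholders, not values Snyk requires:

```yaml
apiVersion: backstage.io/v1alpha1
kind: Component
metadata:
  name: payments-service      # illustrative component name
  title: Payments Service
spec:
  type: service
  lifecycle: production
  owner: team-payments        # illustrative owning team
```

Snyk Essentials reads fields such as these from the file to enrich the matching repository asset.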
{% hint style="warning" %}
When you set up the catalog attributes, you must use the specific service-level attributes, for example `attribute.name`.
{% endhint %}
The following video presents an overview of the backstage file from the Snyk Web UI.
{% embed url="" %}
Application context with backstage Catalog for Snyk Essentials
{% endembed %}
## ServiceNow CMDB for SCM integrations
{% hint style="info" %}
**Release status**
The ServiceNow CMDB integration is in Early Access and available with Snyk Enterprise plans.
{% endhint %}
### Required parameters for ServiceNow CMDB
1. Add the **profile name** for your ServiceNow CMDB instance.
2. Set up the **CMDB instance** for the ServiceNow CMDB by following this example: `https://.service-now.com`.
3. **Username** and **Password** - Credentials for your ServiceNow CMDB instance.
4. Add the **table name** for the CMDB configuration item class. Navigate to the [ServiceNow CMDB tables details](https://docs.servicenow.com/bundle/washingtondc-servicenow-platform/page/product/configuration-management/reference/cmdb-tables-details.html) page for the full list of names.
5. Add the **CMDB field to map Repo URL** - Add the URL of the repository.
{% hint style="info" %}
* The data gathered by Snyk from ServiceNow CMDB will be correlated with the Repository Assets.
* The ServiceNow CMDB integration uses basic authentication; Snyk suggests enabling the "Web service access only" option for service accounts.
{% endhint %}
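As an illustration of the basic-authentication setup described above, the sketch below builds the kind of request the standard ServiceNow Table API accepts. This is not Snyk's implementation; the instance name, table, and credentials are placeholders you would replace with your own, and it is only useful for checking that your values are well formed before entering them in the integration form:

```python
# Sketch only: build a basic-auth query against the standard ServiceNow
# Table API. Instance name, table, and credentials are placeholders.
import base64

def build_cmdb_request(instance: str, table: str, user: str, password: str):
    """Return the URL and headers for a one-record Table API query."""
    url = f"https://{instance}.service-now.com/api/now/table/{table}?sysparm_limit=1"
    token = base64.b64encode(f"{user}:{password}".encode()).decode()
    headers = {"Authorization": f"Basic {token}", "Accept": "application/json"}
    return url, headers

url, headers = build_cmdb_request("example", "cmdb_ci_service", "snyk_user", "s3cret")
# Send this with any HTTP client; a 200 response confirms the credentials
# and table name resolve before you configure the Snyk integration.
```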
### Integration setup for ServiceNow CMDB
* Open the **Integrations** page.
* Select the **App Context** tag and search for ServiceNow CMDB.
* Click **Add** .
* Add the **Profile name** - this is the name of your ServiceNow CMDB profile.
* Add the **CMDB Instance** - this is your ServiceNow instance, use this format: `https://.service-now.com`
* Add the **Username** and the **Password** - the username and password to access the ServiceNow CMDB instance
* Add the **Table name** - select the configuration item class that Snyk Essentials should onboard. Use this format `cmdb_ci_`
* Add the **CMDB Field to map Repo URL** - the specific URL that is being referred to in the ServiceNow CMDB record.
* You can select one or more attributes related to repository assets and configure where Snyk Essentials can take this attribute in ServiceNow CMDB. Example:
* Category: application\_type
* Owner: business\_unit
* Click **Done**.
* When the connection is established, the status of the ServiceNow CMDB integration is changed to **Connected**.
{% hint style="warning" %}
When you set up the catalog attributes, you can customize the name of the attribute but must ensure that the same name is used in the catalog and in the Integration setup.
{% endhint %}
The following videos present an overview of the ServiceNow CMDB from the Snyk Web UI.
{% embed url="" %}
Application context with ServiceNow CMDB for Snyk Essentials - Part 1
{% endembed %}
{% embed url="" %}
Application context with ServiceNow CMDB for Snyk Essentials - Part 2
{% endembed %}
## Atlassian Compass
{% hint style="info" %}
**Release status**
The Atlassian Compass integration is in Early Access and available with Snyk Enterprise plans.
{% endhint %}
### Required Parameters for Atlassian Compass
1. Add your Atlassian Compass **Profile name**.
2. Add your Atlassian Compass **Instance URL**. You can use this format type: `https://.atlassian.net`.
3. Add your Atlassian Compass **Username**.
4. Add your Atlassian Compass instance **Token**. Navigate to the [Manage API tokens for your Atlassian account](https://support.atlassian.com/atlassian-account/docs/manage-api-tokens-for-your-atlassian-account/) page for more details about creating an Atlassian API token.
{% hint style="info" %}
The gathered data from Atlassian Compass will be correlated with the Repository Assets.
This feature is available only for the integration with Atlassian Compass.
{% endhint %}
### Integration setup for Atlassian Compass
1. Open the **Integrations** page.
2. Select the **App Context** tag and search for Atlassian Compass.
3. Click **Add**.
4. Add the **Profile name** - this is the name of your Atlassian Compass profile.
5. Add the **Instance URL** - this is the URL of the Atlassian Compass instance. Use this format type: `https://.atlassian.net`
6. Add the **Username** - this is the username to access the Atlassian Compass instance.
7. Add the **Token** - this is the API token to access the Atlassian Compass instance.
8. You can select one or more attributes related to repository assets that Snyk Essentials can pull from Atlassian Compass based on the [Component Data](https://developer.atlassian.com/cloud/compass/forge-graphql-toolkit/Interfaces/Component/):
* **Catalog Name** - Matches with name.
* **Category** - Identified when `fields.definition.name` equals `tier`.
* **Lifecycle** - Identified when `fields.definition.name` equals `lifecycle`.
* **Owner** - the `ownerId` (finding owner name from ownerId).
* **Application** - the `typeId` (all component types, Application, Service, Library, and so on receive an ID).
9. Click **Done**.
10. When the connection is established, the status of the Atlassian Compass integration is changed to **Connected**, and Snyk Essentials will start enriching repository assets with the data found in Atlassian Compass.
{% hint style="warning" %}
When you set up the catalog attributes, you must use the specific service-level attributes, for example `attribute.name`.
{% endhint %}
## Harness
{% hint style="info" %}
**Release status**
The Harness integration is in Early Access and available with Snyk Enterprise plans.
{% endhint %}
### Required Parameters for Harness
1. Add your Harness **Profile name**.
2. Add the **Host URL** of your Harness account. You can use this format type: `https://.harness.io`
3. Add the **API key** for your Harness instance. You can use the Harness [Add and manage your API keys](https://developer.harness.io/docs/platform/automation/api/add-and-manage-api-keys/) documentation page to manage your API key.
{% hint style="info" %}
This integration is focused on [Harness's](https://developer.harness.io/docs/internal-developer-portal/catalog/software-catalog/) service catalog module and is backed by the Backstage catalog.
{% endhint %}
### Integration setup for Harness
1. Open the **Integrations** page.
2. Select the **App Context** tag and search for Harness.
3. Click **Add**.
4. Add the **Profile name** - this is the name of your Harness instance.
5. Add the **Host URL** of your Harness account.
6. Add the **API key** of your Harness instance.
7. Select at least one Harness software catalog [metadata](https://developer.harness.io/docs/internal-developer-portal/catalog/software-catalog#component-definition-yaml):
* Catalog name - If you select this metadata, it is mandatory to add the **Catalog name key**.
* Title - If you select this metadata, it is mandatory to add the **Title key**.
* Category - If you select this metadata, it is mandatory to add the **Category key**.
* Lifecycle - If you select this metadata, it is mandatory to add the **Lifecycle key**.
* Owner - If you select this metadata, it is mandatory to add the **Owner key**.
* Application - If you select this metadata, it is mandatory to add the **Application key**.
8. Click **Done**.
9. When the connection is established, the status of the Harness integration is changed to **Connected**, and Snyk Essentials will start enriching repository assets with the data found in Harness.
{% hint style="warning" %}
When you set up the catalog attributes, you can customize the name of the attribute but must ensure that the same name is used in the catalog and in the Integration setup.
{% endhint %}
## OpsLevel
{% hint style="info" %}
**Release status**
The OpsLevel integration is in Early Access and available with Snyk Enterprise plans.
{% endhint %}
### Required Parameters for OpsLevel
1. Add your OpsLevel **Profile name**.
2. Add the **Instance URL** of your OpsLevel account. You can use this format type: `https://.opslevel.com`
3. Add the **API Token** for your OpsLevel instance. To create an API Token in your OpsLevel account, use the instructions on the OpsLevel [Create an API token](https://docs.opslevel.com/docs/graphql#1-create-an-api-token) documentation page.
### Integration setup for OpsLevel
1. Open the **Integrations** page.
2. Select the **App Context** tag and search for OpsLevel.
3. Click **Add**.
4. Add the **Profile name** - this is the name of your OpsLevel instance.
5. Add the **Instance URL** of your OpsLevel account.
6. Add the **API Token** for your OpsLevel instance.
7. You can select one or more attributes related to repository assets that Snyk Essentials can pull from OpsLevel with the following mapping:
* Catalog name - Identified with `name` in OpsLevel.
* Category - Identified with `tier.name` in OpsLevel.
* Lifecycle - Identified with `lifecycle.name` in OpsLevel.
* Owner - Identified with `owner.name` in OpsLevel.
* Application - Identified with `product` in OpsLevel.
8. Click **Done**.
9. When the connection is established, the status of the OpsLevel integration is changed to **Connected**, and Snyk Essentials will start enriching repository assets with the data found in OpsLevel.
{% hint style="warning" %}
When you set up the catalog attributes, you must use the specific service-level attributes, for example `attribute.name`.
{% endhint %}
## Datadog Service Catalog
{% hint style="info" %}
**Release status**
The Datadog Service Catalog integration is in Early Access and available with Snyk Enterprise plans.
{% endhint %}
### Required Parameters for Datadog Service Catalog
1. Add your Datadog **Profile name**.
2. Add the **API key** for the Datadog instance. Your token should have the following scope permissions: `apm_service_catalog_read`.
3. Add the **Application Key** along with your organization's API key to grant users access to Datadog's programmatic API. For more details, access the [Datadog API and Application key](https://docs.datadoghq.com/account_management/api-app-keys/) documentation page.
### Integration setup for Datadog Service Catalog
1. Open the **Integrations** page.
2. Select the **App Context** tag and search for **Datadog Service Catalog**.
3. Click **Add**.
4. Add the **Profile name** - this is the name of your Datadog instance.
5. Add the **API key** for your Datadog instance.
6. Add the **Application key** for your Datadog instance.
7. Add the details of your **Datadog site**.
8. You can select one or more attributes related to repository assets that Snyk Essentials can pull from Datadog Service Catalog with the following mapping:
* Catalog name - If you select this metadata, it is mandatory to add the **Catalog name key**.
* Title - If you select this metadata, it is mandatory to add the **Title key**.
* Category - If you select this metadata, it is mandatory to add the **Category key**.
* Lifecycle - If you select this metadata, it is mandatory to add the **Lifecycle key**.
* Owner - If you select this metadata, it is mandatory to add the **Owner key**.
* Application - If you select this metadata, it is mandatory to add the **Application key**.
9. Click **Done**.
10. When the connection is established, the status of the Datadog Service Catalog integration is changed to **Connected**, and Snyk Essentials will start enriching repository assets collected by a Snyk Essentials SCM Integration with the data found in Datadog Service Catalog.
{% hint style="warning" %}
When you set up the catalog attributes, you can customize the name of the attribute but must ensure that the same name is used in the catalog and in the Integration setup.
{% endhint %}
---
# Source: https://docs.snyk.io/snyk-platform-administration/user-roles/custom-role-templates/application-security-engineer-role-template.md
# Application Security Engineer role template
This Organization-level role can add, move, and remove Projects and ignores, and can mark PR checks as successful.
## Group-level permissions
This template is for an Organization-level role and has no Group-level permissions.
## Organization-level permissions
To create this role, enable the following permissions in the relevant categories:
### Organization management
| Permission | Enabled? |
| --- | --- |
| View Organization | true |
| Edit Organization | false |
| Remove Organization | false |
### Project management
| Permission | Enabled? |
| --- | --- |
| View Project | true |
| Add Project | true |
| Edit Project | true |
| Edit Project status | true |
| Test Project | true |
| Move Project | true |
| Remove Project | false |
| View Project history | true |
| Edit Project integrations | true |
| Edit Project attributes | true |
| View Jira issues | true |
| Create Jira issues | true |
| Edit Project Tags | true |
### Project Ignore management
| Permission | Enabled? |
| --- | --- |
| View Project Ignores | true |
| Create Project Ignores | true |
| Edit Project Ignores | true |
| Remove Project Ignores | true |
### Project pull request management
| Permission | Enabled? |
| --- | --- |
| Create pull requests | false |
| Mark pull request checks as successful | true |
The remaining categories of permissions listed below should have all permissions within them set to disabled:
* Audit Log management
* Billing management
* Collection management
* Container Image management
* Entitlement management
* Integration management
* Kubernetes integration management
* Package management
* Reports management
* Service account management
* Snyk Apps management
* Snyk Cloud management
* Snyk Preview management
* User management
* Webhook management
---
# Source: https://docs.snyk.io/scan-with-snyk/snyk-container/how-snyk-container-works/application-vulnerabilities-in-snyk-container-and-snyk-open-source.md
# Application vulnerabilities in Snyk Container and Snyk Open Source
Snyk Container detects application vulnerabilities in your containers, overlapping with Snyk Open Source capabilities.\
\
The results from a Snyk Container application vulnerability scan and a Snyk Open Source scan are generally the same, especially if Snyk is building a dependency graph from the same manifest files.\
\
However, results can vary significantly depending on the ecosystem and how the developer builds the application. An application in a container is a compiled application. So, in some ecosystems, Snyk Open Source can scan a more detailed manifest and thus build a more accurate dependency graph:
* `golang` Projects for Snyk Containers: Snyk does not have access to the list of dependencies as in Snyk Open Source. Therefore, Snyk Container reverse parses binaries, and the result differs slightly from Snyk Open Source.
* `npm` packages as Snyk Containers: Snyk can access the list of dependencies. The result is generally the same as that of Snyk Open Source. For details, see [Support for npm](https://docs.snyk.io/supported-languages/supported-languages-list/javascript#support-for-npm).
* `java` applications for Snyk Containers: In Open Source, it is possible to include unmanaged jars (see [Scan all unmanaged jar files](https://docs.snyk.io/developer-tools/snyk-cli/scan-and-maintain-projects-using-the-cli/scan-all-unmanaged-jar-files)). Thus the result is different from Snyk Container.
* With Snyk Container, the scan traverses all the jars Snyk finds in the image (see [Detect application vulnerabilities in container images](https://docs.snyk.io/scan-with-snyk/snyk-container/use-snyk-container/detect-application-vulnerabilities-in-container-images)). In addition, there are multiple ways to build a jar, and the method used affects how Snyk Container finds the dependencies.
* In Snyk Open Source, if there are multiple potential versions of a dependency, the package manager dependency resolution logic ensures that only one version is selected. However, in Snyk Container, unpacked jars may contain other versions of dependencies, and because they all exist in the container, they are all reported.
---
# Source: https://docs.snyk.io/scan-with-snyk/pull-requests/snyk-pull-or-merge-requests/customize-pr-templates/apply-a-custom-pr-template.md
# Apply a custom PR template
## Create and manage a custom PR template using the API
You can create a custom PR template using the API endpoint [Create or update pull request template for Group](https://docs.snyk.io/snyk-api/reference/pull-request-templates#post-groups-group_id-settings-pull_request_template). Send an API request that contains a JSON payload with the custom properties. This request configures a Group-level pull request template that will be used on any Organization or Project within that Group. The pull request template created using the Snyk API can be updated at any time, and all Projects in the Group are automatically updated with the latest changes.
API configuration of PR templates is available only at the Group level.
When a custom template is uploaded using an API request, all Snyk PRs in that Group adopt this format, effectively switching off the default Snyk template for the customizable properties. Strings are the only acceptable values; lists and numbers are not allowed.
If any customizable properties are missing from your custom template, Snyk reverts to the default values for these properties when opening a pull request.
The following properties are customizable using the API:
* `title` - customize the PR title
* `commit_message` - customize the PR commit message
* `description` - customize the PR description
You cannot customize the branch name for your PRs. The branch name of your PRs will use the Snyk default value.
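As a sketch, a template with the customizable properties could be sent to the Group-level endpoint like this. The group ID, token, API version date, and the JSON:API "data/attributes" envelope shown here are illustrative assumptions; verify the exact request shape against the endpoint reference linked above.

```shell
# Build a JSON payload with the three customizable properties.
# The "data/attributes" envelope is an assumption based on the usual
# Snyk REST API shape; check the endpoint reference before use.
PAYLOAD=$(cat <<'EOF'
{
  "data": {
    "type": "pull_request_template",
    "attributes": {
      "title": "snyk: security pull request",
      "commit_message": "snyk: this is a security pull request",
      "description": "This pull request comes from Snyk. See {{ snyk_project_url }}"
    }
  }
}
EOF
)
echo "$PAYLOAD"
# Hypothetical call (GROUP_ID, SNYK_TOKEN, and the version date are placeholders):
# curl -X POST "https://api.snyk.io/rest/groups/$GROUP_ID/settings/pull_request_template?version=2024-06-21" \
#   -H "Authorization: token $SNYK_TOKEN" \
#   -H "Content-Type: application/vnd.api+json" \
#   -d "$PAYLOAD"
```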
You can retrieve the custom PR template for your Group using the endpoint [Get pull request template for Group](https://apidocs.snyk.io/?#get-/groups/-group_id-/settings/pull_request_template). This is useful if you want to consider changing your template, and in troubleshooting.
To delete the template, use the endpoint [Delete pull request template for group](https://docs.snyk.io/snyk-api/reference/pull-request-templates#delete-groups-group_id-settings-pull_request_template).
## Customize using a YAML PR template file
### Create the YAML file
Manually create the YAML template by using the [mustache](https://mustache.github.io) syntax for templating and add the file to your Project or repository.
When a custom template is uploaded to your Project, all PRs from Snyk for the Project adopt this format, effectively switching off the default Snyk template for the customized properties. Strings are the only acceptable values; lists and numbers are not allowed. If any customizable properties are missing from your template, Snyk reverts to the default values for these properties when opening a pull request.
#### YAML multiline operators
You can use YAML multiline operators. You can create a detailed description that spans several lines by following this format:
```yaml
description: |
  This pull request comes from Snyk
  For more information see the project page {{ snyk_project_url }}
  If you have more questions reach out to a member of the security team
```
The pipe operator, `|`, preserves newline characters. The greater-than operator, `>`, joins all the lines with spaces, keeping a single newline at the end. To use a colon, either use a multiline operator (`|` or `>`) or enclose the line in double quotes:
```yaml
commitMessage: "snyk: this is a security pull request"
```
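For comparison, the folded operator joins the lines by a space:

```yaml
description: >
  This description is written across two source lines
  but is delivered as a single line.
```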
#### Customizable properties for YAML
The following properties are customizable:
* `title` - customize the PR title
* `commitMessage` - customize the PR commit message
* `description` - customize the PR description
You cannot customize the branch name for your PRs. The branch name of your PRs will use the Snyk default value.
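Combining the properties above, a complete `snyk_pull_request_template.yaml` sketch might look like this. The wording is illustrative; `{{ snyk_project_url }}` is the mustache variable used in the earlier example:

```yaml
title: "snyk: security pull request"
commitMessage: "snyk: this is a security pull request"
description: |
  This pull request comes from Snyk
  For more information see the project page {{ snyk_project_url }}
```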
### Use the YAML custom PR template
You can manually upload the YAML file with the name `snyk_pull_request_template.yaml` to your Project (repository). The method varies based on the type of integration.
* GitHub / GitHub Enterprise - `/.github/snyk_pull_request_template.yaml`
* GitLab - `/.gitlab/snyk_pull_request_template.yaml`
* Azure DevOps - `/.azuredevops/snyk_pull_request_template.yaml`
* Other (such as Bitbucket) - `/.config/snyk_pull_request_template.yaml`
If you want to use a custom template for multiple repositories, add the YAML custom template file to each of these repositories.
## Broker configuration for fetching custom PR templates
If you use [Snyk Broker](https://docs.snyk.io/implementation-and-setup/enterprise-setup/snyk-broker), you must use a Broker at version 4.188.0 or higher and enable the Broker to fetch the custom PR templates using the `ACCEPT_CUSTOM_PR_TEMPLATES` environment variable.
To do this, you must remove `ACCEPT=/path/to/custom.json` and add the following environment variable to your Broker container or deployment:
```
ACCEPT_CUSTOM_PR_TEMPLATES=true
```
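For example, in a Docker Compose deployment this is a minimal sketch; the service name and image tag are illustrative, and the rest of your existing Broker configuration should be kept as-is:

```yaml
services:
  broker:
    image: snyk/broker:github-com   # use the tag matching your SCM integration
    environment:
      - BROKER_TOKEN=${BROKER_TOKEN}
      - ACCEPT_CUSTOM_PR_TEMPLATES=true
```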
---
# Source: https://docs.snyk.io/implementation-and-setup/enterprise-implementation-guide/phase-2-configure-account/apply-security-and-license-policies.md
# Apply security and license policies
Policies define how Snyk behaves when identifying issues. Policies give you a quick and automated way to identify, prioritize, and triage issues. This saves valuable development time and allows developers to take more responsibility and ownership for security, reducing the “noise” level.
See [Policies](https://docs.snyk.io/manage-risk/policies) for more details.
## Security policies
Group administrators can define security policies, thus providing an automated way to identify certain issues or types of issues, and apply actions like changing the severity or ignoring the issue based on your conditions.
* Configure policies to increase or decrease issue priority as needed.
* Create ignores where needed.
See [Security policies](https://docs.snyk.io/manage-risk/policies/security-policies) for more details.
## License policies
Group administrators can set license policies to define Snyk behavior for treating license issues. For example, you can allow or disallow packages with certain license types, to avoid using packages containing incompatible licenses.
By default, Snyk determines the severity of licenses in the following way:
* High severity - licenses that definitely present issues for commercial software.
* Medium severity - licenses that have clauses that may be of concern and should be reviewed.
Configure policies to match your requirements.
See [Snyk License Compliance Management](https://docs.snyk.io/scan-with-snyk/snyk-open-source/scan-open-source-libraries-and-licenses/snyk-license-compliance-management) for more details.
## Asset policies
You can create policies using the Policy Editor to:
* Notify users using Slack or email when a condition is met
* Create a Jira ticket
* Set classification using policy
* Set tags using policy
* Specify coverage policies, for example, when a scan has not been performed within a specified number of days

For more information, see the [Assets policies](https://docs.snyk.io/manage-risk/policies/assets-policies) page.
---
# Source: https://docs.snyk.io/snyk-api/reference/apps.md
# Apps
{% hint style="info" %}
This document uses the REST API. For more details, see the [Authentication for API](https://docs.snyk.io/snyk-api/authentication-for-api) page.
{% endhint %}
The OpenAPI operations for Snyk Apps are defined in [rest-spec.json](https://2533899886-files.gitbook.io/~/files/v0/b/gitbook-x-prod.appspot.com/o/spaces%2F-MdwVZ6HOZriajCf5nXH%2Fuploads%2Fgit-blob-42ebe7ebbe084db5ba66cf53a50453b68b5c9ab0%2Frest-spec.json?alt=media):

| Method | Path |
| --- | --- |
| GET | `/self/apps` |
| DELETE | `/self/apps/{app_id}` |
| GET | `/self/apps/{app_id}/sessions` |
| DELETE | `/self/apps/{app_id}/sessions/{session_id}` |
| GET | `/self/apps/installs` |
| DELETE | `/self/apps/installs/{install_id}` |
| POST | `/orgs/{org_id}/apps` |
| GET | `/orgs/{org_id}/apps` |
| PATCH | `/orgs/{org_id}/apps/{client_id}` |
| GET | `/orgs/{org_id}/apps/{client_id}` |
| DELETE | `/orgs/{org_id}/apps/{client_id}` |
| POST | `/orgs/{org_id}/apps/{client_id}/secrets` |
| POST | `/orgs/{org_id}/apps/installs` |
| GET | `/orgs/{org_id}/apps/installs` |
| DELETE | `/orgs/{org_id}/apps/installs/{install_id}` |
| POST | `/orgs/{org_id}/apps/installs/{install_id}/secrets` |
| POST | `/orgs/{org_id}/apps/creations` |
| GET | `/orgs/{org_id}/apps/creations` |
| PATCH | `/orgs/{org_id}/apps/creations/{app_id}` |
| GET | `/orgs/{org_id}/apps/creations/{app_id}` |
| DELETE | `/orgs/{org_id}/apps/creations/{app_id}` |
| POST | `/orgs/{org_id}/apps/creations/{app_id}/secrets` |
| GET | `/orgs/{org_id}/app_bots` |
| DELETE | `/orgs/{org_id}/app_bots/{bot_id}` |
| POST | `/groups/{group_id}/apps/installs` |
| GET | `/groups/{group_id}/apps/installs` |
| DELETE | `/groups/{group_id}/apps/installs/{install_id}` |
| POST | `/groups/{group_id}/apps/installs/{install_id}/secrets` |
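As a sketch, a REST call to the first endpoint above could be made as follows. The API version date is a placeholder, and authentication is described on the Authentication for API page linked above:

```shell
# Sketch: list Snyk Apps authorized for the current user.
# The version date below is a placeholder; substitute a current
# Snyk REST API version.
API_VERSION="2024-06-21"
URL="https://api.snyk.io/rest/self/apps?version=$API_VERSION"
echo "$URL"
# Hypothetical call ($SNYK_TOKEN is your API token):
# curl -H "Authorization: token $SNYK_TOKEN" "$URL"
```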
---
# Source: https://docs.snyk.io/developer-tools/snyk-cli/scan-and-maintain-projects-using-the-cli/snyk-cli-for-iac/test-your-iac-files/arm-files.md
# ARM files
With Snyk Infrastructure as Code, you can test your configuration files using the CLI.
Snyk Infrastructure as Code for Azure Resource Manager (ARM) supports scanning JSON format files. You can also scan Bicep format files by converting the configuration files to JSON using the Bicep CLI. Snyk supports the ARM `languageVersion` 1.0.
## Test for an issue on specified JSON files
Enter the following Snyk CLI command:
```
snyk iac test deploy.json
```
You can also specify multiple files by appending the file names after each other, for example:
```
snyk iac test file-1.json file-2.json
```
## Test for an issue on specified Bicep files
Be sure you have the [Bicep CLI installed](https://docs.microsoft.com/en-us/azure/azure-resource-manager/bicep/install).
After installing the Bicep CLI, **navigate** to the directory that contains your Bicep files and **convert** the relevant Bicep file to JSON by entering the following:
```
az bicep build -f deploy.bicep
```
You can then scan the newly created JSON file in the same way as any other file. Use the following Snyk CLI command:
```
snyk iac test deploy.json
```
---
# Source: https://docs.snyk.io/scan-with-snyk/snyk-open-source/manage-vulnerabilities/artifactory-gatekeeper-plugin.md
# Artifactory Gatekeeper Plugin
{% hint style="info" %}
**Feature availability**\
This feature is available only with Enterprise plans. For more information, see [plans and pricing](https://snyk.io/plans/).
{% endhint %}
With the Snyk plugin for Artifactory, you can scan your artifacts for open-source vulnerabilities and license issues.
After the plugin is installed, it runs in the background and can do the following automatically:
* Add vulnerability and license issue counts from Snyk as properties in an artifact
* Block developers from downloading packages with vulnerability and license issues according to a configured threshold
By scanning artifacts as part of your workflow and displaying the test results directly in the Artifactory UI, the Snyk Artifactory Gatekeeper Plugin helps you identify artifacts that put your application security at risk more quickly and avoid using them in your Projects.
{% hint style="info" %}
This page refers to the Artifactory Plugin, an independent piece of software that is installed on the Artifactory machine and serves as a gatekeeper, blocking vulnerable packages from being downloaded from the Artifactory instance.
This plugin is separate from the [Artifactory Registry for Maven](https://docs.snyk.io/scan-with-snyk/snyk-open-source/package-repository-integrations/artifactory-package-repository-connection-setup/artifactory-registry-for-maven), a Snyk integration that allows configuring SCM scans to use custom package registries.
{% endhint %}
## Package managers supported by the Artifactory Plugin
* npm
* Maven (.jar files)
* Gradle (.jar files)
* sbt (.jar files)
* pip (PyPI)
* CocoaPods
* RubyGems
* NuGet
## Prerequisites for the Artifactory plugin
* Snyk Enterprise Account
* Artifactory version 7.4.3 or higher
## Data exchanged between Artifactory and Snyk
Artifactory transmits the package name and version to the test endpoint of the Snyk API. The authorization token is sent in the request header.
If the Artifactory installation is configured to use a proxy, Snyk will automatically use it too. Potentially, there could be an issue if the proxy is an authenticated or Kerberos proxy, but a standard, unauthenticated, forwarding proxy should work if the Artifactory installation and its underlying JVM are configured correctly with a proxy.
The plugin calls [`https://api.snyk.io/v1/test`](https://api.snyk.io/v1/test) for the relevant package manager with the package name and version.
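Illustratively, the request shape looks like this. The npm package name and version are examples, and the exact path segments per package manager are determined by the plugin:

```shell
# Sketch of the v1 test request the plugin issues for an npm package.
# Package name and version are examples; the org query parameter is the
# configured Snyk Organization ID.
PACKAGE="lodash"
VERSION="4.17.20"
ORG_ID="${SNYK_ORG_ID:-your-org-id}"
URL="https://api.snyk.io/v1/test/npm/$PACKAGE/$VERSION?org=$ORG_ID"
echo "$URL"
# Hypothetical call, with the authorization token in the header:
# curl -H "Authorization: token $SNYK_TOKEN" "$URL"
```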
## Installation of the Artifactory Plugin
1. Log in to your Snyk account.
2. Select **Settings** > **General** to locate, copy, and save the following:
1. Service account token or Organization API token
2. The Organization ID for any one of your organizations
3. Navigate to [the Snyk Artifactory plugin repo in GitHub](https://github.com/snyk/artifactory-snyk-security-plugin) and then to **Releases**.
4. From the most current release, expand the **Assets** section to download the `artifactory-snyk-security-plugin-<version>.zip` archive.
5. Extract the archive. It should contain a `plugins` directory with the following structure:
   * `snykSecurityPlugin.groovy` - the plugin
   * `snykSecurityPlugin.properties` - plugin configuration
   * `lib` (directory)
     * `artifactory-snyk-security-core.jar` - plugin library
     * `snykSecurityPlugin.version` - plugin version
6. Open `snykSecurityPlugin.properties` in a text editor.
1. Set the API Token and Organization ID from the earlier steps as `snyk.api.token` and `snyk.api.organization` respectively.
2. Configure the rest of the properties as needed or leave them as defaults. See the section [Plugin configuration](#plugin-configuration).
3. For a full list of properties, [view the properties file on GitHub](https://github.com/snyk/artifactory-snyk-security-plugin/blob/master/core/src/main/groovy/io/snyk/plugins/artifactory/snykSecurityPlugin.properties).
7. Place all the files under `$JFROG_HOME/artifactory/etc/artifactory/plugins`.
8. Restart your Artifactory server. Note that **Refresh now** or **Reload** is not sufficient. Artifactory must be restarted.
9. Log in to your Artifactory instance and navigate to the **System Logs** to check that Snyk has been installed successfully.
Successful installation of Snyk
## How the Artifactory plugin works
Whenever a download is requested from an Artifactory remote repository, whether from a package manager or a URL, Snyk automatically scans the artifact for vulnerabilities and license issues. The issues found by Snyk are persisted as artifact properties. Access to the package is later controlled according to these properties, and in line with severity thresholds set in the plugin config.
Depending on the configuration, the plugin can periodically re-scan packages to keep the issue metadata up to date.
The plugin only works with remote repositories. It does not scan locally stored artifacts, but instead queries the Snyk API with the meta-information about the artifacts. Therefore, only published artifacts of the proxied remote repository can be examined by the Snyk Artifactory plugin.
To view details about the download status, open the **System Logs**.
If a scan finds issues, based on your configuration, the download request can be blocked with an HTTP status code "403 Forbidden".
You can find the results of a scan under the artifact properties, where you can decide to ignore the issues and allow downloads. To find the artifact, use the Artifactory search bar or navigate the tree view.
Results of a scan
## Plugin configuration
Plugin configuration is loaded from this file: `$JFROG_HOME/artifactory/etc/artifactory/plugins/snykSecurityPlugin.properties`. For changes in this file to take effect, Artifactory must be restarted.
| Parameter | Default value | Description |
| --- | --- | --- |
| `snyk.api.token` | | Service account token or Organization API token |
| `snyk.api.organization` | | The Organization ID for any one of your Snyk organizations |
| `snyk.api.url` | `https://api.snyk.io/v1/` | Snyk API base URL |
| `snyk.scanner.test.continuously` | `false` | Decides whether the plugin periodically refreshes vulnerability data from Snyk or filters access according to results obtained when the package was first requested. Without continuous mode, new vulnerabilities are not reported for a package that has already been allowed through the gatekeeper. |
| `snyk.scanner.frequency.hours` | `168` | Scan result expiry (continuous mode only). When the most recent scan was made within this time frame, filtering respects the previous result. Beyond that time, a new Snyk Test request is made. When this property is set to `0`, the plugin triggers a test each time an artifact is accessed. |
| `snyk.scanner.vulnerability.threshold` | `low` | Global threshold for vulnerability issues. Accepted values: `low`, `medium`, `high`, `critical`, `none` |
| `snyk.scanner.license.threshold` | `low` | Global threshold for license issues. Accepted values: `low`, `medium`, `high`, `critical`, `none` |
For a full list of properties, [view the properties file on GitHub](https://github.com/snyk/artifactory-snyk-security-plugin/blob/master/core/src/main/groovy/io/snyk/plugins/artifactory/snykSecurityPlugin.properties).
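Putting the configuration reference above together, a minimal `snykSecurityPlugin.properties` sketch might look like this; all values are placeholders or example choices:

```
snyk.api.token=<service-account-or-org-api-token>
snyk.api.organization=<organization-id>
snyk.api.url=https://api.snyk.io/v1/
snyk.scanner.test.continuously=true
snyk.scanner.frequency.hours=168
snyk.scanner.vulnerability.threshold=high
snyk.scanner.license.threshold=medium
```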
## Artifact properties
These are the properties set by the plugin on scanned artifacts. Artifact access is allowed or forbidden depending on the values of these properties.
| Property | Description |
| --- | --- |
| `snyk.test.timestamp` | Date and time when the artifact was last scanned by Snyk. |
| `snyk.issue.url` | The URL to the Snyk database entry and explanation of the vulnerability, including specific details about vulnerable versions, available upgrades, and Snyk patches. |
| `snyk.issue.vulnerabilities` | Vulnerability summary scan results, regardless of the thresholds configured. |
| `snyk.issue.vulnerabilities.forceDownload` | When `true`, allows downloads for this artifact even when there are vulnerabilities. |
| `snyk.issue.vulnerabilities.forceDownload.info` | Use this field to provide additional information about why forceDownload is enabled. |
| `snyk.issue.licenses` | License summary scan results, regardless of the thresholds configured. |
| `snyk.issue.licenses.forceDownload` | When `true`, allows downloads for this artifact even when there are license issues. |
| `snyk.issue.licenses.forceDownload.info` | Use this field to provide additional information about why forceDownload is enabled. |
## Troubleshooting for the Artifactory Gatekeeper Plugin
You can enable debug logs by modifying your `${ARTIFACTORY_HOME}/var/etc/artifactory/logback.xml` file and adding a logger for the Snyk plugin, for example:
```
<!-- Illustrative only: the exact logger name may differ; check the plugin repository for the recommended entry. -->
<logger name="io.snyk" level="debug"/>
```
Artifactory automatically picks up the new configuration. If this does not happen, restart Artifactory.
---
# Source: https://docs.snyk.io/scan-with-snyk/snyk-open-source/package-repository-integrations/artifactory-package-repository-connection-setup.md
# Artifactory package repository connection setup
{% hint style="info" %}
**Feature availability**\
Package repository integrations are available only with Enterprise plans. For more information, see [plans and pricing](https://snyk.io/plans/).
**Supported projects**\
The Artifactory Package Repository integration supports [Node.js](https://docs.snyk.io/supported-languages/supported-languages-list/javascript#supported-package-managers-and-package-registries) (npm and Yarn) and [Maven](https://docs.snyk.io/supported-languages/supported-languages-list/java-and-kotlin#supported-package-managers-and-package-registries) Projects. For [Improved Gradle SCM scanning](https://docs.snyk.io/supported-languages/supported-languages-list/java-and-kotlin/git-repositories-with-maven-and-gradle#improved-gradle-scm-scanning), use the Maven settings.
{% endhint %}
Connecting a custom Artifactory Package Repository enables Snyk to resolve all direct and transitive dependencies of packages hosted on the custom registry and calculate a more complete, accurate dependency graph and related vulnerabilities.
You can configure these types of Artifactory Package Repository:
* Publicly accessible instances protected by basic authentication
* Instances on a private network by using Snyk Broker (with or without basic authentication).
These instructions apply to configuring publicly accessible instances. For instructions on configuring a brokered instance, see the [setup instructions for Snyk Broker with Artifactory Repository](https://docs.snyk.io/implementation-and-setup/enterprise-setup/snyk-broker/classic-broker/install-and-configure-snyk-broker/artifactory-repository-install-and-configure-broker).
The steps to set up Artifactory Repository Manager follow.
1. Navigate to **Settings** > **Integrations** > **Package Repositories** > **Artifactory**.
2. Enter the URL of your Artifactory instance; this must end with `/artifactory`.
3. Enter your username and password.
4. Select **Save**.
Artifactory repository setup
---
# Source: https://docs.snyk.io/scan-with-snyk/snyk-open-source/package-repository-integrations/artifactory-package-repository-connection-setup/artifactory-registry-for-maven.md
# Artifactory registry for Maven
{% hint style="info" %}
**Feature availability**\
Package repository integrations are available only with Enterprise plans. For more information, see [plans and pricing](https://snyk.io/plans/).
**Supported projects**\
The Artifactory Package Repository integration supports [Node.js](https://docs.snyk.io/supported-languages/supported-languages-list/javascript#supported-package-managers-and-package-registries) (npm and Yarn) and [Maven](https://docs.snyk.io/supported-languages/supported-languages-list/java-and-kotlin#supported-package-managers-and-package-registries) Projects. For [Improved Gradle SCM scanning](https://docs.snyk.io/supported-languages/supported-languages-list/java-and-kotlin/git-repositories-with-maven-and-gradle#improved-gradle-scm-scanning), use the Maven settings on this page.
{% endhint %}
Snyk can use custom Artifactory Package Repositories with Maven Projects.
This enables Snyk to resolve all direct and transitive dependencies of packages hosted on the custom registry and calculate a more complete, accurate dependency graph and related vulnerabilities.
Maven Projects can be configured to mirror all requests through a custom package repository, or you can specify additional repositories to use alongside Maven Central.
## **Set up custom Maven package registries**
If authentication is required for access to your custom registry, you must configure the Artifactory package repository integration; see [Artifactory Package Repository connection setup](https://docs.snyk.io/scan-with-snyk/snyk-open-source/package-repository-integrations/artifactory-package-repository-connection-setup).
After the integration is set up, you can configure Maven settings by navigating to **Settings** > **Languages** > **Java**.
You can choose whether to use Artifactory as a mirror or as an additional repository where your artifacts will reside. These settings will be very similar to what you have in `~/.m2/settings.xml`.
## Mirrors
Maven settings, choose Type
Choose a value for the Type: either **Direct** or, if you are using authentication, **Integration**.
If you are using **Direct**, you must complete the **URL**, **Repository Name**, and **Mirror Of** fields.
The **Mirror Of** value can either be `*` to mirror everything, or a specific value, for example, `central`.
If you are using the Type **Integration**, you must choose an integration type and provide the **Repository Name** and **Mirror Of** details.
Set the **Repository Name** as whatever comes after `artifactory/` in the internal repository URL.
For example, if the URL is `http://artifactory.company.io/artifactory/libs-release` **Repository Name** should be set as `libs-release`.
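For comparison, the equivalent mirror entry in `~/.m2/settings.xml`, using the example URL above, would be:

```xml
<settings>
  <mirrors>
    <mirror>
      <id>libs-release</id>
      <url>http://artifactory.company.io/artifactory/libs-release</url>
      <mirrorOf>*</mirrorOf>
    </mirror>
  </mirrors>
</settings>
```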
## **Additional repositories**
Alternatively, you can configure repositories that will be used as additional locations to check for artifacts.
Repositories are configured in the same way as [Mirrors](#mirrors) but do not require **Mirror Of**.
---
# Source: https://docs.snyk.io/scan-with-snyk/snyk-open-source/package-repository-integrations/artifactory-package-repository-connection-setup/artifactory-registry-for-npm.md
# Artifactory registry for npm
{% hint style="info" %}
**Feature availability**
Package repository integrations are available only with Enterprise plans. For more information, see [plans and pricing](https://snyk.io/plans/).
This guide is relevant for Snyk UI integrations only. The CLI supports Yarn and npm Projects with private Artifactory registries.
{% endhint %}
Snyk can use Artifactory Package Repositories with npm and Yarn Projects. This enables Snyk to regenerate lockfiles with the correct URLs when creating Pull/Merge Requests.
You can add configuration to tell Snyk where your private Artifactory Node.js packages are hosted and under what scope. This is the same information you would normally add in your `.yarnrc` or `.npmrc` file.
## JavaScript language settings
1. Navigate to **Settings** > **Languages** > **JavaScript** and either the npm or Yarn settings, depending on your Project type.
2. If you have not previously connected to Artifactory, you will be asked to configure an integration first; see [Artifactory Package Repository connection setup](https://docs.snyk.io/scan-with-snyk/snyk-open-source/package-repository-integrations/artifactory-package-repository-connection-setup).
3. Select **Add registry configuration**.
1. Select **Artifactory** as the Package source.
2. If you want to configure this registry as the default registry URL, leave the scope blank.
3. If you want only scoped packages to use this registry, add a scope, for example, `@snyk`, which would use the configured registry for all dependencies scoped with `@snyk`.
4. If you want to add a mix of the default registry URL and scoped packages, add multiple configurations: one for the default and one per scope.
4. When you have added all the registries and scopes you want, click **Update settings**.
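For reference, the equivalent client-side settings in an `.npmrc` file look like the following; the registry URLs are illustrative, and `@snyk` is the example scope from the steps above:

```
# Scoped packages resolved from the configured registry:
@snyk:registry=https://artifactory.example.com/artifactory/api/npm/npm-private/
# Default registry for everything else:
registry=https://artifactory.example.com/artifactory/api/npm/npm-default/
```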
## Test the integration
Open a Pull/Merge Request on a Project that contains private dependencies that are hosted in Artifactory to see a lockfile updated and included in the Snyk Fix Pull Request with the correct URL to your repository.
Pull request to test Artifactory integration
---
# Source: https://docs.snyk.io/implementation-and-setup/enterprise-setup/snyk-broker/classic-broker/install-and-configure-snyk-broker/artifactory-repository-install-and-configure-broker/artifactory-repository-environment-variables-for-snyk-broker.md
# Artifactory Repository - environment variables for Snyk Broker
The following environment variables are needed to customize the Broker Client for Artifactory Repository:
`BROKER_TOKEN` - the Snyk Broker token, obtained from your Artifactory integration settings (**Integrations** > **Artifactory**).
`BROKER_SERVER_URL` - the URL of the Broker server for the region in which your data is hosted. For the commands and URLs to use, see [Broker URLs](https://docs.snyk.io/snyk-data-and-governance/regional-hosting-and-data-residency#broker-server-urls).
`ARTIFACTORY_URL` - the URL of your Artifactory deployment, such as `.artifactory.com/artifactory`.
The following fields are optional:
* `Port`: Omit if no port number is needed.
* `Basic auth`: Omit if no basic auth required.\
URL encode both username and password info to avoid errors that may prevent authentication.
* `Protocol`: Defaults to `https://`\
This should only be specified when no certificate is present and `http://` is required instead for your instance.
`ARTIFACTORY_URL` format with optional fields:\
`[http://][username:password@]hostname[:port]/artifactory`\
Example:\
`http://alice:mypassword@acme.com:8080/artifactory`
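As the note above says, the username and password must be URL-encoded. One way to do the encoding is sketched below, assuming `python3` is available on the host; `alice`, `p@ss:word`, and `acme.com` are placeholder values:

```shell
# URL-encode basic auth credentials before embedding them in ARTIFACTORY_URL.
# The credentials and hostname here are placeholders.
USER=$(python3 -c 'import urllib.parse,sys; print(urllib.parse.quote(sys.argv[1], safe=""))' 'alice')
PASS=$(python3 -c 'import urllib.parse,sys; print(urllib.parse.quote(sys.argv[1], safe=""))' 'p@ss:word')
ARTIFACTORY_URL="http://${USER}:${PASS}@acme.com:8080/artifactory"
echo "$ARTIFACTORY_URL"
```

The reserved characters `@` and `:` in the password become `%40` and `%3A`, so the resulting URL parses unambiguously.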
Optional. `RES_BODY_URL_SUB` - The URL of the Artifactory instance, including http\:// and without basic auth credentials. Required for npm and Yarn integrations only.\
Example:\
`http://acme.com/artifactory`
---
# Source: https://docs.snyk.io/implementation-and-setup/enterprise-setup/snyk-broker/classic-broker/install-and-configure-snyk-broker/artifactory-repository-install-and-configure-broker.md
# Artifactory Repository - prerequisites and steps to install and configure Broker
{% hint style="info" %}
**Feature availability**
Integration with Artifactory Repository is available only for Enterprise plans. For more information, see [plans and pricing](https://snyk.io/plans/).
{% endhint %}
Before installing, review the general instructions for the installation method you plan to use, [Helm](https://docs.snyk.io/implementation-and-setup/enterprise-setup/snyk-broker/classic-broker/install-and-configure-snyk-broker/install-and-configure-broker-using-helm) or [Docker](https://docs.snyk.io/implementation-and-setup/enterprise-setup/snyk-broker/classic-broker/install-and-configure-snyk-broker/install-and-configure-broker-using-docker).
The prerequisites follow.
Before installing the Snyk Artifactory Repository Broker, ask your Snyk account team to provide you with a Broker token or generate it from the Snyk Web UI.
You must have Docker or a way to run Docker Linux containers. Some Docker deployments for Windows run only Windows containers. Ensure that your deployment is capable of running Linux containers.
For convenience, instructions to obtain or generate the Broker token follow. When you are done, continue with the steps to install using [Docker](https://docs.snyk.io/implementation-and-setup/enterprise-setup/snyk-broker/classic-broker/install-and-configure-snyk-broker/artifactory-repository-install-and-configure-broker/artifactory-repository-install-and-configure-using-docker) or [Helm](https://docs.snyk.io/implementation-and-setup/enterprise-setup/snyk-broker/classic-broker/install-and-configure-snyk-broker/artifactory-repository-install-and-configure-broker/artifactory-repository-install-and-configure-using-helm).
## Obtain Broker token for Artifactory Repository setup
1. Navigate to **Settings** > **Add integrations** > **Package Repositories > Artifactory**.
2. Enter the URL of your Artifactory instance; this must end with `/artifactory`.
3. Enter your username and password.
4. Select **Save**.
Artifactory repository setup
{% hint style="info" %}
If you do not see the Snyk Broker on/off switch, you do not have the necessary permissions and can only add a publicly accessible instance.
Submit a request to [Snyk Support](https://support.snyk.io) if you want to add a private registry.
{% endhint %}
When you have the permissions needed to add a private registry, continue with the instructions to [generate a Broker token from the Web UI](#generate-a-broker-token-from-the-web-ui).
## Generate a Broker token from the Web UI
1. In the Artifactory integration settings, move the Snyk Broker on/off switch to **on** to display a form for generating a Broker token.
2. Select **Generate and Save**.
3. Copy the token that was generated to use when you set up the Broker Client.
---
# Source: https://docs.snyk.io/implementation-and-setup/enterprise-setup/snyk-broker/classic-broker/install-and-configure-snyk-broker/artifactory-repository-install-and-configure-broker/artifactory-repository-install-and-configure-using-docker.md
# Artifactory Repository - install and configure using Docker
{% hint style="info" %}
**Feature availability**
Integration with Artifactory Repository is available only with Enterprise plans. For more information, see [plans and pricing](https://snyk.io/plans/).
{% endhint %}
Before installing, review the [prerequisites](https://docs.snyk.io/implementation-and-setup/enterprise-setup/snyk-broker/classic-broker/install-and-configure-snyk-broker/artifactory-repository-install-and-configure-broker) and the general instructions for installation using [Docker](https://docs.snyk.io/implementation-and-setup/enterprise-setup/snyk-broker/classic-broker/install-and-configure-snyk-broker/install-and-configure-broker-using-docker).
This integration is useful to ensure a secure connection with your on-premise Artifactory Repository deployment.
For information about non-brokered integration with Artifactory Repository, see [Artifactory Repository setup](https://docs.snyk.io/scan-with-snyk/snyk-open-source/package-repository-integrations/artifactory-package-repository-connection-setup). For information about brokered integration with Artifactory Container Registry, see [Snyk Broker - Container Registry Agent](https://docs.snyk.io/implementation-and-setup/enterprise-setup/snyk-broker/snyk-broker-container-registry-agent).
## Configure Broker to be used for Artifactory Registry
To use the Broker client with an Artifactory Registry deployment, run `docker pull snyk/broker:artifactory`. Refer to [Artifactory Repository - environment variables for Snyk Broker](https://docs.snyk.io/implementation-and-setup/enterprise-setup/snyk-broker/classic-broker/install-and-configure-snyk-broker/artifactory-repository-install-and-configure-broker/artifactory-repository-environment-variables-for-snyk-broker) for definitions of the environment variables.
## Docker run commands to set up a Broker Client for Artifactory Repository
{% hint style="info" %}
**Multi-tenant settings for regions**\
When installing, you must add a command in your script to set the `BROKER_SERVER_URL`. This is the URL of the Broker server for the region where your data is hosted. For the commands and URLs to use, see [Broker URLs](https://docs.snyk.io/snyk-data-and-governance/regional-hosting-and-data-residency#broker-server-urls).
{% endhint %}
Copy the following command to set up a fully configured Broker Client to use with Artifactory Registry. You can run the Docker container by providing the relevant configuration:
```console
docker run --restart=always \
-p 8000:8000 \
-e BROKER_TOKEN=secret-broker-token \
-e BROKER_SERVER_URL= \
-e ARTIFACTORY_URL=.artifactory.com/artifactory \
snyk/broker:artifactory
```
For an npm or Yarn integration, use the following command.
```console
docker run --restart=always \
-p 8000:8000 \
-e BROKER_TOKEN=secret-broker-token \
-e ARTIFACTORY_URL=acme.com/artifactory \
-e RES_BODY_URL_SUB=http://acme.com/artifactory \
snyk/broker:artifactory
```
## Start the Broker Client container and verify the connection with Artifactory Repository
Paste the Broker Client configuration to start the Broker Client container.
You can check the status of the connection by refreshing the Artifactory Integration Settings page. When the connection is set up correctly, there is no connection error.
---
# Source: https://docs.snyk.io/implementation-and-setup/enterprise-setup/snyk-broker/classic-broker/install-and-configure-snyk-broker/artifactory-repository-install-and-configure-broker/artifactory-repository-install-and-configure-using-helm.md
# Artifactory Repository - install and configure using Helm
{% hint style="info" %}
**Feature availability**
Integration with Artifactory Repository is available only with Enterprise plans. For more information, see [plans and pricing](https://snyk.io/plans/).
{% endhint %}
Before installing, review the [prerequisites](https://docs.snyk.io/implementation-and-setup/enterprise-setup/snyk-broker/classic-broker/install-and-configure-snyk-broker/artifactory-repository-install-and-configure-broker) and the general instructions for installation using [Helm](https://docs.snyk.io/implementation-and-setup/enterprise-setup/snyk-broker/classic-broker/install-and-configure-snyk-broker/install-and-configure-broker-using-helm).
For information about non-brokered integration with Artifactory Repository, see [Artifactory Repository setup](https://docs.snyk.io/scan-with-snyk/snyk-open-source/package-repository-integrations/artifactory-package-repository-connection-setup). For information about brokered integration with Artifactory Container Registry, see [Snyk Broker - Container Registry Agent](https://docs.snyk.io/implementation-and-setup/enterprise-setup/snyk-broker/snyk-broker-container-registry-agent).
To use this chart, you must first add the Snyk Broker Helm Chart by adding the repo:
`helm repo add snyk-broker https://snyk.github.io/snyk-broker-helm/`
Then, run the following commands to install the Broker and customize the environment variables. For definitions of the environment variables see [Artifactory Repository - environment variables for Snyk Broker](https://docs.snyk.io/implementation-and-setup/enterprise-setup/snyk-broker/classic-broker/install-and-configure-snyk-broker/artifactory-repository-install-and-configure-broker/artifactory-repository-environment-variables-for-snyk-broker).
For `artifactoryUrl` values, do not include `https://`.
{% hint style="info" %}
**Multi-tenant settings for regions**\
When installing, you must add a command in your script to set the `brokerServerUrl`. This is the URL of the Broker server for the region where your data is hosted. For the commands and URLs to use, see [Broker URLs](https://docs.snyk.io/snyk-data-and-governance/regional-hosting-and-data-residency#broker-server-urls).
{% endhint %}
```console
helm install snyk-broker-chart snyk-broker/snyk-broker \
--set scmType=artifactory \
--set brokerToken= \
--set brokerServerUrl= \
--set artifactoryUrl= \
-n snyk-broker --create-namespace
```
You can pass any environment variable of your choice in the Helm command. For details, see [Custom additional options for Broker Helm Chart](https://docs.snyk.io/implementation-and-setup/enterprise-setup/snyk-broker/classic-broker/install-and-configure-snyk-broker/advanced-configuration-for-helm-chart-installation/custom-additional-options-for-broker-helm-chart-installation). Follow the instructions for [Advanced configuration for Helm Chart installation](https://docs.snyk.io/implementation-and-setup/enterprise-setup/snyk-broker/classic-broker/install-and-configure-snyk-broker/advanced-configuration-for-helm-chart-installation) to make configuration changes as needed.
You can verify that the Broker is running by looking at the settings for your brokered integration in the Snyk Web UI to see a confirmation message that you are connected. You can start importing Projects once you are connected.
---
# Source: https://docs.snyk.io/snyk-api/reference/asset.md
# Asset
{% hint style="info" %}
This document uses the REST API. For more details, see the [Authentication for API](https://docs.snyk.io/snyk-api/authentication-for-api) page.
{% endhint %}
{% openapi src="" path="/groups/{group\_id}/assets/{asset\_id}" method="patch" %}
[rest-spec.json](https://2533899886-files.gitbook.io/~/files/v0/b/gitbook-x-prod.appspot.com/o/spaces%2F-MdwVZ6HOZriajCf5nXH%2Fuploads%2Fgit-blob-42ebe7ebbe084db5ba66cf53a50453b68b5c9ab0%2Frest-spec.json?alt=media)
{% endopenapi %}
{% openapi src="" path="/groups/{group\_id}/assets/{asset\_id}" method="get" %}
[rest-spec.json](https://2533899886-files.gitbook.io/~/files/v0/b/gitbook-x-prod.appspot.com/o/spaces%2F-MdwVZ6HOZriajCf5nXH%2Fuploads%2Fgit-blob-42ebe7ebbe084db5ba66cf53a50453b68b5c9ab0%2Frest-spec.json?alt=media)
{% endopenapi %}
{% openapi src="" path="/groups/{group\_id}/assets/{asset\_id}/relationships/projects" method="get" %}
[rest-spec.json](https://2533899886-files.gitbook.io/~/files/v0/b/gitbook-x-prod.appspot.com/o/spaces%2F-MdwVZ6HOZriajCf5nXH%2Fuploads%2Fgit-blob-42ebe7ebbe084db5ba66cf53a50453b68b5c9ab0%2Frest-spec.json?alt=media)
{% endopenapi %}
{% openapi src="" path="/groups/{group\_id}/assets/{asset\_id}/relationships/assets" method="get" %}
[rest-spec.json](https://2533899886-files.gitbook.io/~/files/v0/b/gitbook-x-prod.appspot.com/o/spaces%2F-MdwVZ6HOZriajCf5nXH%2Fuploads%2Fgit-blob-42ebe7ebbe084db5ba66cf53a50453b68b5c9ab0%2Frest-spec.json?alt=media)
{% endopenapi %}
{% openapi src="" path="/groups/{group\_id}/assets/search" method="post" %}
[rest-spec.json](https://2533899886-files.gitbook.io/~/files/v0/b/gitbook-x-prod.appspot.com/o/spaces%2F-MdwVZ6HOZriajCf5nXH%2Fuploads%2Fgit-blob-42ebe7ebbe084db5ba66cf53a50453b68b5c9ab0%2Frest-spec.json?alt=media)
{% endopenapi %}
---
# Source: https://docs.snyk.io/manage-risk/prioritize-issues-for-fixing/assets-and-risk-factors.md
# Assets and risk factors
The capabilities of the Snyk Web UI **Issues** menu rely on understanding your application context to help you better prioritize your security issues. Snyk analyzes how your application is configured and uses that information to triage and prioritize your assets and issues on the Snyk Essentials plan, adding specific [risk factors](#risk-factors) and [evidence graph](https://docs.snyk.io/manage-risk/prioritize-issues-for-fixing/using-the-issues-ui/evidence-graph) information.
* [Assets](#assets) are analyzed using Snyk Issues, focusing on images, Kubernetes resources, and packages to understand how they all interact with each other.
* [Risk factors](#risk-factors) are analyzed using Snyk Issues and grouped into three main categories:
* [Deployed](https://docs.snyk.io/manage-risk/prioritize-issues-for-fixing/assets-and-risk-factors/risk-factor-deployed)
* [OS conditions](https://docs.snyk.io/manage-risk/prioritize-issues-for-fixing/assets-and-risk-factors/risk-factor-os-condition)
* [Public facing](https://docs.snyk.io/manage-risk/prioritize-issues-for-fixing/assets-and-risk-factors/risk-factor-public-facing)
## Assets
Snyk Issues analyzes the assets described on this page.
### Images
Images are assets that represent a Docker image. Snyk Container performs security scans on the Docker images. Images can be mapped one to many with the Snyk Projects created by the scans performed by Snyk Container. Docker images have natural IDs, which are represented by SHAs. Snyk uses this natural ID to correlate the same images even if they are mapped to different Snyk Projects.
### Kubernetes resources
Kubernetes resources are assets that represent a Kubernetes object. The Kubernetes Connector collects resource information from the Kubernetes clusters.
Kubernetes resources do not map to Snyk Projects. They are internal entities used to compute certain risk factors, detailed in the rest of this page. These risk factors can be related to packages and images.
### Packages
Packages are assets that represent a software package. Snyk Open Source and Snyk Code products perform security scans on files. These files represent the package manager declaration and the source code of a software package, respectively. A package is a representation of that software package.
Packages can be mapped one to one with the Snyk Projects created by the scans performed by Snyk Open Source and Snyk Code. All the issues identified by these products and attributed to these Projects will be mapped to the package entity.
The term Package is a very coarse abstraction. It does not have versions. It is a representation of the current state of the software package at a point in time. The point in time is determined by the time when the Snyk processing pipeline is completed and the state of Snyk Projects at that time.
Snyk Open Source uses the word package to refer to the third-party dependencies declared in the package dependency manifest. Snyk does not currently expose the granularity of the third-party dependencies. However, from the prioritization data model point of view, there is no distinction between third-party and first-party packages. These would be represented as a package object at that point in time.
## Risk factors
By understanding your images, packages, and Kubernetes resources as "application context", Snyk can compute the following risk factors:
* [Deployed](https://docs.snyk.io/manage-risk/prioritize-issues-for-fixing/assets-and-risk-factors/risk-factor-deployed)
* [OS condition](https://docs.snyk.io/manage-risk/prioritize-issues-for-fixing/assets-and-risk-factors/risk-factor-os-condition)
* [Public facing](https://docs.snyk.io/manage-risk/prioritize-issues-for-fixing/assets-and-risk-factors/risk-factor-public-facing)
You can enable and disable all of these "application context" risk factors through the Group **Settings**, on the **Issues** tab. If you choose to disable a risk factor, a provider selection, or the Kubernetes cluster mapping, Snyk will no longer compute them.
Depending on the integration options enabled for your application, risk factors are applied differently. You can [prioritize your integrations](https://docs.snyk.io/manage-risk/set-up-insights#prioritize-your-integrations) by customizing the available Insights options from the Group settings.
Risk factors are supported for stateful entities such as the following Kubernetes components: [StatefulSet](https://kubernetes.io/docs/concepts/workloads/controllers/statefulset/), [DaemonSet](https://kubernetes.io/docs/concepts/workloads/controllers/daemonset/), and [Deployment](https://kubernetes.io/docs/concepts/workloads/controllers/deployment/).
Group settings - Insights tab in the Group settings
{% hint style="info" %}
Risk factor settings may take up to four hours to take effect.
{% endhint %}
---
# Source: https://docs.snyk.io/manage-assets/assets-inventory-components.md
# Assets inventory components
Each inventory layout is presented in a table format, detailing the available key attributes:
* [Assets](#asset)
* [Issues](#issues)
* [Coverage Controls](#coverage-controls)
* [Tags](#tags)
* [Labels](#labels)
* [Developers](#developers)
* [Class](#class)
* [Risk factors](#risk-factors)
* [Source](#source)
* [SCM Repository freshness](#scm-repository-freshness)
* [Clusters](#clusters)
* [Organizations](#organizations)
* [Visibility](#visibility)
## Asset
Assets in Snyk Essentials are meaningful, real-world components in an application’s SDLC. The following asset types are available:
* Repository assets [**`Billable`**](https://docs.snyk.io/snyk-data-and-governance/how-does-snyk-count-assets#billable-assets)
* Container images [**`Billable`**](https://docs.snyk.io/snyk-data-and-governance/how-does-snyk-count-assets#billable-assets)
* Packages
* Scanned artifacts
An asset can be the parent of multiple items. For example, a repository asset usually contains one or more package assets.
The Asset column shows the name of the repository asset, package, scanned artifact, or container image. Click the arrow next to the parent asset name to expand the list of all contained items.
You can copy the name of an asset or browse the repository. Each asset has a menu at the end of the row. Click the menu, then select **Copy** to copy the URL or **Browse** to navigate to the asset repository.
### Repository assets
Repository assets represent SCM repositories. A repository asset is created by discovering the repositories directly in the SCM, when such an integration is configured. Alternatively, a repository asset can be created by scanning a repository (by Snyk or third-party tools), as long as the scanned code is identified with a specific repository (in Snyk, this means filling in the `gitRemoteURL` parameter).
If you scan the code locally using the CLI, with no association to a repository, a repository asset is not created. For more details about CLI commands, see [Scanning methods](https://docs.snyk.io/scan-with-snyk/snyk-essentials#scanning-methods).
### Container Image assets
You can identify a container image based on the Image ID. If multiple container images have the same Image ID, then only one image asset is generated for that Image ID, enriched with information from all the identified container images for that ID.
Snyk Essentials retrieves all image assets from Snyk Container. Reimport the images to ensure you scan the latest image. If you run a new scan on a Project that contains image assets, it rescans the same image for new vulnerabilities. To identify new image assets, you need to first reimport, and then scan the Project. Check the [Detect application vulnerabilities in container images](https://docs.snyk.io/scan-with-snyk/snyk-container/use-snyk-container/detect-application-vulnerabilities-in-container-images) page for more details.
### Packages
Packages in Snyk Essentials are defined as software or libraries that are managed by package management systems.
Package assets are created when you scan the dependencies of a Project through package management systems or by using the Snyk CLI. This enables Snyk Essentials to identify and analyze the security vulnerabilities of the packages used within a Project, offering insights into possible risk exposures and providing recommendations for mitigation.
### Scanned artifacts
A scanned artifact in Snyk Essentials is an entity detected by Snyk that cannot be identified as a repository asset because it does not include identifying information, such as a Git remote URL.
Scanned artifacts provide users with visibility into what Snyk Essentials detects from scans but require additional troubleshooting.
You can find scanned artifacts in the Inventory Type view, but they are not supported by Policies. Scanned artifacts may include duplicates due to missing identifying information.
## Asset tabs
The asset information is divided into the following tabs:
* [Summary](#summary)
* [Related Assets](#related-assets)
* [Related Projects](#related-projects)
* [Attributes](#attributes)
### Summary
The Summary tab is a concentrated view of asset properties. The Summary screen presents you with the following information:
* **Info**
* Class - specifies the business criticality of the asset.
* Source - specifies the origin of the asset.
* Visibility - lists the visibility status of the repositories.
* Risk factors - provides the list of active risk factors.
* SCM Repository freshness - provides the current status of your repositories, including the date of the last commit.
* **Organization** - specifies the Organizations associated with the asset.
* **Labels** - provides the list of all labels available for that asset.
* **Tags -** provides a key-value pair that allows you to attach structured metadata to your assets.
* **Issues** - categorizes the identified types of open issues.
* **App Context**\* - asset metadata from App Context integrations, such as Backstage catalog or ServiceNow CMDB, can include the following attributes: catalog name, category, application, owner, and so on.
\*App Context information is visible only when the asset is part of a Project for which the application context was configured.
{% hint style="info" %}
After you apply filters, the assets list displays only the assets that directly match the filter conditions. If available, a list of child assets related to the selected asset is also shown in a table format, focusing on: Asset (name), Issues, Controls, and Class.
{% endhint %}
Asset card - Assets Summary view
### Related Assets
The Related Assets tab provides a detailed view of assets related to the selected one. Use this tab to assess scanning coverage or asset ownership. You can see the details of a related asset by clicking on it; usually, these are Package assets. When looking at Related Assets, a link to the parent repository appears at the top. Clicking the parent asset link returns you to the initial view of the parent asset.
Asset card - Assets Summary view of a child asset
### Related Projects
The Related Projects tab provides a collection of Snyk Projects that are associated with a specific asset within the platform. These projects are arranged in a table format, enabling you to view relevant information that assists in managing and assessing vulnerabilities related to the asset. Each Project is displayed with the following details:
* **Projects by Target**: A list of Projects grouped by targets. You can see both the Project name and the target name under which the Project is grouped.
* **Target Reference**: An optional identifier that may not always be available.
* **Test Surface**: Indicates the source of the Project scan, specifying whether it originated from SCM or the CLI.
* **Issues**: Provides insight into the number and severity of identified issues within the Project.
* **Organization**: Displays the Snyk Organization to which the Project belongs.
* **Tested**: Shows the relative time since the last scan (for example, "3 hours ago") along with a tooltip that reveals the full date and time upon hovering.
Projects are sorted by Target, Target Reference, and Tested date. This makes it easy to find related Projects to monitor and fix.
Asset card - Assets Related Projects view
### Attributes
The Attributes tab shows miscellaneous attributes, such as the Asset ID or Asset Type, that are fetched from the data source but do not have a dedicated column. The benefit of this information is not only that it is displayed but, more importantly, that it is searchable. You can search for an attribute by using either the inventory search bar or the filters.
Asset card - Assets Attributes view
## Issues
The Issues column is designed to present a comprehensive list of issues that have been identified within your assets. These findings are the result of scans performed by Snyk as well as internal tools you may have deployed. This detailed list not only helps in understanding the security posture of your assets but also in prioritizing remediation efforts based on the severity and impact of each issue. By having visibility into these issues, you can take proactive steps toward improving the overall security of your applications and infrastructure.
Most of the issues are mapped to an asset. However, some of the issues are associated with an asset but not directly linked to it. This is the case with image assets.
You can see these assets under the **Inventory** view, and they will also be reflected under the **Asset and Source Code** column from the **Insights UI** view.
The **Issues** column from the Asset view is designed to present an aggregated count of open issues. These counts are categorized based on the severity level of the issues found in assets, their child assets, or associated packages. The severity is divided into four distinct levels:
* **C** (Critical): Issues that represent a serious threat and should be addressed immediately to prevent potential exploits or major disruptions.
* **H** (High): These are significant issues that, while not immediately dangerous, could potentially lead to critical vulnerabilities if not resolved in a timely manner.
* **M** (Medium): Issues of medium severity might not pose an immediate threat but are still important to fix as part of regular maintenance to improve overall security and functionality.
* **L** (Low): These are considered minor issues that have a low impact on the security of the system and operation but should still be addressed to maintain optimal performance and prevent future vulnerabilities.
This classification streamlines prioritization, helping you focus on critical areas and optimize remediation.
## Coverage Controls
The Controls column displays all of the Snyk products that were executed on a specific repository asset. This column displays, in circles, a logo for each Snyk product. The logo icon itself has an indication of the highest severity of issues from this source. For example, if the highest severity issue is **C** (critical), you can see a red dot on the control icon.
The Controls logos can have one of the following states:
| Logo | Description |
| ----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- | -------------------------------------------------------------------------------- |
| | The Snyk product was executed. |
| | The Snyk product was executed but with issues. |
| | The Snyk product should have been executed but was not executed. |
| | The Snyk product was executed and failed. |
| | The Snyk product was executed and failed with issues. |
| | The Snyk product was executed and failed due to not being covered by the policy. |
Click a Controls logo to see **Last test** details and the **Issues** count, split by severity. This reflects the most recent time that the asset was scanned by a specific product.
Inventory menu - Controls column
## Tags
Provides a key-value tagging capability that allows you to attach specific, structured metadata to your assets. This feature enables granular filtering, robust policy creation, and better alignment with your internal systems. **Example:** A structured tag provides both a key and a value, such as `platform:aws` or `region:eu-central-1`.
### How to filter assets by tags
In Snyk Web UI, you can filter assets by their tags using **Advanced filters**. You can define filters based on specific criteria, such as a property of an asset, a condition, and a value.
* Filter by `Tags`: The new `Tags` filter is a key-value pair filter. This filter allows you to select a specific tag key such as `department` and then choose a corresponding value such as `finance` to narrow down the asset list.
Advanced Filters - Tags
## Labels
Asset labels are metadata that is applied to repository assets and build artifacts. You can use asset labels to tag based on predefined values, manage and apply security policies, and group assets based on common characteristics. The following asset types are available:
* **GitHub custom properties** - lists the GitHub custom properties associated with your GitHub repository as a label.
* **User-defined labels** are customizable; you define their logic through Assets Policies. For example, you can set labels to represent a repository that comes from a specific source, such as GitHub. Labels associated with assets are identified in the UI by the **Asset policy label** name.
* **System labels** are automatically assigned by Snyk based on asset names or detected keywords (for example, `codeowners`).
A repository asset label can be added through Policies or be system-generated by Snyk Essentials to provide more context. Click on a labels field to view all labels.
{% hint style="info" %}
Bitbucket Cloud cannot automatically detect the language used in the source code of repositories. In Snyk Essentials, you can only see the language labels that have been manually added for Bitbucket Cloud.\
Language data is not available for Bitbucket Server.\
For more information, refer to the official Bitbucket documentation.
{% endhint %}
A system-generated label includes the following information:
* **Technology** - The languages detected by Snyk Essentials in the source code within a repository asset.
* **SCM Topic** - The topics found in the integrated SCM repositories. Snyk Essentials supports topics from GitHub and GitLab.
* **Asset type label** - The label explaining the type of the asset. For example, container assets will be assigned an image asset label.
* **SCM Repository freshness** - The status of the repository and the date of the last commit.
* **Active**: Had commits in the last 3 months.
* **Inactive**: The last commits were made in the last 3 - 6 months.
* **Dormant**: No commits in the last 6 months.
* **N/A**: There are no commits detected by Snyk Essentials.
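The freshness buckets above can be sketched as a small date comparison. The function below is illustrative logic under the stated thresholds, not Snyk Essentials source code.

```python
from datetime import datetime, timedelta, timezone

def repository_freshness(last_commit, now=None):
    """Classify repository freshness from the last commit date, mirroring
    the Active / Inactive / Dormant / N/A buckets described above."""
    if last_commit is None:
        return "N/A"                    # no commits detected
    now = now or datetime.now(timezone.utc)
    age = now - last_commit
    if age <= timedelta(days=90):       # commits in the last 3 months
        return "Active"
    if age <= timedelta(days=180):      # last commits 3 to 6 months ago
        return "Inactive"
    return "Dormant"                    # no commits in the last 6 months
```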
### Labels rules overview
Inventory menu - Labels column
Labels are organized into three main categories:
* GitHub custom properties
* Asset policy labels
* System labels
System labels are automatically generated from the SCM repositories. System labels can be classified into three main categories:
* Languages:
* This applies to GitHub, GitLab, Azure DevOps, and Bitbucket, as long as the data is available in the repository.
* GitHub, GitLab, and Azure DevOps have automated language detection. By contrast, Bitbucket requires users to set the language in their repositories.
* SCM Topic:
* This applies to GitHub and GitLab.
* Keyword-based rules, derived from words found in the repositories:
* This applies to GitHub, GitLab, Azure DevOps, and BitBucket.
### Labeling policy
You can use pre-defined system labels and asset labels to mark the repositories that meet your filter criteria. Check the following [Labeling policy](https://docs.snyk.io/manage-risk/policies/assets-policies/use-cases-for-policies/tagging-policy) use case.
### Labeling rules related to metadata
| Rule | Label |
| --------------------------------------------------------------------- | ----------------- |
| Snyk Essentials found technologies in use. | `<technologies>` |
| Snyk Essentials found languages from the SCM. | `` |
| Snyk Essentials detected a new repository created in the last 7 days. | `new repository` |
| Snyk Essentials found the code Project with the code owner. | `codeowners` |
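The rules in the table above can be sketched as a simple mapping from repository metadata to labels. The repository dictionary shape below is an assumption for illustration, not Snyk's data model.

```python
from datetime import datetime, timedelta, timezone

def metadata_labels(repo, now=None):
    """Apply the metadata labeling rules from the table above to a
    hypothetical repository record and return the resulting labels."""
    now = now or datetime.now(timezone.utc)
    labels = []
    labels.extend(repo.get("technologies", []))        # technologies in use
    labels.extend(repo.get("scm_languages", []))       # languages from the SCM
    if now - repo["created_at"] <= timedelta(days=7):  # created in the last 7 days
        labels.append("new repository")
    if repo.get("has_codeowners"):                     # code owner found
        labels.append("codeowners")
    return labels
```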
In the Snyk web interface, you can filter assets by their labels using the **Advanced Filters** option. You can define filters based on highly specific criteria, such as a property of an asset, a condition, and a value.
* Filter by `labels`: This filter allows you to select a specific label.
Advanced Filters - By Labels
## Developers
You can see the list of all the developers that worked on that specific asset. The details list includes the SCM profile details for code committers to the repository asset.
## Class
Reflects the business criticality of the asset from A (most critical) to D (least critical), as you defined it in the Policies view.
You can change the business criticality of an asset manually. Click the criticality level and select another one from the list.
After manually setting the value of a class, you have the option to lock the value to prevent any potential overriding by a policy that has the Set Asset Class as an action. You can lock the value from the general or summary views of an asset. You can unlock the class value at any time by clicking the lock icon. A pop-up is displayed, asking you for confirmation about unlocking the value.
Inventory menu - Lock the value of a class
The Asset Class column is available on the Insights UI for risk-based prioritization, and it has the same functionality as it does here. At the moment, the Asset Class column is available only for repository assets, and applicable only for Snyk Code.
{% hint style="info" %}
Synchronization between the Asset Class and the Insights UI can take up to 3 hours.
{% endhint %}
The class value can also be auto-generated with policies: create a policy with the **Set Asset Class** action.
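Conceptually, the lock prevents a **Set Asset Class** policy action from overriding a manually set value. A minimal sketch, assuming hypothetical `class` and `class_locked` fields:

```python
def apply_set_asset_class(asset, policy_class):
    """Apply a Set Asset Class policy action, honoring a manual lock.
    A locked class value is never overridden by the policy."""
    if asset.get("class_locked"):
        return asset["class"]      # manual lock wins over the policy
    asset["class"] = policy_class  # otherwise the policy sets the class
    return asset["class"]
```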
## Risk factors
The Risk Factors column lists the potential vulnerabilities and security threats associated with each asset. These risk factors help users identify specific risks, enabling them to prioritize and address issues more effectively. By understanding the particular risks tied to their assets, users can take more informed remedial actions.
Here is a list of the available risk factors:
* [Deployed](https://docs.snyk.io/manage-risk/prioritize-issues-for-fixing/assets-and-risk-factors/risk-factor-deployed)
* [Loaded package](https://docs.snyk.io/manage-assets/broken-reference)
* [OS Condition](https://docs.snyk.io/manage-risk/prioritize-issues-for-fixing/assets-and-risk-factors/risk-factor-os-condition)
* [Public facing](https://docs.snyk.io/manage-risk/prioritize-issues-for-fixing/assets-and-risk-factors/risk-factor-public-facing)
## Source
The Source column in Snyk Essentials helps users identify the origin of their assets, which can be directly from Snyk, through SCM systems, or using third-party integrations. This feature simplifies asset management and risk prioritization by providing clear visibility into the origin of each asset and it enables more effective security strategies and remediation efforts.
## SCM Integrations
The SCM Integrations column indicates how each SCM was integrated into Snyk at the Group or Organization level. Full context enrichment is available at the Group level, while testing is available at the Organization level.
The column is hidden by default; you can enable it in the **Columns** section. It shows the following values:
* **Snyk Org**: The Snyk Organization-level integration is used for import and testing.
* **Snyk Essentials**: The Snyk Group-level integration is used for discovery and asset enrichment.
## SCM Repository freshness
The SCM Repository freshness column provides you with an immediate understanding of the current status of your repositories, including the date of the last commit. This assists you in quickly identifying active and dormant Projects and helps you make decisions regarding maintenance, security patching, and resource allocation.
The repository freshness displays the repository status according to the last commit date:
* **Active**: Had commits in the last 3 months.
* **Inactive**: The last commits were made in the last 3 - 6 months.
* **Dormant**: No commits in the last 6 months.
* **N/A**: Commits data is unavailable.
## Clusters
The Clusters column lists all cluster names where an image is deployed and is using the runtime integrations as the source of the information. When an image is removed from a cluster, the cluster name is also deleted from the collection. Clusters are also available under Filters and allow you to filter assets in the Inventory view or to create policies in the Policies view.
{% hint style="info" %}
The Cluster column is populated only when the Snyk Runtime Sensor is utilized.
{% endhint %}
## Organizations
The Organizations column lists all Snyk Organizations associated with each asset. This includes the names of Organizations that contain Projects linked to the asset, enabling users to filter and organize their asset inventory based on their organizational structures. Organizations are also available under Filters and allow you to filter assets in the Inventory view or to create policies in the Policies view.
## Visibility
The Visibility column lists the visibility status of the repositories as follows:
* **Public**: Repositories that are publicly accessible.
* **Private**: Restricted repositories.
* **Internal**: Internal repositories specific to GitHub and GitLab.
* **N/A**
Use this metadata to prioritize risk and apply visibility-based coverage controls. The column is unavailable for image assets and is excluded from [report filters](https://docs.snyk.io/manage-risk/reporting#snyk-reporting-filters).
## Actions
The Actions column provides a workflow to set up the SCM integration at the Group level to access full context enrichment. To identify the type of integration, check the [SCM Integrations column](#scm-integrations). This use case applies where a Group-level integration is not configured.
If a Group level integration has not been set up, repositories discovered at the Organization level display a **Set up integration** button under the **Actions** column. If you set up the integration at the Group level, this option becomes unavailable.
To add context enrichment, find an asset and select **Set up integration**. For configuration details, see [Snyk SCM Integrations](https://docs.snyk.io/developer-tools/scm-integrations/organization-level-integrations).
Set up SCM integration at the Organization level from the Actions column
---
# Source: https://docs.snyk.io/manage-assets/assets-inventory-filters.md
# Assets inventory filters
From the **Inventory** > **All Assets** tab, you can use the search bar to look for specific keywords across assets. Results can include the asset name and data retrieved from the **Attributes** tab of an asset.
## Quick filters
Quick filters are predefined filters that you can apply to assets. Available quick filters:
* **Assets with Risk factors `DEPLOYED` and `PUBLIC FACING` and `COVERAGE GAP`**: displays only the assets that have both `Deployed` and `Public facing` risk factors, and that have a coverage gap for the selected Snyk products.
* **Assets with Risk factors `DEPLOYED` and `COVERAGE GAP`**: displays only the assets that have the `Deployed` risk factor, and with a coverage gap for the selected Snyk products.
* **Assets with Repository freshness** **`ACTIVE` and `COVERAGE GAP`**: displays only the assets from active repositories and with a coverage gap for the selected Snyk products.
* **Assets with Asset Class `A` and `COVERAGE GAP`**: displays only Class A assets that have a coverage gap for the selected Snyk products.
You can change or add additional filters by clicking **Advanced Filters**.
## Advanced filters
Using advanced filters, you can define and apply filters to assets based on specific criteria. For details on how to define filters, see [Define filters](https://docs.snyk.io/manage-risk/policies/assets-policies/create-policies#define-filters).
When you select advanced filters, you can specify one or more sets of criteria:
* **Property**: a characteristic of the asset. You can select it from a dropdown list.
* **Condition:** depends on the asset selected (such as `contains` or `does not contain` for `asset name`).
* **Value:** depends on the **Property** and **Condition**.
You can add as many filters as needed by clicking **Add Filter**.
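A filter set of property, condition, and value triples can be evaluated as shown below. The condition names and asset schema are illustrative assumptions, not the Snyk filter engine.

```python
# Supported conditions for this sketch; `contains` / `does not contain`
# mirror the examples given for `asset name`.
CONDITIONS = {
    "contains": lambda prop, value: value in prop,
    "does not contain": lambda prop, value: value not in prop,
    "equals": lambda prop, value: prop == value,
}

def matches(asset, filters):
    """Return True when the asset satisfies every (property, condition, value)."""
    return all(
        CONDITIONS[cond](asset.get(prop, ""), value)
        for prop, cond, value in filters
    )

assets = [{"asset name": "payments-api"}, {"asset name": "docs-site"}]
hits = [a for a in assets if matches(a, [("asset name", "contains", "api")])]
```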
{% hint style="info" %}
If you are using Snyk Essentials for the first time, Snyk recommends starting with the **Coverage** filter to determine where Snyk is already implemented.
{% endhint %}
You can filter the information for all inventory layouts using the available filters in the **Advanced Filters** section.
{% hint style="info" %}
The filters **Application**, **Catalog name**, **Category**, **Lifecycle**, **Owner**, **Title** are visible only if you have configured the [application context](https://docs.snyk.io/developer-tools/scm-integrations/application-context-for-scm-integrations) catalog for your SCM integrations.
{% endhint %}
* **Application**: the applications for which you have configured the application context catalog in Snyk Essentials.
* **Asset ID**: the unique identifier of the asset.
* **Asset name**: the name of the asset.
* **Asset type**: the repository, package, or scanned asset.
* **Attribute**: the asset attributes.
* **Catalog name**: the name of your application context catalog.
* **Category**: the category of a repository asset. For example, a service or a library.
* **Class**: the class of the asset.
* **Clusters**: the cluster names where the asset is deployed. An asset can be deployed in more than one cluster.
* **Coverage**: the Snyk products used to scan the asset. This filter identifies the assets scanned by the products at least once.
* **Coverage gap**: the products for which the asset has not been scanned and that do not meet the Set Coverage Control Policy requirements. The coverage gap applies only if you previously defined the coverage requirements of an asset and the asset has never been scanned, or the last scan is older than the default scanning frequency.
* **Developers**: the developers who contributed to the asset.
* **Discovered**: the period when the asset was discovered.
* **Issue severity**: the severity of the issue: critical, high, medium, or low.
* **Issue source**: where the issue was identified - SCM or third-party integrations. A source category is visible only if there is at least one source present.
* **Last seen:** the most recent time the asset was detected by Snyk in any of the sources.
* **Lifecycle:** the lifecycle state of the application context catalog component.
* **Locked attributes**: specifies if the attribute value is locked.
* **Organizations**: all the Snyk Organizations that are mapped to an asset.
* **Owner:** the team owning the repository for which the application context catalog was configured.
* **Related package**: a package related to the asset.
* **Repository freshness:** the status of the repository and the date of the last commit.
* **Active**: with commits made in the last 3 months.
* **Inactive**: with commits made in the last three to six months.
* **Dormant**: with no commits made in the last six months.
* **N/A**: with no commits detected by Snyk Essentials. This filter indicates that the repository was detected through a Snyk scan but not directly from the SCM. To detect SCM repositories, you must set up SCM integration at the Group level.
* **Risk factors**: the available risk factors. Risk factors refer to assets that are vulnerable to security threats due to their exposure, sensitivity, compliance with security standards, and vulnerability history.
* **SCM Integrations**: specifies how the asset is integrated with Snyk - at Group or Org level.
* **SCM Organization**: the SCM Organization or Workspace where this asset is located.
* **SCM Project**: the Project in the Azure DevOps or Bitbucket SCM integrations, where this asset is located.
* **Source**: the source of the asset.
* **Tags**: information about the detected languages and repository update status.
* **Title**: the name of the component for which the application context catalog was configured.
* **Visibility:** the visibility status of the repositories. This can be:
* **Public**: repositories that are publicly accessible.
* **Private**: restricted repositories.
* **Internal**: internal repositories specific to GitHub and GitLab.
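The coverage-gap rule described under **Coverage gap** can be sketched as follows. The 30-day default frequency and the data shapes are assumptions for illustration.

```python
from datetime import datetime, timedelta, timezone

def coverage_gaps(required, last_scans, frequency=timedelta(days=30), now=None):
    """Return the required products with a coverage gap: a gap exists for a
    product that never scanned the asset, or whose last scan is older than
    the scanning frequency."""
    now = now or datetime.now(timezone.utc)
    gaps = []
    for product in required:
        last = last_scans.get(product)
        if last is None or now - last > frequency:
            gaps.append(product)
    return gaps
```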
## Filtering using the unenriched repository banner
The unenriched repositories banner highlights repositories not discovered by your Group-level SCM integration. These assets are instead discovered through Organization-level integrations or Snyk targets.
You can filter unenriched repositories directly from the banner by selecting the repository count.
#### Unenriched assets with Group SCM integration
If you use a Group-level integration, the banner shows assets not discovered through that integration. Although the integration is in place, some assets are not being pulled in. Possible reasons for unenriched assets:
* Organization-level integration has broader permissions than the Group-level integration.
* A repository previously tested by Snyk was deleted in the SCM before the Group-level integration was set up.
* Snyk CLI Projects have a different repository URL than the SCM repository URL.
* The SCM Organization is not explicitly configured in the Group-level integration.
* Assets detected by a non-SCM vendor (for example, GitGuardian).
## Troubleshooting
### The assets are not discovered by Group or Organization-level integrations
The assets are not discovered by Group or Organization-level integrations, but are discovered only through Snyk targets (for example, CLI Projects or old deleted repositories).
To resolve this issue, select the unenriched repositories that are not being discovered through Group-level SCM integration and filter SCM sources to isolate assets not discovered by integrations.
Example:
Check if the repository still exists in the SCM. If it was deleted, remove the asset by deleting its Snyk target.
If the repository was imported but not rediscovered, ensure the Group and Organization-level permissions for the SCM integration are correct.
### The assets are discovered by Organization-level integrations, but not by Group-level integrations
If the assets are discovered by Organization-level integrations but not by Group-level integrations:
* Check for a missing SCM Organization in a Group-level integration.
* Compare permissions between Group- and Organization-level integrations.
* Confirm if the asset was discovered through a different vendor integration.
### Deeper analysis of asset source
For deeper analysis, use the dashboard to see where assets are discovered and how they are enriched. Use the advanced filters to display only assets that have a source other than Snyk (for example, GitHub).
For GitHub and Azure, ensure that the Organization is included in the profile. To see the Organization name, click an asset and navigate to the **Attributes** tab.
If your profile includes the Organization, check the token permissions and ensure access to the unenriched repositories.
For GitLab and BitBucket, ensure that the Group-level tokens have access and the right permissions for the Organization.
If the source of the asset is another vendor, ensure the repository URLs match in order to avoid creating duplicate assets.
---
# Source: https://docs.snyk.io/manage-assets/assets-inventory-layouts.md
# Assets inventory tabs
Snyk defines an asset as a meaningful, real-world component in an application’s SDLC. Meaningful means that the component either carries risk or aggregates the risk of other components, such as repositories that contain packages. Real-world means that the concept exists outside of Snyk, for example, a repository.
Snyk Essentials inventory tabs organize your repository assets in meaningful ways, enabling you to:
* Gain full repository asset visibility from your SCM tools, including details about configured teams and repository code committers.
* Track controls coverage for Snyk products.
* Prioritize coverage mitigation efforts according to business impact.
* Use automatic repository discovery to surface repositories that have not yet been imported into Snyk to identify coverage gaps.
{% hint style="info" %}
Each line in the inventory represents an asset.
{% endhint %}
## Inventory tabs
To get better context and clarity over your asset inventory, Snyk Essentials allows flexible structuring with inventory tabs. Snyk Essentials includes five inventory tabs and groups assets by different contexts. You can find all inventory tabs under the Inventory menu option at the Group level:
* **Overview:** Provides quick insights into discovered repositories, enabling AppSec teams to effectively operationalize their program using Snyk.
* **All Assets:** All the discovered assets are grouped by their type.
* **Asset Hierarchy**: Shows assets in a hierarchical structure. The list of assets is sorted by issue counts, and, where applicable, the package assets are listed underneath the repositories where they are located. The Assets Hierarchy is visible only when no filters are applied.
* **Teams**: SCM repository assets are grouped by teams. On this tab, you can only see SCM Organizations with teams and repositories assigned to a team.
* **Technology**: SCM repository assets grouped by technology, as detected and tagged by Snyk Essentials.
Each inventory tab may include different counts of assets and scanned artifacts, depending on the grouping context. Otherwise, all columns and data manipulation features are the same on each inventory tab.
You can filter the information for all the inventory tabs and use any of the available filters listed on the [Assets inventory filters](https://docs.snyk.io/manage-assets/assets-inventory-filters) page.
### Inventory Overview
The Overview tab in Snyk Inventory provides insights into the discovered repositories, highlighting key characteristics such as the total number of discovered repositories, the distribution of tested and not-tested repositories, the number of dormant repositories, and coverage details based on asset policies.
These insights enable AppSec teams to effectively operationalize their program using Snyk, reduce coverage gaps, organize and leverage asset context, and ensure compliance with coverage policies.
#### Repositories tested
Use this widget to get an overview of all repositories discovered and the number of repositories that Snyk has not yet tested. Click the **Not Tested** section of the widget to see the full list of not-tested repositories. You can import all not-tested repositories into the correct Snyk Organization so that they can be tested.
#### Control coverage gaps
Use this widget to get a clear overview of all discovered repositories and see how many have at least one control coverage gap, as defined by an asset policy. A repository with a coverage gap is a repository that does not meet the coverage requirements set in the asset policy. The coverage gaps are automatically highlighted using a default policy applied to new Groups, helping you reduce application risk.
Follow the next steps to remediate the coverage gaps:
1. Click "Coverage gap" to see all affected repositories.
2. Determine the reasons for the policy non-compliance.
3. Remediate and bring repositories into compliance.
4. Set up an asset policy.
#### Dormant repositories
Use this widget to see all dormant repositories with critical and high-risk issues. A dormant repository is one that has not had any commits in the past six months. Based on this information, you can decide whether to decommission or fix stale repos.
#### Languages with most issues
Use this widget to identify the programming languages that often present issues within your codebase. If you hover over any of the listed languages, you can see and access the Snyk Learn training focused on setting up, integrating, and customizing the selected language.
#### Class A repositories with most high and critical issues
Use this widget to see up to the top ten high-risk Class A repositories with the biggest impact on the business. This helps your development team identify and prioritize remediation efforts with asset context. By addressing high-risk areas promptly, you improve the stability and security of your Project, ultimately enhancing software quality.
### All Assets
The All Assets tab under the Inventory menu provides a central view of all your assets, offering a comprehensive overview of your security posture. You can access a list of your assets and customize the view to meet your needs. Select the columns that you want to be visible, use filters to refine the information, and export the details to share them with others.
This unified view allows you to efficiently monitor assets and prioritize remediation for stronger application security.
### Asset Hierarchy
The Asset Hierarchy in Snyk Inventory organizes all assets in a structured, hierarchical format.\
Assets are sorted by issue counts, and where applicable, package assets are listed underneath the repositories where they are located.
The Asset Hierarchy is visible only when no filters are applied, allowing you to see a clear, unfiltered view of your assets and their relationships.
This layout helps in understanding the relationship between different assets and their associated issues, providing a comprehensive view of the asset landscape within your Organization.
### Teams
The Teams tab in Snyk Inventory organizes assets from SCM repositories by team. Assets are grouped here according to the teams assigned to them within the SCM organizations.
Only SCM organizations that have teams and repositories assigned to a team will appear in this layout. This helps in visualizing and managing repository assets according to team structures, making it easier to track and prioritize security efforts based on team responsibilities.
### Technology
The Technology tab in Snyk Inventory groups SCM repository assets by the technology they use, such as programming languages and frameworks. This categorization is detected and tagged by Snyk Essentials, allowing you to easily identify and manage assets based on the used technologies.
This feature helps in understanding the technological landscape of your repositories and can be useful for prioritizing security efforts and managing risks associated with different technologies.
## Assets and their attributes
Every item listed in the inventory is considered an individual asset. Most assets are actual components of the application (code repositories, domains, endpoints, and so on), but an asset can also represent a group (a certain business unit) or even a product.
Assets in the inventory are presented with key attributes in the following columns:
* **Asset** - The name of the repository asset, scanned artifact, and the Git remote URL, if available. Scanned artifacts are missing Git remote URLs.
* **Issue** - The number of open issues for the asset itself and its child assets or packages, aggregated across all relevant tools. Severity levels are classified as **C** (critical), **H** (high), **M** (medium), and **L** (low).
* **Controls** - A report detailing all products detected by Snyk Essentials on a specific repository asset, and all products that should cover it but do not.
* **Tags** - Add unique key-value tags to provide more powerful and granular context for your assets. This attribute lets you attach specific metadata to your assets, enabling precise filtering, robust policy creation, and alignment with your internal systems.
* **Labels** - Snyk Essentials automatically labels repository assets with information about the used technologies (Python, Terraform, and so on) in the repository, and repository latest updates. You can also use policies to label repository assets.
* **Developers** - Includes the SCM profile details for code committers to the repository asset.
* **Class** - Reflects the business criticality of the asset from A (most critical) to D (least critical), as defined by the user in the Policies view. You can manually change the class or automatically change it by applying a policy. You can lock the value you have manually set for a Class to prevent policies from overriding it.
* **Risk factors** - Lists the potential vulnerabilities and security threats associated with each asset and helps users identify specific risks, enabling them to prioritize and address issues more effectively.
* **Source** - Reflects the source of the asset, which can come from Snyk, an SCM, or a third-party integration.
* **SCM Integrations** - Shows how each SCM was integrated at the Group or Organization level. By understanding the source of the SCM integration, you can determine if you require a Group-level integration to unlock full asset context.
* **SCM Repository freshness** - Reflects the status of the repository and the date of the last commit.
* **Clusters** - Provides a list of all the cluster names where the image asset is deployed.
* **Organizations** - Provides a list of the Snyk Organizations that are mapped to the asset.
* **Actions** - Provides a workflow to set up an SCM integration, enriching the asset context with information such as labels, developers, and repository freshness. This use case is available when a Group-level integration is not configured.
{% hint style="info" %}
The Clusters column is hidden by default. To enable it, click Columns, select Clusters from the dropdown list, then click Apply to save the changes.
{% endhint %}
### **Asset Sources, Types, and Scanned Artifacts**
Snyk Essentials automatically derives assets from Snyk and any SCM tools that are onboarded using the Snyk Essentials Integration. SCM tools from the Snyk Essentials Integration may add additional repositories that are not scanned by Snyk and additional contexts, such as teams and code committers.
### Repository assets, scanned artifacts and packages
#### Repository assets
Snyk Essentials supports repository assets (from main branches) as an asset type. Repository assets are visible in all inventory layouts and are supported by Policies. To avoid duplication, assets are identified using a unique identifier, which is the git remote URL for repository assets.
{% hint style="info" %}
For Snyk Essentials SCM imported repositories, archived or deleted repositories are not displayed in the asset inventory or the dashboard widgets.
{% endhint %}
#### Scanned artifacts
Snyk Essentials also includes the concept of scanned artifacts. A scanned artifact is an entity detected by Snyk that cannot be identified as a repository asset because it does not include identifying information, such as a Git remote URL.
Scanned artifacts provide users with visibility into what Snyk Essentials detects from scans but require additional troubleshooting.
You can see the scanned artifacts in the Inventory Type view. The scanned artifacts are not supported by Policies. Furthermore, scanned artifacts may include duplicates, as identifying information is missing.
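The role of the Git remote URL as a unique identifier can be sketched as below: entities with a URL deduplicate into repository assets, while entities without one remain scanned artifacts. Field names are illustrative, not Snyk's data model.

```python
def index_assets(detected):
    """Split detected entities into deduplicated repository assets
    (keyed by Git remote URL) and scanned artifacts (no identifier)."""
    repositories, artifacts = {}, []
    for entity in detected:
        url = entity.get("git_remote_url")
        if url:
            repositories[url] = entity  # same URL resolves to the same asset
        else:
            artifacts.append(entity)    # no identifier, so duplicates possible
    return repositories, artifacts
```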
#### Packages
Packages are defined as software or libraries that are managed by package management systems.
Package assets are created when you scan the dependencies of a Project through package management systems or by using the Snyk CLI. This enables Snyk Essentials to identify and analyze the security vulnerabilities of the packages used within a Project, offering insights into possible risk exposures and providing recommendations for mitigation.
---
# Source: https://docs.snyk.io/manage-risk/policies/assets-policies.md
# Assets policies
## Overview
With Policies, you can easily automate the process of adding business context and receiving notifications.
{% hint style="info" %}
After a policy is created, it runs within a maximum of 3 hours, then once every 3 hours.
If your policy is set to run daily, it runs 3 hours after each 24-hour period ends. You can always run a policy manually by using the **Run** button.
{% endhint %}
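The timing in the hint above can be sketched as a next-run computation. This is an illustrative reading of the documented schedule, not Snyk's scheduler.

```python
from datetime import datetime, timedelta

def next_policy_run(created_at, last_run=None, daily=False):
    """First run within 3 hours of creation; then every 3 hours, or, for a
    daily policy, 3 hours after each 24-hour period ends."""
    if last_run is None:
        return created_at + timedelta(hours=3)
    if daily:
        return last_run + timedelta(hours=24 + 3)
    return last_run + timedelta(hours=3)
```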
Access the Snyk Essentials policies by positioning yourself at the Group level, selecting **Policies**, then **Assets**.
The following video presents an overview of the types of policies you can create from the Policies view.
{% embed url="" %}
Overview of asset policies
{% endembed %}
{% hint style="info" %}
[Manage assets](https://docs.snyk.io/manage-assets/manage-assets) and [assets policies](https://docs.snyk.io/manage-risk/policies/assets-policies) are interconnected. Before setting up any new policy, ensure you have reviewed and filtered your assets from the Inventory menu.
{% endhint %}
## Use Cases
You can create policies to organize assets, classify them, and stay up to date with the latest information about an asset.\
Common use cases for policies include:
* [New asset notifications](https://docs.snyk.io/manage-risk/policies/assets-policies/use-cases-for-policies/notification-policy)
* [Asset classification](https://docs.snyk.io/manage-risk/policies/assets-policies/use-cases-for-policies/classification-policy)
* [Asset tagging](https://docs.snyk.io/manage-risk/policies/assets-policies/use-cases-for-policies/tagging-policy)
* [Coverage control](https://docs.snyk.io/manage-risk/policies/assets-policies/use-cases-for-policies/coverage-control-policy)
### New asset notifications
Notify members of the AppSec team when new assets meeting certain criteria are discovered. For example, you may send a Slack message to the infra team if new repository assets that leverage Terraform as a technology are detected by Snyk Essentials.
When setting up a notification action (email or Slack) for a policy, you can include a link to the relevant assets. Each notification will list all the assets impacted by the policy. You can view the assets individually, or you can see a summary of all the assets by clicking the **Click Here** option in the notification. The list of assets displayed in the email notification is automatically generated.
### Asset classification
Classify repository assets according to their business criticality from A (most critical) to D (least critical), based on asset properties such as name and tags. For example, you might indicate that any repositories that contain “customer-portal” in the name should be classified as A, given that the customer-portal app holds sensitive data.
### Asset tagging
Categorize and label repository assets with [asset tags](https://docs.snyk.io/manage-assets/assets-inventory-components#tags) to filter the asset inventory.
* **GitHub custom properties** - lists the GitHub custom properties associated with your GitHub repository as tags
* **User-defined tags** are customizable, as you can define their logic through [Assets Policies](https://docs.snyk.io/manage-risk/policies/assets-policies). For example, you can set tags to label a repository that comes from a specific source, such as GitHub. Tags associated with assets are identified in the UI with the **Asset policy tags** name.
* **System tags** are automatically assigned by Snyk based on asset names or detected keywords (for example, `codeowners`).
### Coverage control
Monitor if your assets are scanned by the selected security products. You can select one or multiple security products and also specify a timeframe for when the last scan should have taken place.
---
# Source: https://docs.snyk.io/manage-risk/policies/assign-a-policy-to-an-organization.md
# Assign a policy to an Organization
When you create a policy, you can apply it to one Organization. You cannot use the Policy Manager to directly apply an Organization to the default policy or remove one from it.
{% hint style="info" %}
Policies applied to Organizations are in effect when you run the `snyk test` or `snyk monitor` CLI commands.
{% endhint %}
## Apply a policy to an Organization
To apply a policy to an Organization, in the Organization selector panel, check the box for the Organization to which you want to apply the policy.
If an Organization has another policy applied, you can see that policy from the selector, and the policy indicator next to the Organization name will be gray.
Gray indicator - Organization has another policy applied
If the Organization already has the policy applied, the name of the policy is displayed in a yellow indicator next to the Organization name.
Yellow indicator - Organization already assigned to this policy
If you are applying a different policy to an Organization, in order to move that Organization from one policy to another, two indicators appear next to the Organization name in the selector. One shows, in yellow, the policy that is currently applied. The other shows the policy you will be applying to the Organization in gray.
Gray and yellow indicators - Policies applied to the Organization and to be applied
## Remove a policy from an Organization
To remove a policy from an Organization, uncheck the box next to the Organization you want to remove from the policy that you are viewing.
Remove an Organization from a policy
The unchecked Organization will now automatically revert to the default policy.
## Apply the default policy to an Organization
Remove the policy currently applied to the Organization. The Organization will automatically revert to the default policy.
## Remove the default policy from an Organization
Apply a new policy to the Organization. The Organization will automatically be removed from the default policy and the new policy will be applied.
---
# Source: https://docs.snyk.io/manage-risk/policies/assign-policies-to-projects.md
# Assign policies to Projects
After you apply [Project attributes](https://docs.snyk.io/snyk-platform-administration/snyk-projects/project-attributes) to your Projects, you can create policies that apply to those attributes. Projects and policies are linked based on the attributes that have the policy applied.
{% hint style="info" %}
Policies applied to Project attributes always take precedence over policies applied to Organizations.
{% endhint %}
A policy can be applied to one or multiple Project attributes, but only one policy can be applied to a set of attributes. For example, if there is a policy applied to `Critical`, `Production`, and `Frontend`, you cannot create another policy that is applied *only* to these exact attributes.
{% hint style="info" %}
Policies applied to Project attributes affect the CLI command `snyk monitor`, assuming it runs on a CLI Project that has Project attributes assigned. Project attributes applied to policies do not affect `snyk test`.
{% endhint %}
## Apply a policy to Project attributes and remove a policy
To apply a policy to an attribute, in the attribute selector panel, check the box for the attribute to which you want to apply the policy.
You can also search for tags that have already been created in Projects in your Group. You can select more than one tag for the policy.
Attribute selector panel
To remove a policy from an attribute, uncheck the box next to the attribute from which you want to remove the policy.
To remove a tag, click the **x** next to the tag.
{% hint style="info" %}
You can create and save a policy where no attributes are selected, for example, if you have not yet decided the attributes to which the policy should be applied. A policy cannot apply to Projects if all attributes are left blank.
{% endhint %}
## Assign Projects to policies
To have a policy assigned, a Project must have all the attributes listed on the policy applied to the Project. The Project can also have attributes that are not listed on the policy.
{% hint style="info" %}
If a policy applies to a Project based on its attributes, then any role with the Edit Project attributes permission can edit the Project attributes.
{% endhint %}
If multiple tags are added to a policy, the Project needs to match with only one of the Project tags. However, if other attributes are also listed on the policy, the Project would need to have all the attributes and at least one of the listed tags.
For example, if you have a policy applied to `Critical`, `External`, and `Frontend`, this policy is assigned to Projects that have the same attributes, but not to a Project with the attributes `Critical` and `External` only.
An example policy follows. It is applied to an attribute in the **Business Criticality** section, `Critical`, and to attributes in the **Environment** section, `Frontend` and `External`. The policy also has two Project tags. The first tag has the key `PCI`, with the value of `Compliant`. The second tag has the key `owner`, with the value of `fred`.
The following Project has the attributes `Frontend`, `External`, and `Critical`, and has at least one matching tag, `PCI:Compliant`. Thus the Project will inherit the policy, that is, the policy is assigned to this Project.
Project inheriting a policy
The following Project will not inherit the policy, because the Project lacks the `External` environment attribute.
Project not inheriting a policy
## Assign multiple policies to a Project
Multiple policies can be assigned to a Project. For example, you may have a policy applied to the attributes `Critical` and `External` and another policy applied to the attributes `Critical` and `Production`. If you have a Project with the attributes `Critical`, `External` and `Production`, both policies are assigned.
When multiple policies are assigned to a Project, the order of the policies on the policy manager page determines precedence. The policy closest to the top of the list takes precedence over other assigned policies after it. To change the order of policies, either drag and drop the policies into the order you want or use the three dots on the right-hand side to move the policy up or down in the list.
Change policy order
---
# Source: https://docs.snyk.io/discover-snyk/snyk-learn/snyk-learn-reports/assignment-reports.md
# Assignment reports
{% hint style="info" %}
Snyk Learn assignment reporting is available only in the Learning Management add-on offering. For more information, contact your Snyk account team.
{% endhint %}
After you have created your first [Assignments](https://docs.snyk.io/discover-snyk/snyk-learn/snyk-learn-assignments) with Snyk Learn, you can use Assignment reporting to track progress at an organization level. This is useful for team managers, security champions, AppSec engineers, and compliance team members who want to follow detailed progress and extract reports for compliance purposes.
## Assignment Report
By navigating to your [assignment](https://learn.snyk.io/admin/assignments/) dashboard, you can find reports showing your organizational progress against assignments.
You can also use the filter to drill down further, and the buttons below the filter allow you to change the due date, delete assignments, and also trigger email reminders for your users.
### Changing the due date
First select the assignments whose due date you want to change, and then press the icon highlighted in the image below. You will then be asked to pick a new date. This updates the due date and also updates the users' Learning Progress dashboards.
### Sending a reminder email
Select an assignment and then press the highlighted button to trigger a reminder email. You will be offered the chance to add a custom message before the reminder is sent.
---
# Source: https://docs.snyk.io/snyk-api/reference/audit-logs.md
# Audit Logs
{% hint style="info" %}
This document uses the REST API. For more details, see the [Authentication for API](https://docs.snyk.io/snyk-api/authentication-for-api) page.
{% endhint %}
{% openapi src="" path="/orgs/{org\_id}/audit\_logs/search" method="get" %}
[rest-spec.json](https://2533899886-files.gitbook.io/~/files/v0/b/gitbook-x-prod.appspot.com/o/spaces%2F-MdwVZ6HOZriajCf5nXH%2Fuploads%2Fgit-blob-42ebe7ebbe084db5ba66cf53a50453b68b5c9ab0%2Frest-spec.json?alt=media)
{% endopenapi %}
{% openapi src="" path="/groups/{group\_id}/audit\_logs/search" method="get" %}
[rest-spec.json](https://2533899886-files.gitbook.io/~/files/v0/b/gitbook-x-prod.appspot.com/o/spaces%2F-MdwVZ6HOZriajCf5nXH%2Fuploads%2Fgit-blob-42ebe7ebbe084db5ba66cf53a50453b68b5c9ab0%2Frest-spec.json?alt=media)
{% endopenapi %}
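As an illustrative sketch of calling the Organization-level endpoint above, the following assumes a valid token in the `SNYK_TOKEN` environment variable; the Organization ID and version date are placeholder values you must replace.

```shell
# Hypothetical audit log search against the Org-level endpoint.
# ORG_ID and VERSION are placeholders; SNYK_TOKEN must hold a valid token.
ORG_ID="your-org-id"
VERSION="2024-10-15"
URL="https://api.snyk.io/rest/orgs/${ORG_ID}/audit_logs/search?version=${VERSION}"
echo "$URL"

# Only attempt the request when a token is available.
if [ -n "${SNYK_TOKEN:-}" ]; then
  curl --silent \
    --header "Authorization: token ${SNYK_TOKEN}" \
    --header "Accept: application/vnd.api+json" \
    "$URL"
fi
```

Note that the `version` query parameter is required by the REST API, and the base URL differs by region.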
---
# Source: https://docs.snyk.io/snyk-platform-administration/user-roles/custom-role-templates/auditor-role-template.md
# Auditor role template
This is a Group-level read-only role, meaning an Auditor can only view certain areas and functions in Snyk and cannot create PRs, Projects, and more.
This role can view issues, results of scans, and reports. An Auditor often verifies that there is a scan snapshot for a particular resource or Snyk Project. The Auditor may be external to the company.
## Group-level permissions
To create this role, enable the following permissions in the relevant categories:
### Group Management
| Permission | Enabled? |
| --- | --- |
| View Groups | true |
| Edit Group details | false |
| View Group settings | false |
| Edit settings | false |
| View Group notification settings | false |
| Edit Group notification settings | false |
### Organization management
| Permission | Enabled? |
| --- | --- |
| View Organizations | true |
| Edit Organizations | false |
| Remove Organizations | false |
### Snyk Essentials management
| Permission | Enabled? |
| --- | --- |
| View Snyk Essentials | true |
| Edit Snyk Essentials | false |
### Audit Log management
| Permission | Enabled? |
| --- | --- |
| View Audit Logs | true |
### Insights management
| Permission | Enabled? |
| --- | --- |
| Access Insights | true |
### Reports management
| Permission | Enabled? |
| --- | --- |
| View reports | true |
### Security and License Policies
| Permission | Enabled? |
| --- | --- |
| View Policies | true |
| Create Policies | false |
| Edit Policies | false |
| Delete Policies | false |
### User management
| Permission | Enabled? |
| --- | --- |
| View users | true |
| Invite users | false |
| Manage users | false |
| Add users | false |
| Provision users | false |
| User Leave | false |
| User Remove | false |
The remaining categories of permissions listed below should have all permissions within them set to disabled:
* IaC settings management
* Issue management
* Request access management
* Role management
* Service account management
* Snyk Apps management
* Snyk Preview management
* SSO settings management
* Tags management
## Organization-level permissions
To create this role, enable the following permissions in the relevant categories:
### Organization management
| Permission | Enabled? |
| --- | --- |
| View Organization | true |
| Edit Organization | false |
| Remove Organization | false |
### Audit Log management
| Permission | Enabled? |
| --- | --- |
| View audit logs | true |
### Collection management
| Permission | Enabled? |
| --- | --- |
| View Collections | true |
| Create Collection | false |
| Edit Collections | false |
| Delete Collections | false |
### Container Image management
| Permission | Enabled? |
| --- | --- |
| View container image | true |
| Create container image | false |
| Edit container image | false |
### Integration management
| Permission | Enabled? |
| --- | --- |
| View integrations | true |
| Edit integrations | false |
### Project management
| Permission | Enabled? |
| --- | --- |
| View Project | true |
| Add Project | false |
| Edit Project | false |
| Edit Project status | false |
| Test Project | false |
| Move Project | false |
| Remove Project | false |
| View Project history | true |
| Edit Project integrations | false |
| Edit Project attributes | false |
| View Jira issues | true |
| Create Jira issues | false |
| Edit Project Tags | false |
### Project Ignore management
| Permission | Enabled? |
| --- | --- |
| View Project Ignores | true |
| Create Project Ignores | false |
| Edit Project Ignores | false |
| Remove Project Ignores | false |
### Reports management
| Permission | Enabled? |
| --- | --- |
| View Organization reports | true |
### Snyk Cloud management
| Permission | Enabled? |
| --- | --- |
| View environments | false |
| Create environments | false |
| Delete environments | false |
| Update environments | false |
| View scans | true |
| Create scans | false |
| View resources | true |
| View artifacts | true |
| Create artifacts | false |
| View Custom Rules | false |
| Create Custom Rules | false |
| Edit Custom Rules | false |
| Delete Custom Rules | false |
### Webhook management
| Permission | Enabled? |
| --- | --- |
| View Outbound Webhooks | true |
| Create Outbound Webhooks | false |
| Remove Outbound Webhooks | false |
The remaining categories of permissions listed below should have all permissions within them set to disabled:
* Billing management
* Entitlement management
* Kubernetes Integration management
* Package management
* Project pull request management
* Service account management
* Snyk Apps management
* Snyk Preview management
* User management
---
# Source: https://docs.snyk.io/integrations/snyk-studio-agentic-integrations/quickstart-guides-for-snyk-studio/augment-code-guide.md
# Augment Code guide
Add Snyk Studio to Augment Code to secure code generated with agentic workflows through a Large Language Model (LLM). You can achieve this in several ways. When you use it for the first time, Snyk Studio asks for trust and, if necessary, triggers authentication.
## Prerequisites
* [Install the code assistant extension](https://www.augmentcode.com/)
* [Install the Snyk CLI](https://docs.snyk.io/developer-tools/snyk-cli/install-or-update-the-snyk-cli)
* [Install the Snyk MCP](#install-the-snyk-mcp-server-in-augment-code)
### Install Augment Code
Visit the [Augment Code](https://www.augmentcode.com/) website to download the correct version of the IDE plugin.
### Install the Snyk MCP Server in Augment Code
Install the Snyk MCP Server using the method that best suits your operating system and local development environment.
#### Install with Node.js and npx
* Name: Snyk
* Command: `npx -y snyk@latest mcp -t stdio`
#### Install with pre-installed Snyk CLI
* Name: Snyk
* Command: `/absolute/path/to/snyk mcp -t stdio`
If the `snyk` command is not available, add it by following the instructions on the [Installing or updating the Snyk CLI](https://docs.snyk.io/developer-tools/snyk-cli/install-or-update-the-snyk-cli) page.
## Setting up the Snyk MCP Server
The MCP Server starts automatically. The following example shows a Snyk MCP Server that was successfully configured and started.
As a one-time setup, authenticate and trust the current Project directory. If required, the agentic workflow will likely manage this automatically.
If you are already authenticated with the Snyk CLI installed on your system path, or through a prior interaction, then proceed to work with the Snyk MCP Server.
## Examples
### Scanning for security vulnerabilities
In the chat, you can tag specific files or functions and ask Augment Code to scan your code for safe deployment to production.
```
Scan this directory for code security & dependency vulnerabilities and security issues
```
Augment Code indicates that this request is related to security vulnerability scanning and calls the Snyk MCP Server for various scans.
### "Secure at inception" rules
To practice [Secure at Inception](https://docs.snyk.io/discover-snyk/getting-started/glossary#secure-at-inception), Snyk recommends adopting rules that align the LLM for secure code generation and workflows.
The following is a suggested instruction for Augment Code user guidelines:
```
- Always run Snyk Code scanning tool for new first party code generated.
- Always run Snyk SCA scanning tool for new dependencies or dependency updates.
- If any security issues are found based on newly introduced or modified code or dependencies, attempt to fix the issues using the results context from Snyk.
- Rescan the code after fixing the issues to ensure that the issues were fixed and that there are no newly introduced issues.
- Repeat this process until no issues are found.
```
---
# Source: https://docs.snyk.io/developer-tools/snyk-cli/commands/auth.md
# Auth
## Usage
`snyk auth [<API_TOKEN>] [<OPTIONS>]`
## Description
The `snyk auth` command authenticates your machine to associate the Snyk CLI with your Snyk account.
Running `$ snyk auth` opens a browser window with prompts to log in to your Snyk account and authenticate. No repository permissions are needed at this stage, only your email address.
When you have authenticated, you can start using the CLI; see [Getting started with the CLI](https://docs.snyk.io/snyk-cli/getting-started-with-the-cli)
**Note:** Beginning with version 1.1293, the Snyk CLI uses OAuth when authenticating through the browser.
OAuth provides improved security by issuing shorter-lived expiring authorizations with the convenience of automatic refresh.
Earlier versions of the Snyk CLI (< 1.1293) obtained a non-expiring API token through a legacy browser interaction.
The Snyk API token can still be used as a fallback option. You must explicitly add an option to enable it as follows: `snyk auth --auth-type=token`.
## Options
### `--auth-type=`
Specify the type of authentication to use. Supported types are `oauth` (the default beginning with version 1.1293.0) and `token`.
### `--client-secret=` and `--client-id=`
Set the client secret and the client ID to use the OAuth2 Client Credentials Grant.
Both values must be provided together. They are valid only with `--auth-type=oauth`; otherwise, they are ignored.
For information about how to get the client secret and the client ID, see [Service accounts using OAuth 2.0](https://docs.snyk.io/enterprise-setup/service-accounts/service-accounts-using-oauth-2.0#oauth-2.0-with-client-secret)
## Token value
In some environments and configurations, you must use the API token; see [Authenticate to use the CLI](https://docs.snyk.io/snyk-cli/authenticate-to-use-the-cli)
The value may be a user token or a service account token; see [Service accounts](https://docs.snyk.io/enterprise-setup/service-accounts)
In a CI/CD environment, use the `SNYK_TOKEN` environment variable; see [Configure the Snyk CLI](https://docs.snyk.io/snyk-cli/configure-the-snyk-cli)
After setting this environment variable, you can use CLI commands.
## Debug
Use the `-d` option to output the debug logs.
---
# Source: https://docs.snyk.io/scan-with-snyk/snyk-container/kubernetes-integration/install-the-snyk-controller/authenticate-to-private-container-registries.md
# Authenticate to private container registries
If you are using private container registries, you must create a `dockercfg.json` file that contains the credentials to the registry. Then you must create a secret, which must be called `snyk-monitor`.
The `dockercfg.json` file is necessary to allow the monitor to look up images in private registries. Usually, your credentials are in `$HOME/.docker/config.json`. However, the credentials must also be added to the `dockercfg.json` file. The Snyk Controller cannot access these registries if the credentials are stored only in `$HOME/.docker/config.json`.
The steps below explain how to authenticate to private container registries.
## Configure the dockercfg.json file
Create a file named `dockercfg.json`. Store your credentials in this file.
{% hint style="info" %}
Ensure the file containing your credentials is named `dockercfg.json`. This filename is required by the `snyk-monitor`.
{% endhint %}
{% hint style="info" %}
Ensure the formatting is correct, including new line characters and whitespace in the `dockercfg.json` file. Malformed files will result in authentication failures.
{% endhint %}
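One way to catch a malformed file early is to run it through a JSON parser before creating the secret. The sketch below writes a sample file to `/tmp` for illustration; point the check at your real `dockercfg.json` instead.

```shell
# Write a sample dockercfg.json (illustrative registry name and auth value).
cat > /tmp/dockercfg.json <<'EOF'
{
  "auths": {
    "registry.example.com": {
      "auth": "BASE64-ENCODED-AUTH-DETAILS"
    }
  }
}
EOF

# Fail fast if the file is not valid JSON.
python3 -m json.tool /tmp/dockercfg.json > /dev/null && VALID=yes || VALID=no
echo "$VALID"
```

A parser check like this catches missing commas and braces, but not whitespace or newline problems inside the base64 `auth` values themselves.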
The locations where your cluster runs and where your registries run determine the combination of entries in your `dockercfg.json` file. The file can contain credentials for multiple registries.
If your credentials are already in `$HOME/.docker/config.json`, copy this information to the `dockercfg.json` file.
If the `auth` entry is empty in `$HOME/.docker/config.json`, run the following command and paste the output into the `auth` entry in `dockercfg.json`:
```
echo -n 'username:password' | base64
```
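For example, encoding the illustrative credentials `myuser:mypassword` (stand-ins for your real registry username and password) and dropping the result into an `auths` entry looks like this:

```shell
# "myuser:mypassword" is a stand-in for your real registry credentials.
# printf avoids the trailing newline that a plain echo would add.
AUTH=$(printf '%s' 'myuser:mypassword' | base64)
printf '{"auths":{"registry.example.com":{"auth":"%s"}}}\n' "$AUTH"
```

Using `printf '%s'` (or `echo -n`, as above) matters: encoding a stray trailing newline into the `auth` value is a common cause of authentication failures.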
### Examples of dockercfg.json file configuration
#### For private registries other than Nexus
If your cluster does not run on `GKE`, or it runs on `GKE` and pulls images from other private registries, your `dockercfg.json` file must contain:
```json
{
  "auths": {
    "gcr.io": {
      "auth": "BASE64-ENCODED-AUTH-DETAILS"
    },
    // Add other registries as necessary, for example:
    ".azurecr.io": {
      "auth": "BASE64-ENCODED-AUTH-DETAILS"
    }
  }
}
```
#### For Nexus Repository
If you are using Nexus Repository, your `dockercfg.json` file must contain:
```json
{
  "auths": {
    "": {
      "auth": "BASE64-ENCODED-AUTH-DETAILS"
    }
  }
}
```
#### For Artifactory Container Registry
If you are using Artifactory Container Registry to host multiple private repositories, your `dockercfg.json` file must contain:
```json
{
  "auths": {
    "": {
      "auth": "BASE64-ENCODED-AUTH-DETAILS"
    },
    "": {
      "auth": "BASE64-ENCODED-AUTH-DETAILS"
    }
  }
}
```
#### For GKE using GCR
If your cluster runs on `GKE` and you are using `GCR`, your `dockercfg.json` file must contain:
```json
{
  "credHelpers": {
    "us.gcr.io": "gcloud",
    "asia.gcr.io": "gcloud",
    "marketplace.gcr.io": "gcloud",
    "gcr.io": "gcloud",
    "eu.gcr.io": "gcloud",
    "staging-k8s.gcr.io": "gcloud"
  }
}
```
#### For GKE using Google Artifact Registry (GAR)
If your cluster runs on `GKE` and you are using `GAR`, your `dockercfg.json` file must contain:
```json
{
  "auths": {
    "northamerica-northeast2-docker.pkg.dev": {
      "auth": "BASE64-ENCODED-AUTH-DETAILS"
    }
  }
}
```
This method relies on creating a service account. See [Google Cloud service account key](https://cloud.google.com/artifact-registry/docs/docker/authentication#json-key). Ensure you have the raw key saved to a file.
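Per Google's JSON-key authentication scheme (linked above), the `auth` value is the base64 encoding of the literal username `_json_key`, a colon, and the raw contents of the key file. A minimal sketch, using an inline stand-in for the key file contents:

```shell
# KEY_JSON stands in for "$(cat /path/to/key.json)" -- your real key file.
KEY_JSON='{"type":"service_account"}'
AUTH=$(printf '_json_key:%s' "$KEY_JSON" | base64 | tr -d '\n')
echo "$AUTH"
```

The `tr -d '\n'` strips the line wrapping that GNU `base64` inserts for long inputs, which would otherwise corrupt the `auth` field.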
#### For EKS using ECR
If your cluster runs on `EKS` and you are using `ECR`, add the following:
```json
{
  "credsStore": "ecr-login"
}
```
To use this credential helper for a specific `ECR` registry, create a `credHelpers` section with the URI of your ECR registry:
```json
{
  "credHelpers": {
    "public.ecr.aws": "ecr-login",
    ".dkr.ecr..amazonaws.com": "ecr-login"
  }
}
```
#### For AKS using ACR
If your cluster runs on `AKS` and you are using `ACR`, add the following:
```json
{
  "credHelpers": {
    "myregistry.azurecr.io": "acr-env"
  }
}
```
{% hint style="info" %}
In addition, for clusters running on AKS and using ACR, see [Entra ID Workload Identity service account](https://azure.github.io/azure-workload-identity/docs/topics/service-account-labels-and-annotations.html#service-account). You may need to configure labels and annotations on the `snyk-monitor` ServiceAccount.
{% endhint %}
You can configure different credential helpers for different registries.
## Create the Kubernetes secret
Create the secret in Kubernetes by running the following command:
{% code overflow="wrap" %}
```sh
kubectl create secret generic snyk-monitor -n snyk-monitor \
--from-file=./dockercfg.json \
--from-literal=integrationId=abcd1234-abcd-1234-abcd-1234abcd1234 \
--from-literal=serviceAccountApiToken=aabb1212-abab-1212-dcba-4321abcd4321
```
{% endcode %}
---
# Source: https://docs.snyk.io/developer-tools/snyk-cli/authenticate-to-use-the-cli.md
# Authenticate to use the CLI
To scan your Projects, you must authenticate with Snyk.
Snyk supports the following protocols for authentication:
* OAuth 2.0 (Recommended)
* Personal Access Token
* Snyk API token (Legacy)
{% hint style="info" %}
If you are not in the system default environment, `SNYK-US-01`, use the [`snyk config environment`](https://docs.snyk.io/developer-tools/snyk-cli/commands/config-environment) command to set your environment before you run [`snyk auth`](https://docs.snyk.io/developer-tools/snyk-cli/commands/auth).
{% endhint %}
## How to authenticate to use the CLI locally
### Steps to authenticate using OAuth 2.0 protocol
When you are using the CLI locally, **Snyk recommends that you use the OAuth 2.0 protocol.** Follow these steps:
1. Run the `snyk auth` CLI command.
2. Log in if you are prompted to do so.
3. The next page asks for your authorization for the CLI to act on your behalf. Click **Grant app access**.
4. When you have authenticated successfully, a confirmation message appears. Close the browser window and return to the CLI in the terminal.
After authentication is granted, a pair of access and refresh tokens are stored locally for future use.
Multi-tenant users who do not belong to the `SNYK-US-01` region ( `https://api.snyk.io`) will be automatically redirected to the correct domain for the email with which the user authenticated. This redirect will not happen if users are expected to use a custom URL, such as in single-tenant company configurations.
If you have problems, see [OAuth 2.0 authentication does not work](https://docs.snyk.io/developer-tools/snyk-ide-plugins-and-extensions/troubleshooting-ides/how-to-set-environment-variables-by-operating-system-os-for-ides-and-cli-1).
{% hint style="info" %}
OAuth 2.0 tokens are not static. You cannot copy these tokens from your Snyk account page.
{% endhint %}
### Steps to authenticate using Personal Access Tokens
{% hint style="warning" %}
This method is inferior to the OAuth 2.0 method.
{% endhint %}
{% hint style="warning" %}
Personal Access Token (PAT) authentication is being progressively rolled out to all Enterprise customers. To check whether this feature is available for your Organization at this time, reach out to your Snyk account team.
{% endhint %}
When using this feature, ensure you generate and use a Personal Access Token (PAT). This feature is not compatible with Service Account tokens, and using them may result in unexpected behavior or errors.
{% hint style="info" %}
Whenever you use this feature in your IDE, ensure that you also retrieve the PAT details from the Snyk Web UI. Contact Snyk Support to enable the PAT feature within your Snyk Web UI Organization.
{% endhint %}
Follow these steps to authenticate using your Snyk Personal Access Token:
1. Create your **Personal Access** **Token**. For details, see the [Authentication for API](https://docs.snyk.io/snyk-api/authentication-for-api) page.
2. Run the `snyk auth <PERSONAL_ACCESS_TOKEN>` CLI command, supplying your Personal Access Token as a command argument.
3. After you successfully authenticate, the PAT is stored locally for future use.
All subsequent commands requiring Snyk authorization will use the configured PAT.
### Steps to retrieve the Snyk API token and use it to authenticate
{% hint style="warning" %}
This method is inferior to the OAuth 2.0 method.
{% endhint %}
Follow these steps to authenticate using your Snyk API token:
1. Run the `snyk auth --auth-type=token` CLI command.
2. Log in, if required.
3. The next page prompts you to authenticate your machine to associate the Snyk CLI or the IDE plugin with your account. Click **Authenticate**.
4. After you successfully authenticate, a confirmation message appears. Close the browser window and return to the CLI in the terminal.
After you complete the dialog, the API token is stored locally for future use.
All subsequent `test` commands will be authenticated automatically.
### Steps to authenticate using a known Snyk API token
You can copy your personal API token from your **General Account settings** (under your username) in the Snyk Web UI, and then configure your CLI to use it locally.
All CLI `test` commands can automatically recognize the environment variable `SNYK_TOKEN` and use it for authentication. For details, see [Environment variables for Snyk CLI](https://docs.snyk.io/developer-tools/snyk-cli/configure-the-snyk-cli/environment-variables-for-snyk-cli).
To use API token-based authentication, set the `SNYK_TOKEN` environment variable and run the `test` command, for example:\
`SNYK_TOKEN=<API_TOKEN> snyk test`
Alternatively, you can export the environment variable to make it available for future `test` commands:\
`export SNYK_TOKEN=<API_TOKEN>`\
`snyk test`
This form of authentication is particularly useful for CI/CD pipelines. See [How to authenticate to use the CLI in CI/CD pipelines](#how-to-authenticate-to-use-the-cli-in-ci-cd-pipelines).
You can also store the Snyk API token locally for later use by running the following CLI command:\
`snyk auth <API_TOKEN>`
All subsequent test calls will be authenticated automatically. For more information, see the [Auth command help](https://docs.snyk.io/developer-tools/snyk-cli/commands/auth).
## How to authenticate to use the CLI in CI/CD pipelines
Free and Team plan users are more likely to use this method in a CI/CD pipeline than to use OAuth 2.0. Enterprise plan customers are advised to use a [**service account**](https://docs.snyk.io/implementation-and-setup/enterprise-setup/service-accounts) in a CI/CD pipeline. For details about when to use a PAT and when to use a service account token, see [Authentication for API](https://docs.snyk.io/snyk-api/authentication-for-api).
All CLI `test` commands can automatically recognize the environment variable `SNYK_TOKEN` and use it for authentication. For details, see [Environment variables for Snyk CLI](https://docs.snyk.io/developer-tools/snyk-cli/configure-the-snyk-cli/environment-variables-for-snyk-cli).
To use PAT-based authentication, set the `SNYK_TOKEN` environment variable and run the `test` command, for example:\
`SNYK_TOKEN=<PAT> snyk test`
Alternatively, you can export the environment variable to make it available for future `test` commands:\
`export SNYK_TOKEN=<PAT>`\
`snyk test`
You can also store the Snyk PAT locally for later use by running the following CLI command:\
`snyk auth <PAT>`
All subsequent test calls will be authenticated automatically. For more information, see the [Auth command help](https://docs.snyk.io/developer-tools/snyk-cli/commands/auth).
---
# Source: https://docs.snyk.io/snyk-api/authentication-for-api.md
# Authentication for API
{% hint style="info" %}
**Feature Availability**
To use the Snyk API, you must be an Enterprise plan customer and have a token from Snyk.
{% endhint %}
{% hint style="warning" %}
Use the URL for your region when calling an API. See [API URLs](https://docs.snyk.io/rest-api/about-the-rest-api#api-urls).
{% endhint %}
Enterprise users have [access to a personal token under their profile](#how-to-obtain-your-personal-token) and to service account tokens. The personal API token is associated with your Snyk Account and not with a specific Organization. Service accounts are associated with an Organization or a Group. For more information, see [Service accounts](https://docs.snyk.io/implementation-and-setup/enterprise-setup/service-accounts).
* **Enterprise users should use a service account** to authenticate for any kind of automation. This includes, but is not limited to, CI/CD scanning with the CLI or build system plugins and any automation, including automation with the API.
* **Enterprise users should use the personal token** under their user profile for:
* Running the CLI locally on their machine; for details, see [Authenticate to use the CLI](https://docs.snyk.io/developer-tools/snyk-cli/authenticate-to-use-the-cli).
* Authenticating with the IDE manually
* Running API calls one time, for example, to test something
Note that for Free and Team plan users, the personal token does not have access to the API and may be used only for authenticating to IDE, CLI, and CI/CD integrations. For details, see [Obtain and use your API token](https://docs.snyk.io/discover-snyk/getting-started#obtain-and-use-your-snyk-api-token).
For additional information, see [Snyk API token permissions users can control](https://docs.snyk.io/snyk-api/authentication-for-api/snyk-api-token-permissions-users-can-control).
## How to obtain your personal token
You can find your personal API token in your [personal account settings](https://app.snyk.io/account) after you register with Snyk and log in. In the **key** field, click **Click to show**, then highlight and copy the API token.
If you want a new API token, select **Revoke & Regenerate**. This invalidates the previous API token. For details, see [Revoke and regenerate a Snyk API token](https://docs.snyk.io/snyk-api/authentication-for-api/revoke-and-regenerate-a-snyk-api-token).
## Authenticating with a personal API token
When using the API directly, provide the API token in an `Authorization: token` header, as in the following example request, replacing `API_TOKEN` with your token:
```bash
curl --request GET \
  --url "https://api.snyk.io/rest/self?version=2024-06-10" \
  --header "Content-Type: application/vnd.api+json" \
  --header "Authorization: token API_TOKEN"
```
## Authenticating for Snyk Apps
If you are using the [Snyk Apps APIs](https://docs.snyk.io/snyk-api/using-specific-snyk-apis/snyk-apps-apis), provide the `access_token` in an `Authorization: bearer` header, as in the following example, replacing `ACCESS_TOKEN` with your token:
```bash
curl --request GET \
  --url "https://api.snyk.io/rest/self?version=2024-06-10" \
  --header "Content-Type: application/vnd.api+json" \
  --header "Authorization: bearer ACCESS_TOKEN"
```
Otherwise, a `401 Unauthorized` error will be returned.
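The difference between the two schemes can be captured in a small helper. This is an illustrative sketch only; `auth_header` and its mode names are hypothetical and not part of any Snyk tooling.

```bash
# Illustrative helper: pick the Authorization scheme for a Snyk API call.
# API tokens use the "token" scheme; Snyk Apps access tokens use "bearer".
# Mixing them up is a common cause of 401 Unauthorized responses.
auth_header() {
  case "$1" in
    api-token) printf 'Authorization: token %s' "$2" ;;
    snyk-app)  printf 'Authorization: bearer %s' "$2" ;;
    *)         return 1 ;;
  esac
}

# Usage with curl, reusing the endpoint from the examples above:
#   curl -H "$(auth_header api-token "$SNYK_TOKEN")" \
#        -H "Content-Type: application/vnd.api+json" \
#        "https://api.snyk.io/rest/self?version=2024-06-10"
```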
---
# Source: https://docs.snyk.io/developer-tools/snyk-ide-plugins-and-extensions/eclipse-plugin/authentication-for-the-eclipse-plugin.md
# Authentication for the Eclipse plugin
To scan your Projects, you must authenticate with Snyk.
Snyk supports the following protocols for authentication:
* OAuth 2.0 (Recommended)
* Personal Access Token
* Snyk API token (Legacy)
Authentication methods available in the Snyk plugin in Eclipse
## Steps to authenticate using the OAuth 2.0 protocol
After the plugin is installed, follow these steps to authenticate:
1. In the dialog that opens, set the Snyk API endpoint for a custom multi-tenant or single-tenant setup. For details, see [IDE URLs](https://docs.snyk.io/snyk-data-and-governance/regional-hosting-and-data-residency#ides-urls).\
\
Multi-tenant users who do not belong to the `SNYK-US-01` region (`https://api.snyk.io`) will be automatically redirected to the correct domain for the email with which the user authenticated. This redirect will not happen if users are expected to use a custom URL, such as in single-tenant company configurations.\
\
When you are finished with the settings on this page, click **Next**.
Snyk endpoint configuration
2. On the next page, follow the prompts, then click **Finish**.
Additional information and finish
3. A new browser page opens, requiring you to log in to your Snyk account.
4. In the next prompt, the Snyk IDE plugin requests access to act on your behalf. Click **Grant app access**.
5. After you have successfully authenticated, a confirmation message appears. Close the browser window and return to the IDE.
The analysis starts automatically. The IDE reads and saves the authentication tokens on your local machine.
{% hint style="info" %}
OAuth 2.0 tokens are not static and cannot be copied from the Snyk account page.
{% endhint %}
If you have problems, see [OAuth 2.0 authentication does not work](https://docs.snyk.io/developer-tools/snyk-ide-plugins-and-extensions/troubleshooting-ides/how-to-set-environment-variables-by-operating-system-os-for-ides-and-cli-1).
## Steps to authenticate using your Personal Access Token
{% hint style="warning" %}
This method is inferior to the OAuth method.
{% endhint %}
{% hint style="warning" %}
Personal Access Token (PAT) authentication is being progressively rolled out to all Enterprise customers. To check whether this feature is available for your Organization, reach out to your Snyk account team.
{% endhint %}
To authenticate using the Personal Access Token, follow these steps:
1. Navigate to **Eclipse** > **Settings** > **Snyk**.\
(On Windows/Linux navigate to **Window** > **Preferences** > **Snyk**)
2. Set the **Authentication Method** to **Use Personal Access Token**.
3. Click the **Connect IDE to Snyk** button.
4. Create your **Personal Access** **Token**. For details, see the [Authentication for API](https://docs.snyk.io/snyk-api/authentication-for-api) page.
5. Add the token in the **Token** field.
6. Click **Apply and Close.**
The analysis starts automatically.
## Steps to authenticate using your Snyk API token
{% hint style="warning" %}
This method is inferior to the OAuth method.
{% endhint %}
To authenticate using the API token, follow these steps:
1. Navigate to **Eclipse** > **Settings** > **Snyk**.\
(On Windows/Linux navigate to **Window** > **Preferences** > **Snyk**)
2. Set the **Authentication Method** to **API token**.
3. Click the **Connect IDE to Snyk** button.
4. Click **Authenticate** in the web browser window that opens.
5. The API token is automatically updated in the **API Token field**.
6. Click **Apply and Close.**
The analysis starts automatically.
{% hint style="info" %}
Alternatively, copy the personal API token from your Snyk Web UI instance (default is [https://app.snyk.io](https://app.snyk.io/)). Paste the token in the **API Token** field. For details, see [Obtain and use your Snyk API token](https://docs.snyk.io/discover-snyk/getting-started#obtain-and-use-your-snyk-api-token).
{% endhint %}
## How to switch accounts
To re-authenticate with a different account, follow these steps:
1. Navigate to **Preferences** > **Snyk**.
2. Clear the value in the **Token** field.
3. Click **Apply and Close**.
4. When you have logged out, start authentication again from the beginning.
---
# Source: https://docs.snyk.io/developer-tools/snyk-ide-plugins-and-extensions/jetbrains-plugin/authentication-for-the-jetbrains-plugins.md
# Authentication for the JetBrains plugin
To scan your Projects, you must authenticate with Snyk.
Snyk supports the following protocols for authentication:
* OAuth 2.0 (Recommended)
* Personal Access Token
* Snyk API token (Legacy)
{% hint style="warning" %}
Before authenticating, ensure your region is properly set. For more details, see [IDEs URLs](https://docs.snyk.io/snyk-data-and-governance/regional-hosting-and-data-residency#ides-urls).
{% endhint %}
Authentication methods available in the Snyk plugin in Jetbrains IDEs
## Steps to authenticate using the OAuth 2.0 protocol
Follow these steps to authenticate:
1. After the extension is installed, click the Snyk icon in the navigation bar, then click **Trust project and scan**.
Snyk icon and connect and trust
2. A new browser window opens, requiring you to log in to your Snyk account.
3. In the next prompt, the Snyk IDE plugin requests access to act on your behalf. Click **Grant app access**.
4. When you have authenticated successfully, a confirmation message appears. Close the browser window and return to the IDE.
The analysis starts automatically. The IDE reads and saves the authentication tokens on your local machine.
{% hint style="info" %}
OAuth 2.0 tokens are not static and cannot be copied from the Snyk account page.
{% endhint %}
If you have problems, see [OAuth 2.0 authentication does not work](https://docs.snyk.io/developer-tools/snyk-ide-plugins-and-extensions/troubleshooting-ides/how-to-set-environment-variables-by-operating-system-os-for-ides-and-cli-1).
## Steps to authenticate using your Personal Access Token
{% hint style="warning" %}
This method is inferior to the OAuth method.
{% endhint %}
{% hint style="warning" %}
Personal Access Token (PAT) authentication is being progressively rolled out to all Enterprise customers. To check whether this feature is available for your Organization, reach out to your Snyk account team.
{% endhint %}
To authenticate using the Personal Access token, follow these steps:
1. Navigate to **Settings** > **Tools** > **Snyk**.
2. Set the **Authentication Method** to **Use Personal Access Token**.
3. Click the **Connect IDE to Snyk** button.
4. Create your **Personal Access** **Token**. For details, see the [Authentication for API](https://docs.snyk.io/snyk-api/authentication-for-api) page.
5. Add the token in the **Token** field.
6. Click **Apply and Close.**
## Steps to authenticate using your Snyk API token
{% hint style="warning" %}
This method is inferior to the OAuth method.
{% endhint %}
To authenticate, follow these steps:
1. In the JetBrains plugin, navigate to **Settings** > **Tools** > **Snyk**.
2. Set the **Authentication Method** to **API token**.
3. Click the **Connect IDE to Snyk** button.
4. Click **Authenticate** in the web browser window that opens.
5. The API token is automatically updated in the **API Token field**.
6. Click **Apply** or **OK.**
The analysis starts automatically.
{% hint style="info" %}
Alternatively, copy the personal API token from your Snyk Web UI instance (default is [https://app.snyk.io](https://app.snyk.io/)). Paste the token in the **API Token** field. For details, see [Obtain and use your Snyk API token](https://docs.snyk.io/discover-snyk/getting-started#obtain-and-use-your-snyk-api-token).
{% endhint %}
## How to switch accounts
To re-authenticate with a different account, follow these steps:
1. In the JetBrains plugin, navigate to **Settings** > **Tools** > **Snyk**.
2. Clear the value of the **Token** field.
3. Click **Apply** or **OK**.
4. When you have logged out, start authentication again from the beginning.
## Requirements for Linux and Unix
When authenticating with Snyk, users have the option to copy the authentication URL to their clipboard.
For Linux and Unix users, this requires the `xclip` or `xsel` utility to be installed.
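A quick way to check for these utilities before authenticating is a `command -v` probe. This is an illustrative sketch; `have_clipboard_util` is a hypothetical name, not part of the Snyk plugin.

```bash
# Return success if a clipboard utility the plugin can use is installed.
have_clipboard_util() {
  command -v xclip >/dev/null 2>&1 || command -v xsel >/dev/null 2>&1
}

# Example: warn before starting authentication on Linux/Unix.
#   have_clipboard_util || echo "Install xclip or xsel to enable URL copying"
```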
---
# Source: https://docs.snyk.io/implementation-and-setup/enterprise-setup/authentication-for-third-party-tools.md
# Authentication for third-party tools
When you work with Snyk from within any third-party tool, Snyk requires authentication in order to initiate its processes.
Snyk offers API tokens to enable integrations with third-party developer tools. You can authenticate through your personal account using your personal token or through a [service account](https://docs.snyk.io/implementation-and-setup/enterprise-setup/service-accounts) using the token associated with that account. When you authenticate through a service account, you do not use any personal token.
{% hint style="info" %}
For authentication purposes, the third-party identity providers do not require access to your repositories, only your email address.
{% endhint %}
## Supported identity providers
You can use one of the following identity providers for authentication with Snyk:
* GitHub
* Bitbucket
* Google
* Entra ID (formerly Azure AD)
* Docker ID
* Single Sign-On (SSO): available with Enterprise plans.\
See [Setting up Single Sign-On (SSO) for authentication](https://docs.snyk.io/implementation-and-setup/enterprise-setup/single-sign-on-sso-for-authentication-to-snyk).
For additional instructions, see the integrations pages for [Git repositories (SCMs)](https://docs.snyk.io/developer-tools/scm-integrations/organization-level-integrations).
{% hint style="info" %}
Logging in with a different provider from the one you registered with when you first created your Snyk account will create a separate new Snyk account.
{% endhint %}
## How to authenticate for a third-party tool using your personal token
1. Visit [your Snyk account](https://app.snyk.io/account).
2. Navigate to **General Account Settings**.
3. In the **key** field, click **Click to show**, then select and copy your API token.
4. In the third-party interface, configure your integration by pasting your Snyk token when prompted.
API token screen; revoke; regenerate; click to show
---
# Source: https://docs.snyk.io/developer-tools/snyk-ide-plugins-and-extensions/visual-studio-code-extension/authentication-for-visual-studio-code-extension.md
# Authentication for Visual Studio Code extension
To scan your Projects, you must authenticate with Snyk.
Snyk supports the following protocols for authentication:
* OAuth 2.0 (Recommended)
* Personal Access Token
* Snyk API token (Legacy)
For all authentication methods, Snyk uses the [Secret Storage API](https://code.visualstudio.com/api/references/vscode-api#SecretStorage) to store the token securely. This storage uses the keychain of the system to manage the token.
{% hint style="warning" %}
Before authenticating, ensure you have set your region properly. For details, see [IDEs URLs](https://docs.snyk.io/snyk-data-and-governance/regional-hosting-and-data-residency#ides-urls).
{% endhint %}
## Steps to authenticate using the OAuth 2.0 protocol
Follow these steps to authenticate:
1. After the extension is installed, click the **Snyk Icon** in the navigation bar, then click **Connect & Trust Workspace**:
Connect and trust workspace
2. A new browser window opens, requiring you to log in to your Snyk account.
3. In the next prompt, the Snyk IDE extension requests access to act on your behalf. Click **Grant app access**.
4. When you have authenticated successfully, a confirmation message appears. Close the browser window and return to the IDE.
5. The IDE reads and saves the authentication on your local machine.
The analysis starts automatically.
{% hint style="info" %}
OAuth 2.0 tokens are not static and cannot be copied from the Snyk account page.
{% endhint %}
If you have problems, see [OAuth 2.0 authentication does not work](https://docs.snyk.io/developer-tools/snyk-ide-plugins-and-extensions/troubleshooting-ides/how-to-set-environment-variables-by-operating-system-os-for-ides-and-cli-1).
## Steps to authenticate using your Personal Access Token
{% hint style="warning" %}
This method is inferior to the OAuth method.
{% endhint %}
{% hint style="warning" %}
Personal Access Token (PAT) authentication is being progressively rolled out to all Enterprise customers. To check whether this feature is available for your Organization, reach out to your Snyk account team.
{% endhint %}
When using this feature, ensure you generate and use a Personal Access Token (PAT). This feature is not compatible with Service Account tokens, and using them may result in unexpected behavior or errors.
{% hint style="info" %}
Whenever you use this feature in your IDE, also retrieve the PAT details from the Snyk Web UI. Contact Snyk Support to enable the PAT feature for your Snyk Web UI Organization.
{% endhint %}
To authenticate using the Personal Access token, follow these steps:
1. Click the **Snyk Icon** in the navigation bar, then click the **Settings** icon, find **Authentication Method**, and change it to **Personal Access Token**.
2. Create your **Personal Access** **Token**. For details, see the [Authentication for API](https://docs.snyk.io/snyk-api/authentication-for-api) page.
3. Run the `Snyk: Set Token` command and paste the token in the text field.
## Steps to authenticate using your Snyk API token
{% hint style="warning" %}
This method is inferior to the OAuth method.
{% endhint %}
Follow these steps to authenticate:
1. After the extension is installed, click the **Snyk Icon** in the navigation bar, then click the **Settings** icon, find **Authentication Method**, and change it to **Token authentication**:
Change authentication method
2. Press **Connect & Trust Workspace**.
3. Click **Authenticate** in the web browser window that opens.
The analysis starts automatically.
{% hint style="info" %}
Alternatively, run the `Snyk: Set Token` command and paste the token in the text field.
{% endhint %}
Set token manually
## How to switch accounts
To re-authenticate with a different account, follow these steps:
1. Run the provided `Snyk: Log Out` command.
Snyk: Log out
2. When you have logged out, start authentication again from the beginning.
## Requirements for Linux and Unix
When authenticating with Snyk, users have the option to copy the authentication URL to their clipboard.
For Linux and Unix users, this requires the `xclip` or `xsel` utility to be installed.
---
# Source: https://docs.snyk.io/developer-tools/snyk-ide-plugins-and-extensions/visual-studio-extension/authentication-for-visual-studio-extension.md
# Authentication for Visual Studio extension
To scan your Projects, you must authenticate with Snyk.
Snyk supports the following protocols for authentication:
* OAuth 2.0 (Recommended)
* Personal Access Token
* API token (Legacy)
Authentication methods available in the Snyk extension in Visual Studio
## Steps to authenticate using the OAuth 2.0 protocol
Follow these steps to authenticate:
1. After the extension is installed, navigate to **Extensions** > **Snyk** > **Windows**, and then **Snyk** to open the Snyk panel.
Snyk extension navigation
2. On the welcome screen, click **Trust project and scan.**
Trust project and scan
3. A new browser window opens, requiring you to log in to your Snyk account.
4. In the next prompt, the Snyk IDE extension requests access to act on your behalf. Click **Grant app access**.
5. When you have authenticated successfully, a confirmation message appears. Close the browser window and return to the IDE.
The analysis starts automatically. The IDE reads and saves the authentication on your local machine.
{% hint style="info" %}
OAuth 2.0 tokens are not static and cannot be copied from the Snyk account page.
{% endhint %}
If you have problems, see [OAuth 2.0 authentication does not work](https://docs.snyk.io/developer-tools/snyk-ide-plugins-and-extensions/troubleshooting-ides/how-to-set-environment-variables-by-operating-system-os-for-ides-and-cli-1).
## Steps to authenticate using your Personal Access Token
{% hint style="warning" %}
This method is inferior to the OAuth method.
{% endhint %}
{% hint style="warning" %}
Personal Access Token (PAT) authentication is being progressively rolled out to all Enterprise customers. To check whether this feature is available for your Organization, reach out to your Snyk account team.
{% endhint %}
To authenticate using the Personal Access token, follow these steps:
1. Navigate to **Preferences** > **Snyk**.
2. Set the flag to **Use Personal Access Token.**
3. Click the **Connect IDE to Snyk** button.
4. Create your **Personal Access** **Token**. For details, see the [Authentication for API](https://docs.snyk.io/snyk-api/authentication-for-api) page.
5. Paste or enter the token in the **Token** field.
6. Click **Apply and Close.**
## Steps to authenticate using your Snyk API token
{% hint style="warning" %}
This method is inferior to the OAuth method.
{% endhint %}
Follow these steps to authenticate:
1. After the extension is installed, navigate to **Extensions > Snyk** > **Settings**:
Snyk Settings navigation
2. Find the **Authentication Method** and change it to **API Token** authentication.
3. Click the **Connect IDE to Snyk** button.
4. Click **Authenticate** in the web browser window that opens.
5. The API token is automatically updated in the **API Token field**.
6. Click **Apply and Close.**
The analysis starts automatically.
{% hint style="info" %}
Alternatively, copy the personal API token from your Snyk Web UI instance (default is [https://app.snyk.io](https://app.snyk.io/)). Paste the token in the **API Token** field. For details, see [Obtain and use your Snyk API token](https://docs.snyk.io/discover-snyk/getting-started#obtain-and-use-your-snyk-api-token).
{% endhint %}
## How to switch accounts
To re-authenticate with a different account, follow these steps:
1. Navigate to **Extensions > Snyk > Settings.**
2. Clear the value of the **Token** field.
3. Click **OK**.
4. When you have logged out, start authentication again from the beginning.
---
# Source: https://docs.snyk.io/developer-tools/snyk-ide-plugins-and-extensions/troubleshooting-ides/authentication-using-api-token-does-not-work.md
# Authentication using API token does not work
If you **get an authentication error after you have seen the message that authentication was successful**, it may help to restrict allowed IP address families to either IPv4 or IPv6, so that requests always come from the same address during authentication.
---
# Source: https://docs.snyk.io/implementation-and-setup/enterprise-setup/auto-provisioning-guide.md
# Auto-provisioning guide
{% hint style="info" %}
**Feature availability**
Auto-provisioning is available only for customers starting a Pilot or Enterprise plan.
{% endhint %}
Provisioning is the first interaction you have with Snyk before getting access. You provide answers to the following questions about how you will be using the platform:
* What is the name of your business?
* Where do you want your data to be hosted?
* What is the authentication method your users will access Snyk with?
* Do you already have a Snyk account you want to use (a previously completed Pilot) or do you want to start from scratch?
This guide covers the following aspects of automated provisioning:
1. [Prerequisites](#prerequisite-the-welcome-to-snyk-email)
2. [New account sign-up](#sign-up-start-from-scratch)
3. [Existing account sign-up](#logging-in-provision-using-an-existing-user-account)
4. [Error types and solutions](#error-types-and-solutions)
## Prerequisite: the "Welcome to Snyk" email
You should receive an email in your inbox containing links to start provisioning. Search for "Welcome to Snyk" as the subject.
If you have not received this email, look into your spam folder or reach out to your account executive.
This email contains two links:
1. **Log in and activate your existing account** - To be used if you already have an account and wish to apply your plan to it.
2. **Create and activate a new account** - To be used if you're entirely new to Snyk or want to start from scratch with a different user.
{% hint style="warning" %}
Once provisioning is complete, these links will become invalid and you will see an "Access denied" error page.
If you have not completed provisioning but still see this error, make sure someone else in your organization has not already completed the flow themselves, in case the welcome email has more recipients.
{% endhint %}
## Sign up - start from scratch
Clicking the sign-up link in your welcome email will take you to the sign-up page in the provisioning app.
{% hint style="warning" %}
The provisioning app is accessible only through a unique link; all other access is disabled and shows an error page.
{% endhint %}
Sign up page on provision.snyk.io
### Step 1: Enter the company name
The company name you enter here will be used to create the [Tenant](https://docs.snyk.io/snyk-platform-administration/groups-and-organizations/tenant), the top-level instance you'll see in the Snyk Platform. It is a required field and has a 60-character limit.
Provisioning will also create a [Group](https://docs.snyk.io/snyk-platform-administration/groups-and-organizations/groups) and a default [Organization](https://docs.snyk.io/snyk-platform-administration/groups-and-organizations/organizations) using the same name.
### Step 2: Choose where to host the account
Available hosting regions
Snyk offers [regional hosting](https://docs.snyk.io/snyk-data-and-governance/regional-hosting-and-data-residency) to comply with regional data protection laws and improve service performance. This ensures data residency requirements are met and reduces data latency.
Provisioning is enabled for these [three regions](https://docs.snyk.io/snyk-data-and-governance/regional-hosting-and-data-residency#available-snyk-regions). Each of these regions is running at least one instance of the Snyk Platform:
* :flag\_us: **United States**: SNYK-US-01, SNYK-US-02
* :flag\_eu: **Europe**: SNYK-EU-01
* :flag\_au: **Australia**: SNYK-AU-01
In the case of [multiple instances](https://docs.snyk.io/snyk-data-and-governance/regional-hosting-and-data-residency#regional-multi-and-single-tenant-hosting) being available in a chosen region (United States), Snyk reserves the right to choose the specific instance where your account will be created. For more information, see [Regional Hosting and data residency](https://docs.snyk.io/snyk-data-and-governance/regional-hosting-and-data-residency).
{% hint style="warning" %}
Automated provisioning is only possible for multi-tenant environments. For single-tenant availability (Snyk Private Cloud), reach out to your account team in advance of provisioning.
{% endhint %}
### Step 3: Select an authentication method
Available authentication methods
The available authentication methods are either Single sign-on (SSO) or Third-party authentication.
1. **Single Sign-On** - use your company's existing identity management system, see [Single Sign-On (SSO) for authentication to Snyk](https://docs.snyk.io/implementation-and-setup/enterprise-setup/single-sign-on-sso-for-authentication-to-snyk) for more details.
2. **Third-party authentication** - Snyk supports a list of third-party identity providers, see [Authentication for third-party tools](https://docs.snyk.io/implementation-and-setup/enterprise-setup/authentication-for-third-party-tools) for more details. This method is only available for the United States region.
#### Which authentication methods are available for each region?
| Region | Single Sign-On | Third-party providers |
| ------------------------ | -------------------- | -------------------------- |
| :flag\_us: United States | :heavy\_check\_mark: | :heavy\_check\_mark: |
| :flag\_eu: Europe | :heavy\_check\_mark: | :heavy\_multiplication\_x: |
| :flag\_au: Australia | :heavy\_check\_mark: | :heavy\_multiplication\_x: |
Snyk recommends selecting SSO because it is best supported across all environments. Selecting this option prompts you to enter a valid, work-issued email address, which is used to create an initial Snyk Admin user. No extra configuration for SSO is required at this point.
Email address input
### Step 4: Confirm details and start provisioning
As a final step, you must confirm the details entered are correct.
* If you have selected SSO as the authentication method, clicking "Sign up" will then show a loading page while Snyk does the background work.
* If you have selected Third-party authentication, clicking "Continue to sign up options" redirects you to the Snyk Login page, where you can choose your identity provider (GitHub, Google, and so on). Once you have completed signing up, you will be redirected back to the provisioning application, where the loading page indicates in-progress background work.
Snyk advises you not to close the page; otherwise, you may not see the process complete successfully.
### Step 5: Access the Snyk platform
If you have selected SSO as the authentication method, once plan activation is done, you will see a success message and a verification button. Snyk also sends an email indicating successful provisioning that contains the same login verification link. This link does not expire and can be used for multiple authentications if needed.\
\
When you click the link, a login code is sent to the email address you entered previously. This is known as Passwordless Login. Enter the code where prompted, and you are ready to start using Snyk.
Successful provisioning for SSO
If you have selected **Third-party authentication**, once plan activation is done you are all set! You can click "Continue to your account" and start using the platform.
{% hint style="info" %}
Your plan might not have all the features enabled if your contract's start date is in the future. You will gain full access to all features when the start date is met.
{% endhint %}
## Logging in - provision using an existing user account
Clicking the sign-up link in your welcome email will take you to the login page in the provisioning app.
### Step 1: Logging in
If you have a user account connected through a third-party provider, you will need to use SNYK-US-01.
You can find the links for all the regions in the [Login and Web UI URLs section](https://docs.snyk.io/snyk-data-and-governance/regional-hosting-and-data-residency#login-and-web-ui-urls).
Log in page on provision.snyk.io
### Step 2: Select an existing Tenant or start fresh
Linking to an existing Tenant or creating a new one
If you already have a Snyk User, you can choose how you activate your Enterprise plan or Pilot after logging in:
* Linking the plan to an existing [Tenant](https://docs.snyk.io/snyk-platform-administration/groups-and-organizations/tenant).\
If your user is a member of multiple Tenants, you have the option to choose between them. Click the card of the Tenant you wish to select and then click "Confirm and activate".
* Starting fresh with a new [Tenant](https://docs.snyk.io/snyk-platform-administration/groups-and-organizations/tenant) linked to your user.\
Click "Create new Tenant account" and enter the company name. It's the same field as [#step-1-enter-the-company-name](#step-1-enter-the-company-name "mention") of signing up. You will be asked to confirm the name you entered before starting the provisioning process.
### Step 3: Access the Snyk platform
This step is the same as [#step-5-access-the-snyk-platform](#step-5-access-the-snyk-platform "mention") when signing up. Once the process is done you can "Continue to your account" and begin using Snyk.
{% hint style="info" %}
Your plan might not have all the features enabled if your contract's start date is in the future. You will gain full access to all features when the start date is met.
{% endhint %}
## Error types and solutions
### Validation errors
When creating a new Tenant or User, Snyk checks for duplicates and surfaces any issues.
* **The business name provided is already in use.** - Use a different name, or reach out to your account executive if you want to link your plan to that existing Tenant but are not a member of it.
Business name already in use error
* **An account with this email already exists.** - In this scenario, you can use a different work email address, or you can log in ([#logging-in-provision-using-an-existing-user-account](#logging-in-provision-using-an-existing-user-account "mention")), then create the new Tenant.
User with the same email address already exists error
### Plan Activation errors
Snyk does its best to ensure that you never see this screen. If you do, save the **reference ID** and send it to your account executive, or contact support with the reference ID and the steps you took.
Plan activation error
---
# Source: https://docs.snyk.io/snyk-platform-administration/snyk-projects/automatically-created-project-collections.md
# Automatically created Project collections
{% hint style="info" %}
**Release status**
Automated Collections are in Early Access and available only with Enterprise plans. For more information, see [plans and pricing](https://snyk.io/plans/).
{% endhint %}
See also [Limitations of Automated Collections](#limitations-of-automated-collections). You can enable Automated Collections in your Organization settings.
Scanning a repository through an SCM integration and then rescanning it with the Snyk CLI creates duplicate Targets in the Snyk Web UI, each with its own Projects and issues. These duplicates may not be exact copies.
With the option to create Project collections automatically, Projects from these duplicate Targets will be detected and grouped automatically into a new collection. This helps identify duplicates and allows filtering and reporting on the issues of your preferred code-scanning method.
{% hint style="info" %}
Automated Collections can take up to an hour to create or delete a collection.
{% endhint %}
## Enable automatic Project collections
{% hint style="info" %}
Bitbucket Server Projects cannot be grouped into automatically created collections because of their dynamic repository URL schema.
{% endhint %}
Automated Collections can be turned on under the **Organization settings** by users with the appropriate permissions: **Organization Admins**, **Group Admins**, or users with permissions to edit **Organization settings**.
Managing Automated Collections under Organization Settings
After Automated Collections are enabled, all the Organization's Projects will be analyzed, and Projects with the same repository URL will be grouped automatically into a collection. All subsequent Project imports will also update the list of automatically created collections; there is no need to refresh the list manually.
## Turn off Automated Collections
Turning off Automated Collections will remove all the automatically created collections without impacting any of the grouped Projects. Only the automated collections are deleted, not their contents.
## Automatically versus manually created Project collections
An automatically created collection has the same filtering and reporting options as a manually created collection:
* You can filter an automatically created collection to see only the result of your preferred scanning method, thus hiding duplicates.
* You can create reports from an automatically created collection and track issues from just one of the scanning methods and ignore the rest.
Automatically created collections have no management options available:
* You cannot rename an automated collection; its name will reflect the repository URL of the matched Projects.
* You cannot add new Projects to an automated collection. Its content updates itself when new Projects with the same repo URL are imported.
* You cannot delete an automatically created collection. You can turn off the Automated Collections feature, which will remove all automatically created collections.
## Limitations of Automated Collections
* Automated Collections does not detect [SAST](https://docs.snyk.io/discover-snyk/getting-started/glossary#sast) scans pushed to the Snyk Web UI using the `snyk code test --report --project-name="name"` command.
* This feature supports only GitHub, GitHub Enterprise, GitLab, Bitbucket Cloud, and Azure integration scans.
* This feature does not support Snyk Container.
---
# Source: https://docs.snyk.io/scan-with-snyk/snyk-container/scan-your-dockerfile/automatically-link-your-dockerfile-with-container-images-using-labels.md
# Automatically link your Dockerfile with container images using labels
Snyk allows you to link a Dockerfile, manually or automatically, to all container images built from it. Use this to understand the security impact on your running applications and to identify which images could be better secured or need to be rebuilt after you update the Dockerfile base image.
## How linked images work
When an image is imported or rescanned, it is analyzed and scanned for vulnerabilities. Image labels are also retrieved from the image manifest. Snyk then checks that:
* Image labels defining the Dockerfile location exist:
* (Mandatory) `org.opencontainers.image.source` - URL to the Project repository, for example, `org.opencontainers.image.source="https://github.com/example/test"`
* (Optional) `io.snyk.containers.image.dockerfile` - path to the Dockerfile, for example, `io.snyk.containers.image.dockerfile="/app/Dockerfile-prod"`. This label is required only if:
* The Dockerfile is not in the root of your repository
* The Dockerfile is not named `Dockerfile`
* The Dockerfile Project exists in the same Organization, with a repository (and path, or `/Dockerfile`) matching the image labels.
If these conditions are met, Snyk automatically creates a link between the image and the Dockerfile Project.
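For example, a Dockerfile satisfying both label conditions might look like the following sketch, reusing the example values above. The base image is illustrative:

```dockerfile
# Stored at /app/Dockerfile-prod in the repository, so both labels are needed
FROM node:20-slim

# Mandatory: URL of the repository that contains this Dockerfile
LABEL org.opencontainers.image.source="https://github.com/example/test"

# Required here because this file is not named "Dockerfile"
# and is not in the repository root
LABEL io.snyk.containers.image.dockerfile="/app/Dockerfile-prod"
```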
## View linked images
You can see the linked images on the Project page, under **LINKED IMAGES**.
Project showing linked images
Using a container registry integration, you can get automatic links between imported images and existing Dockerfile Projects. To do this, ensure the OCI label in the image matches the path of a Dockerfile in the Organization in Snyk.
## Automatically update and remove links
Links are automatically updated if the Dockerfile labels are updated and are targeting a new location. This can happen during a re-scan or during a recurring scan.
Links are removed if:
* The image Project or Dockerfile Project is deleted.
* The Dockerfile labels are updated so that they target a Dockerfile location without an existing Project in Snyk.
* The Dockerfile labels are removed.
## Create an automatic link with brokered SCM integrations
To create a link, Snyk must be able to map the Dockerfile repository URL to the right SCM Organization source. For brokered integrations, the process is more complex, as the URL is not available by default.
To create automatic links between container images and Dockerfiles stored in brokered SCMs, enter the URL on the integration settings page.
Integration settings page with integration URL
When the URL is available, Snyk can use it to generate links.
---
# Source: https://docs.snyk.io/manage-risk/reporting/available-snyk-reports.md
# Available Snyk reports
The following reports are available:
* [Issues Detail report](#issues-detail-report)
* [Issues Summary report](#issues-summary-report)
* [Vulnerabilities Detail report](#vulnerabilities-detail-report)
* [Featured Zero-Day report](#featured-zero-day-report)
* [SLA Management report](#sla-management-report)
* [OWASP Top 10 report](#owasp-top-10-report)
* [CWE Top 25 report](#cwe-top-25-report)
* [CWE Top 10 KEV report](#cwe-top-10-kev-report)
* [PCI-DSS v4.0.1 report](#pci-dss-v4.0.1-report)
* [Developer IDE and CLI usage report](#developer-ide-and-cli-usage)
* [Repositories Tested in CI/CD report](#repositories-tested-in-ci-cd-report)
* [Cloud Compliance Issues report](#cloud-compliance-issues-report)
* [Learn Engagement](#learn-engagement)
* [Learning Impact & Opportunities](#learning-impact-and-opportunities)
* [Snyk Generated Pull Requests](#snyk-generated-pull-requests)
* [Asset Dashboard](#asset-dashboard)
* [Risk exposure report](#risk-exposure-report)
* [Saved Views](#saved-views)
* [PR Check Report](#pr-check-report)
Select **Change Report** to change the report displayed:
Select Change Report to display different reports
## Issues Detail report
The Issues Detail report displays all known issues in all of your Projects that are being monitored by Snyk. The report gives details about each issue and which of your Projects are affected and provides links to fix information.
The Issues Detail report displays the number of issues as well as the number of unique vulnerabilities that make up the issues.
Quick aggregations are available by categories including **Severity**, **Product Name**, and **Issue Type**.
Individual issues are displayed in a table according to the selected category. You can modify columns as needed.
For a table of only the unique vulnerabilities, use Change Report to switch to the Vulnerabilities Detail report.
## Issues Summary report
The Issues Summary report highlights the value that Snyk is providing by enabling both the identification and resolution of issues.
The report provides a glimpse into how well teams are optimizing the use of the Snyk platform for their workflow and provides a means to measure and improve security.
This report enables you to easily understand the current state and trends of the highest security risk items. This report also provides a quick view into where risk is coming from and where remediation efforts are most and least effective.
{% hint style="info" %}
Use the date filter in the upper right corner of the Issues Summary report to see key metrics and charts for a specified interval. The selected date range also impacts the compared period, which allows you to measure progress across various key metrics.
{% endhint %}
At the top of the report, you can follow key metrics associated with security issues in the selected date range with a comparison to the previous sequential period's results. This allows you to get insights on trends. See the tooltips in Snyk Web UI for definitions of the metrics.
The **Issues Identified and Resolved** trend captures the accumulated security issues that were identified and resolved during the selected date range. The gap between the two lines indicates the open issues backlog.
This visual trend allows you to identify if too many issues are being introduced, meaning that prevention should become a higher priority. Conversely, if not enough issues are being resolved, it means that you need to further analyze metrics such as MTTR and SLA.
{% hint style="info" %}
The Total Open issues metric at the top completes the picture for this trend, by showing the total open issues at the end of the selected period compared with the total open issues at the beginning of the selected date range.
{% endhint %}
Reviewing the **Exposure Window** trend allows you to identify the volume of security issues that remain open within predefined periods. This is a relevant metric to follow when filtering by attributes such as severity, exploit maturity, or asset class, and ensuring that the most critical issues for sensitive assets are being remediated on time.
The **Time to Resolve by Week** trend provides visibility on the number of issues remediated within predefined periods, allowing you to measure remediation performance over time.
The **Risk breakdown** table helps you make data-driven decisions about where you need to focus. The tables allow you to review performance metrics from several angles.
Use the dimension picker to browse:
* **Projects** - Available at the Organization level. Allows you to pinpoint Projects that require your attention.
* **Organizations** - Available at the Group level. Surface Snyk Organizations based on their performance.
* **Asset Classes** - Ensure that efforts are prioritized to secure the most sensitive assets first.
* **Introduction Categories** - Allows you to determine whether preventable issues are handled properly by looking at the percentage change of new preventable issues, and to assess the impact of newly monitored assets on your AppSec program. You can view this under the **Baseline Issue** category.
## Vulnerabilities Detail report
The Vulnerabilities Detail report is similar to the Issues Detail report but shows issues grouped by Snyk Problem ID ([see Snyk Vulnerability DB](https://security.snyk.io/vuln)).
You can easily see how many instances of a vulnerability exist and how many Projects are affected. Use this report to understand which vulnerabilities are most prevalent for both resolution and prevention use cases.
For a table of Total Issues, use Change Report to switch to the Issues Detail report.
{% hint style="info" %}
**Dependencies and license information**
To view Dependencies and license information, select the **Dependencies** menu option. See [Dependencies and licenses](https://docs.snyk.io/manage-risk/reporting/dependencies-and-licenses) for details.
{% endhint %}
## Featured Zero-Day report
This report addresses primary scenarios for managing and resolving emerging zero-day vulnerabilities, which carry significant consequences and attract substantial attention in the global AppSec community.
Use this report to discover your exposure to issues highlighted in a zero-day publication across various Targets and Projects. The report helps you prioritize zero-day issues and monitor the progress of remediation efforts against any remaining occurrences.
The [Security team at Snyk](https://snyk.io/platform/security-intelligence/) updates the [Vulnerability Database](https://security.snyk.io/) with new vulnerabilities several times a day. When the team discovers a major new zero-day vulnerability, typically a high-severity issue in a widely used package that affects many customers, it is announced and addressed as a zero-day event.
Upon the announcement of a new zero-day event, begin by examining the **Impacted Targets** table to gain a deeper understanding of the exposure. Use filters such as Project Lifecycle, Environment, or Project Criticality to focus solely on Targets associated with Projects in production that are externally exposed or of high criticality. Gaining such insights depends on the [availability of Project attributes](https://docs.snyk.io/snyk-platform-administration/snyk-projects/project-attributes#available-attributes-and-their-values).
Next, proceed to the **All** **Issues** table and compile a prioritized list of issues requiring remediation. Typically, prioritization is determined by either the Snyk [Risk Score](https://docs.snyk.io/manage-risk/prioritize-issues-for-fixing/risk-score) or NVD CVSS Score, with emphasis placed on addressing vulnerabilities within sensitive targets. Apply filters based on Project Lifecycle, Environment, or Project Criticality to identify and address these targets promptly.
For continuous monitoring of remediation progress and efficacy, refer to the trend diagrams.\
The **Accumulative Issues Backlog Trend** diagram shows the weekly changes in the zero-day backlog by accumulating the weekly delta between identified and resolved issues. Use this diagram to ensure that your R\&D teams are reducing the zero-day backlog consistently, which will be indicated by a negative trend line.
In parallel, review the **Issues Identified versus Resolved over Time** diagram to conclude whether additional emphasis should be placed on preventing the introduction of new issues or on accelerating the remediation efforts.
## SLA Management report
The report presents a set of default SLA targets per severity based on common security standards, such as FedRAMP. These SLA targets can be modified to meet your own security requirements.
The SLA status of an issue can be:
* **Within SLA** - the age of the issue has not exceeded the SLA target, and it is expected to have sufficient lead time before breaching.
* **At Risk** - the issue is considered to be approaching an SLA breach and is flagged as “At Risk”.
* **Breached** - the age of the issue has exceeded the SLA target.
You can control the SLA targets and the transition of issues to the “At Risk” status by editing the **SLA target** and setting the **At risk duration before breach (days)** field.
SLA Management Report - Edit SLA targets
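The status transitions above can be sketched as a small classification function. The parameter names mirror the **SLA target** and **At risk duration before breach (days)** settings, but the function and the thresholds in the example are illustrative, not Snyk's actual implementation or defaults:

```python
def sla_status(age_days: int, sla_target_days: int, at_risk_window_days: int) -> str:
    """Classify an issue's SLA status from its age in days.

    sla_target_days and at_risk_window_days correspond to the
    "SLA target" and "At risk duration before breach (days)" settings.
    """
    if age_days > sla_target_days:
        return "Breached"
    if age_days > sla_target_days - at_risk_window_days:
        return "At Risk"
    return "Within SLA"

# Example: a 30-day SLA target with a 7-day at-risk window
print(sla_status(10, 30, 7))  # Within SLA
print(sla_status(25, 30, 7))  # At Risk
print(sla_status(35, 30, 7))  # Breached
```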
The SLA report includes additional filters under the SLA category, allowing for better identification of the age of issues in relation to the SLA target:
* **SLA status** - allows the filtering of the report according to a specific SLA status.
* **Issue age** - allows you to discover issues within an age range.
* **Time until breach** - identifies issues that will breach the SLA target within a given number of days.
{% hint style="info" %}
By default, the report shows only issues with high or critical severity. Update the severity filter to view the SLA status for additional severities.
{% endhint %}
SLA Filters within the filters picker
You can share the report with predefined SLA targets by sharing the report URL or return to a predefined SLA report by bookmarking the web page in your browser.
In the **Open issues** section, the **SLA severity breakdown** shows a distribution of severity levels by the SLA compliance status of the viewed Group or Organization.
The **SLA trend** shows the cumulative SLA status of issues over time.
SLA Management Report - Open issues section
The **SLA breakdown table** allows you to compare the SLA compliance results of Organizations in the Group view, or Targets in the Organization view. The table is sorted by default according to the quantity of breached issues.
SLA Management Report - SLA Breakdown
The **Breached and at-risk open issues** table helps you prioritize issues based on their aging and SLA compliance status. You can use the **Modify Column** picker to add additional columns and learn more about the specific issues.
SLA Management Report - Breached and at risk open issues
{% hint style="info" %}
You can download the **SLA Breakdown** and the **Breached and at-risk open issues** data in CSV format using the **Download CSV** option.
{% endhint %}
You can review the SLA results for resolved issues and perform a retrospective analysis by reviewing the **Resolved issues** section.
Resolved issues section
## OWASP Top 10 report
The [OWASP Top 10](https://owasp.org/www-project-top-ten/) is a standard awareness document for developers and web application security. It represents a broad consensus about the most critical security risks for web applications and is globally recognized by developers as the first step towards more secure coding.
Each control in the list (A1, A2, and so on) is based on a list of Common Weakness Enumerations (CWEs). For example, [A01:2021 – Broken Access Control](https://owasp.org/Top10/A01_2021-Broken_Access_Control/) is based on a list of 34 CWEs.
The CWEs are mapped to Snyk IDs, which are mapped to issues.
For example, the critical vulnerability [SNYK-JAVA-ORGAPACHELOGGINGLOG4J-2314720](https://security.snyk.io/vuln/SNYK-JAVA-ORGAPACHELOGGINGLOG4J-2314720) is classified as [CWE-94](https://cwe.mitre.org/data/definitions/94.html), which is part of the OWASP TOP 10 [A03:2021 - Injection](https://owasp.org/Top10/A03_2021-Injection/). All the issues related to this vulnerability will be under the A03 category.
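The mapping chain can be pictured as a pair of lookups. The dictionaries below contain only the example from this section, not the full dataset, and the function is a hypothetical illustration rather than a Snyk API:

```python
from typing import Optional

# Illustrative subset: OWASP control -> CWEs, and Snyk ID -> CWE
OWASP_TO_CWES = {"A03:2021 - Injection": {"CWE-94", "CWE-89", "CWE-79"}}
SNYK_ID_TO_CWE = {"SNYK-JAVA-ORGAPACHELOGGINGLOG4J-2314720": "CWE-94"}

def owasp_category(snyk_id: str) -> Optional[str]:
    """Return the OWASP Top 10 control an issue's Snyk ID rolls up to, if any."""
    cwe = SNYK_ID_TO_CWE.get(snyk_id)
    for control, cwes in OWASP_TO_CWES.items():
        if cwe in cwes:
            return control
    return None

print(owasp_category("SNYK-JAVA-ORGAPACHELOGGINGLOG4J-2314720"))  # A03:2021 - Injection
```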
Learn more by using the [OWASP TOP 10 Learning path](https://learn.snyk.io/learning-paths/owasp-top-10/) on Snyk Learn.
The report is based on the latest mapping released in 2021. The supported products are Snyk Open Source, Snyk Container, and Snyk Code.
## CWE Top 25 report
The [CWE Top 25](https://cwe.mitre.org/top25/) Most Dangerous Software Weaknesses is a list that demonstrates the current most common and impactful software weaknesses based on Common Vulnerabilities and Exposures (CVEs) severity and their exploitation potential.
The report is based on the latest version released in 2023 by MITRE. The supported products are Snyk Open Source, Snyk Container, and Snyk Code.
## CWE Top 10 KEV report
The [CWE Top 10 KEV Weaknesses](https://cwe.mitre.org/top25/archive/2023/2023_kev_list.html) list identifies the top ten CWEs in the Cybersecurity and Infrastructure Security Agency’s (CISA) [Known Exploited Vulnerabilities](https://www.cisa.gov/known-exploited-vulnerabilities-catalog) (KEV) Catalog, a database of security flaws in software applications and weaknesses that have been exposed and leveraged by attackers.
The report is based on the version released in 2023 by MITRE. The supported products are Snyk Open Source, Snyk Container, and Snyk Code.
## PCI-DSS v4.0.1 report
{% hint style="info" %}
**Release status**
The PCI-DSS v4.0.1 report is in Early Access and available only with Enterprise plans.
{% endhint %}
PCI Security Standards are technical and operational requirements created by the PCI Security Standards Council (PCI SSC) to safeguard cardholder data. These standards apply to all entities that store, process, or transmit this information and include requirements for software developers and manufacturers.\
\
The Council manages these standards, while compliance is enforced by founding members: American Express, Discover Financial Services, JCB, MasterCard, and Visa Inc.
Snyk PCI-DSS v4.0.1 Report is designed to help you:
* Estimate readiness for meeting the PCI-DSS AppSec requirements for SCA and SAST based on the Snyk scan results.
* Provide evidence that the Organization is meeting the PCI-DSS AppSec requirements for SCA and SAST vulnerabilities.
* Prioritize issues to improve PCI-DSS compliance readiness.
Snyk PCI-DSS v4.0.1 Report
The report identifies PCI-DSS risks and violations based on the following PCI-DSS v4.0.1 requirements:
1. **Requirement 6.2.4:** Engineers use various techniques to prevent or mitigate common software attacks and related vulnerabilities in bespoke and custom software. This includes but is not limited to the following methods:
* Injection attacks, including SQL, LDAP, XPath, or other command, parameter, object, fault, or injection-type flaws.
* Attacks on data and data structures, including attempts to manipulate buffers, pointers, input data, or shared data.
* Attacks on cryptography usage, including attempts to exploit weak, insecure, or inappropriate cryptographic implementations, algorithms, cipher suites, or modes of operation.
* Attacks on business logic, including attempts to abuse or bypass application features and functionalities through the manipulation of APIs, communication protocols and channels, client-side functionality, or other system or application functions and resources. This includes cross-site scripting (XSS) and cross-site request forgery (CSRF).
* Attacks on access control mechanisms, including attempts to bypass or abuse identification, authentication, or authorization mechanisms or attempts to exploit weaknesses in the implementation of such mechanisms.
* Attacks using any “high-risk” vulnerabilities identified in the vulnerability identification process, as defined in Requirement 6.3.1.
2. **Requirement 6.3.3:** All system components are protected from known vulnerabilities by installing applicable security patches and updates as follows:
* Patches and updates for critical vulnerabilities, identified according to the risk ranking process at Requirement 6.3.1, are installed within one month of release.
### Snyk Violation Analysis based on PCI-DSS attack categories
As the standard does not explicitly define specific CWEs or CVEs, Snyk provides an analysis based on leading CWEs associated with the named attack categories. Below are the CWEs categorized by attack type:
#### Injection Attack Violations Summary
The following list provides an association between the identified attack categories and the CWEs associated with each category:
* SQL Injection: CWE-89
* LDAP Injection: CWE-90
* XML Injection (XPath Injection): CWE-91
* Command Injection: CWE-77
* Use of Unsafe Reflection: CWE-470
#### Attacks on Data and Data Structures Violations Summary
The following list provides an association between the identified attack categories and the CWEs associated with each category:
* Buffer Overflow: CWE-120
* NULL Pointer Dereference: CWE-476
* Double Free: CWE-415
* Concurrent Execution using Shared Resource with Improper Synchronization (‘Race Condition’): CWE-362
#### Attacks on Cryptography Usage Violations Summary
The following list provides an association between the identified attack categories and the CWEs associated with each category:
* Use of a Broken or Risky Cryptographic Algorithm: CWE-327
* Use of Insufficiently Random Values: CWE-330
* Improper Verification of Cryptographic Signature: CWE-347
* Cleartext Transmission of Sensitive Information: CWE-319
* Use of Hard-coded Cryptographic Key: CWE-321
#### Attacks on Business Logic Violations Summary
The following list provides an association between the identified attack categories and the CWEs associated with each category:
* Server-Side Request Forgery (SSRF): CWE-918
* Cross-Site Request Forgery (CSRF): CWE-352
* Cross-Site Scripting (XSS): CWE-79
* Origin Validation Error: CWE-346
* Improper Authorization: CWE-285
* Exposure of Sensitive Information to an Unauthorized Actor: CWE-200
#### Attacks on Access Control Mechanisms Violations Summary
The following list provides an association between the identified attack categories and the CWEs associated with each category:
* Improper Authentication: CWE-287
* Improper Access Control: CWE-284
* Incorrect Authorization: CWE-863
* Authorization Bypass Through User-Controlled Key: CWE-639
* Missing Authentication for Critical Function: CWE-306
* Incorrect Implementation of Authentication Algorithm: CWE-303
* Missing Authorization: CWE-862
### PCI-DSS v4.0.1 Guidance
The report is filtered by default on open issues of critical severity. These filters also apply when exporting the report to PDF.
#### PCI-DSS Readiness Trend
The PCI-DSS Readiness Trend is designed to help you track your progress toward eliminating PCI-DSS violations. A violation is defined as a critical vulnerability that falls under the PCI-DSS attack categories (as explained in Requirement 6.2.4) and is more than 30 days old, as stated in Requirement 6.3.3.
The number on the left indicates the violation status and the progress made in the last seven days.
The trend shows all vulnerabilities per Requirement 6.2.4, categorized by age bucket. This allows for quick identification of potential violations and vulnerabilities that may soon become violations.
PCI-DSS Readiness Trend
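The violation definition above can be sketched in code. The field names and the small CWE subset here are illustrative, not an actual Snyk data model:

```python
# Illustrative subset of the CWEs from the attack categories above
PCI_CWES = {"CWE-89", "CWE-79", "CWE-287"}

def is_pci_violation(issue: dict, max_age_days: int = 30) -> bool:
    """A violation is a critical vulnerability in a PCI-DSS attack
    category (Requirement 6.2.4) that is more than 30 days old
    (Requirement 6.3.3)."""
    return (
        issue["severity"] == "critical"
        and issue["cwe"] in PCI_CWES
        and issue["age_days"] > max_age_days
    )

issues = [
    {"severity": "critical", "cwe": "CWE-89", "age_days": 45},
    {"severity": "critical", "cwe": "CWE-89", "age_days": 10},  # not yet 30 days old
    {"severity": "high", "cwe": "CWE-79", "age_days": 90},      # not critical
]
print(sum(is_pci_violation(i) for i in issues))  # 1
```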
#### Attack category breakdown
The breakdown table helps identify the number of vulnerabilities by attack category (as per requirement 6.2.4) or by Snyk Organization based on the relevant age bucket.
Use the table to pinpoint major attack categories or Snyk Organizations that lead to PCI-DSS violations. You can click on the figures to explore the specific issues in more detail.
{% hint style="info" %}
After you investigate and see the actual issues behind the figures, you can proceed by:
* Triaging and prioritizing vulnerabilities.
* Identifying the prevalent CWEs and CVEs by sorting on the CWE/CVE column, then filtering on those CWEs/CVEs in the [Vulnerabilities Detail report](#vulnerabilities-detail-report) to surface all vulnerability occurrences across Targets and Projects.
* Running a vulnerability eradication campaign or assigning Snyk Learn training to relevant engineering teams.
{% endhint %}
Attack Category Breakdown
## Developer IDE and CLI usage
To use this report, ensure you have installed the following prerequisites:
* Snyk CLI
* version 1.1292.1 or newer (for CLI and IDE plugins usage)
* version 1.1297.0 or newer for general Agentic scans (Snyk Studio using MCP)
* version 1.1298.1 or newer for granular Agentic scans (such as MCP host)
* VS Code 1.86.0 or newer and Snyk Security plugin 2.3.3 or newer
* IntelliJ IDEs 2023.3 or newer and Snyk Security plugin 2.7.3 or newer
* Visual Studio 2019, 2022 and Snyk Security Plugin 1.1.47 or newer
* Eclipse 2023.12 or newer and Snyk Security plugin 2.1.0 or newer
This report shows the adoption of Snyk testing in local development through the IDE plugins, using the CLI locally or incorporating Snyk Studio into agentic workflows. The report is available under the Change Report dropdown at the Group and Organization levels.
{% hint style="info" %}
This report focuses on the local developer experience and does not include the use of CI/CD. In addition, it does not show Organizations or developers that have never used the CLI, IDE, or Snyk Studio (via MCP).
{% endhint %}
Security teams can use this report to demonstrate strong shift-left behavior as a model behavior to bring to other teams. This report also shows where teams or individual developers are not adopting Snyk locally. Companies can use this report to encourage more shift-left behavior.
This report shows the test usage in the IDE, CLI, and Snyk Studio by developers. Teams can filter by date and Organization. The report includes visibility into metrics such as:
* Total number of developers running scans and the number of scans in IDE, CLI, and Agentic integrations (Snyk Studio)
* Charts and summary tables breaking down this data by the environment of the scan
* Charts and summary tables breaking down this data by different dimensions, such as IDE plugins or Agentic integrations
* Charts and summary tables breaking down this data by the Snyk scan type
* List of Organizations and developers adopting Snyk locally
## Repositories tested in CI/CD report
To use this report, consider the following prerequisites:
* Snyk CLI version 1.1292.1 or newer.
* Viewing the last commit data requires SCM Group integration. For more details, navigate to [SCM integrations](https://docs.snyk.io/developer-tools/scm-integrations/organization-level-integrations).
* When testing containers, include the `.git` context as part of the `snyk container test` command.
This report analyzes Snyk tests performed as part of CI/CD pipelines executed using the Snyk CLI. It informs you about your company's usage and adoption of testing in CI/CD, ensuring repositories are tested as expected and preventing critical vulnerabilities and misconfigurations from being deployed and reaching the production environment.
{% hint style="info" %}
* The report results are scoped by a date range filter that you can use to review specific periods. The filter defaults to the last 30 days.
* This report provides visibility into Snyk tests (`snyk test`, `snyk code test`, `snyk container test`, `snyk iac test`) executed within your CI pipeline (using CLI). Its primary goal is to help you evaluate test results and determine whether to pass or fail the build process based on these security checks.
* Please note that `snyk monitor` commands are **not** included in this report. While `snyk monitor` is crucial for ongoing security posture and identifying new vulnerabilities, this report specifically tracks tests that actively gate your CI/CD pipeline.
{% endhint %}
The numbers displayed on the main view of the report represent the number of repositories tested in the selected date range per Snyk product.
In addition, you can learn about the change in the number of tested repositories compared to the previous sequential period, so you can conclude whether the adoption of CI/CD tests across repositories improved.
A green upward arrow indicates that more repositories were tested compared to the previous sequential period, while a red downward arrow indicates the opposite. The absolute change value appears next to the arrow, and the percentage change appears right underneath to measure the degree of change.
Repositories tested during date range
{% hint style="info" %}
A sequential period refers to a date range covering the last seven days. In this case, the period starts seven days ago and ends today. The previous sequential period spans from 14 days ago to seven days ago. As a result, both sequential periods are of the same duration.
{% endhint %}
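The comparison windows described in the hint can be sketched as follows; the function name and the dates are illustrative:

```python
from datetime import date, timedelta

def sequential_periods(today: date, days: int = 7):
    """Return the current and previous sequential periods as (start, end) pairs.

    Both periods have the same duration, so their results are comparable.
    """
    current = (today - timedelta(days=days), today)
    previous = (today - timedelta(days=2 * days), today - timedelta(days=days))
    return current, previous

current, previous = sequential_periods(date(2024, 6, 15))
print(current)   # (datetime.date(2024, 6, 8), datetime.date(2024, 6, 15))
print(previous)  # (datetime.date(2024, 6, 1), datetime.date(2024, 6, 8))
```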
#### Repository Test Adoption
Review the Repository Test Adoption trend to learn more about adoption over time.\
The green line shows the weekly number of repositories that have been tested, compared to the repositories that had commits in the last 30 days, shown by the purple line.
This comparison helps determine whether Snyk tests in CI/CD are being increasingly adopted over time and highlights the number of repositories that have received commits but have not been tested in CI/CD.
{% hint style="info" %}
Viewing the last commit data requires SCM Group integration. For more details, navigate to the [SCM integrations](https://docs.snyk.io/developer-tools/scm-integrations/organization-level-integrations) page.
{% endhint %}
You can filter by specific products or by specific organizations or extend the viewed period using the date range filter.
Repository Test Adoption
#### Test Success Rate Trend
The test success rate serves as an indicator of how well the engineering department or specific Snyk Organizations can adopt a "shift left" approach, which aims to identify and resolve issues before the code reaches the build process. This success rate is calculated by dividing the number of tests that passed by the total number of relevant tests conducted.
{% hint style="info" %}
An applicable test is a test that did not fail due to technical issues or a non-supported Project.
{% endhint %}
Having a low success rate can indicate that:
* Snyk tests are failing due to security issues that can be prevented in local development or in the PR Check stages. Snyk recommends testing with the [Snyk IDE](https://docs.snyk.io/developer-tools/snyk-ide-plugins-and-extensions) plugin, using [Snyk PR Checks](https://docs.snyk.io/scan-with-snyk/pull-requests/pull-request-checks), and enrolling in a [Snyk Learn](https://docs.snyk.io/discover-snyk/snyk-learn) program.
* The test success criteria are too strict. To explore this option further, Snyk recommends reviewing the test definitions of the organizations with the lowest success rate, as shown by the Adoption by Organizations widget. For more details about defining test success criteria, navigate to the [Failing of builds in Snyk CLI](https://docs.snyk.io/developer-tools/snyk-cli/scan-and-maintain-projects-using-the-cli/failing-of-builds-in-snyk-cli) page.
Test Success Rate Trend
#### Adoption by Organizations
Launching an Application Security program to boost testing adoption in CI/CD pipelines can be challenging. This initiative requires collaboration between the AppSec and R\&D teams and will be implemented gradually, with regular progress monitoring.
The Adoption by Organization table facilitates tracking and comparing the adoption rates of Snyk Organizations, helping you identify the organizations that are struggling or lagging behind.
In addition, you can examine the success rate column to surface organizations that have lower success rates.
**Column descriptions:**
* **Tested Repositories:** the number of repositories that were tested in the selected time range, with an indication of the percentage of change compared to the previous sequential period.
* **Committed Repositories:** the number of repositories that had any commits in the last 30 days at any given time within the selected time range, with an indication of the percentage of change compared to the previous sequential period.
* **Success Rate:** the portion of successful tests in CI/CD against all other tests that were executed.
#### Repository Test Summary
The repository test summary table shows the tests performed during the selected date range.
The default sorting in the table surfaces repositories according to their last commit, allowing you to identify repositories that were expected to be tested in CI/CD pipelines and verify that they were tested. Click a column name to sort the table by that column. You can sort the table by multiple columns at a time.
{% hint style="info" %}
Viewing the last commit data requires SCM Group integration. For more details, navigate to the [Group-level integrations](https://docs.snyk.io/developer-tools/scm-integrations/group-level-integrations) page.
{% endhint %}
Tests can be executed against a specific branch of a repository. The `tested` indicator means that at least one branch of the repository was tested during the selected date range.
{% hint style="info" %}
Hovering over the TESTED tag reveals the last test performed during the selected date range.
{% endhint %}
Repository Test Summary
## Cloud Compliance Issues report
{% hint style="info" %}
This report is available only if you have enabled legacy Snyk Cloud.
{% endhint %}
The Cloud Compliance Issues report shows cloud issues for an entire Organization, organized by [compliance standard](https://docs.snyk.io/scan-with-snyk/snyk-iac/getting-started-with-cloud-scans/key-concepts-for-cloud-scans#docs-internal-guid-e2e38027-7fff-9271-f2c0-e23677542f6e).
You can view a report for a single version of a compliance standard at a time, for example, CIS AWS Foundations Benchmark v1.4.0, by selecting the desired standard from the dropdown menu. Each report includes a list of compliance controls organized by control category, with corresponding issue counts.
Selecting an issue count lets you view the list of issues associated with that control in the [Cloud Issues UI](https://docs.snyk.io/scan-with-snyk/snyk-iac/getting-started-with-cloud-scans/manage-cloud-issues/view-cloud-issues-in-the-snyk-web-ui), where you can view each issue in detail.
Use the information in the Cloud Compliance Issues report to investigate, triage, and fix cloud compliance issues.
## Learn Engagement
{% hint style="info" %}
The Learn Engagement report is available only with the Learning Management add-on offering. For more information, contact your Snyk account team.
{% endhint %}
The goal of the engagement report is to provide insights into the overall progress of your security education and training programs and show you which parts of your Organization are engaging with Snyk Learn content. You can use the data and insights to optimize your program, find security champions, generate reports for compliance, and show progress to your executive sponsors.
### Access the report
The Learn Engagement report can be accessed at the Group level from the **Change Report** dropdown in the Reports menu.
### Report features
The report allows you to track:
* Learn engagement snapshot analytics
* Assignment Progress
* Adoption rankings
* Content usage breakdown
* Filtering: custom time periods, users, organizations, organization role, and Lesson titles.
### Learn engagement snapshot and assignment progress
The first section of the report focuses on showing key engagement statistics and the progress of any assignments. Tool tips provide more details on the definitions of the metrics.
### Adoption rankings
The adoption ranking section shows your Organization and individual user engagement with Snyk Learn. Rankings are based on "Lessons complete" and also show the estimated time the Organization or user has spent on Snyk Learn lessons. Estimated duration is calculated using the estimated duration presented at the start of each lesson and includes estimated time from any progress on "in-progress" lessons in the selected period.
{% hint style="info" %}
The user level adoption ranking is a great way to identify potential security champions who are proactively engaging in security education and training.
{% endhint %}
### Learning breakdown
The breakdown shows the different types of Learn content the users are engaging with, using lesson completions as the measure. You can see if users are engaging with product training or security education, along with the most popular lessons and insights into which CWE categories users are studying the most.
## Learning Impact & Opportunities
{% hint style="info" %}
The Learning Impact & Opportunities report is in Early Access and available only with the Learning Management add-on offering. For more information, contact your Snyk account team.
{% endhint %}
The goal of the impact and opportunities report is to provide insights into the impact your security education and training programs are having on code issue remediation and code issue prevention. In addition, the report gives recommendations for future training based on your code issue backlog, and issues that were introduced during the selected time period of the report.
### Access the report
The Learning Impact & Opportunities report can be accessed at the Group level from the **Change Report** dropdown in the Reports menu.
### Report features
The report allows you to track:
* Impact of education and training on code issue remediation
* Impact of education and training on code issue prevention
* Recommendations for further training opportunities
* Coverage rates of users trained in identified training opportunities.
* Filtering: custom time periods, users, organizations, lesson title, CWE, issue severity.
### Learning impact snapshot
The first section of the report focuses on the impact education is having on your security program, focusing on code issue resolution and code issue prevention.
The "Learning Impact on Issue Resolution" chart measures the relationship between lesson completion and the resolution of detected code security issues. Resolved issues are counted when a related lesson was completed before the issue was fixed within the selected period. Lesson completions are counted when a related issue was fixed after the lesson was completed within the selected period. Use the filters to drill into specific lessons or CWE categories.
The "Learning Impact on Issue Prevention" chart measures the relationship between lesson completion and the prevention code security issues. Introduced issues are counted when a related lesson was completed within the selected period. Issues introduced on the day a Project was imported are not counted. Use the filters to drill into specific lessons or CWE categories.
### Top 10 CWEs - open issues / issues introduced in the period
This section of the report shows recommendations for training for your top open code issues and most frequently introduced issues, by volume. Note that issues are included only when Snyk Learn has a related lesson for the CWE category.
You can see coverage for all users within the Organization scope of the report filters. This shows you how many people have ever completed a related Snyk Learn lesson on the topic.
{% hint style="info" %}
The recommendations in this section allow you to focus on the most impactful training opportunities. Use the filters to further customize the recommendations based on issue severity or for specific Organizations.
{% endhint %}
## Snyk Generated Pull Requests
{% hint style="info" %}
**Feature availability**
Snyk Generated Pull Requests report is available only for Enterprise plan customers, for all SCM integrations. For more information, see [Plans and pricing](https://snyk.io/plans/).
{% endhint %}
### Access the report
The Generated Pull Requests report can be accessed at both the Group and Organization levels from the **Change Report** dropdown in the Reports menu.
Snyk generated pull requests report
This report type provides an overview of how [Fix](https://docs.snyk.io/scan-with-snyk/pull-requests/snyk-pull-or-merge-requests/create-automatic-prs-for-new-fixes-fix-prs), [Backlog](https://docs.snyk.io/scan-with-snyk/pull-requests/snyk-pull-or-merge-requests/create-automatic-prs-for-backlog-issues-and-known-vulnerabilities-backlog-prs), and [Upgrade PRs](https://docs.snyk.io/scan-with-snyk/pull-requests/snyk-pull-or-merge-requests/upgrade-dependencies-with-automatic-prs-upgrade-prs) are used and highlights the efficiency of PR merges.
The analytics report covers the following:
* Overview of PRs status by type and the PR merge ratio.
* Visibility of issues.
* Breakdown by repository for PR status.
The report summary enables you to check the total number of Snyk PRs created, the total pull requests merged, and the mean time to merge for those pull requests.
{% hint style="warning" %}
This report type does not include PR checks.
{% endhint %}
### Report features
Use the date filter in the upper right corner of the report to display data based on a specific interval.
Add various filters to narrow down results to specific configurations. The filter options are Organization, SCM, Project, and Repository.
#### Snyk Generated Pull Requests usage
Pull Request usage graph and table
Pull Request usage is visualized in a **Pull requests by type** graph and a **Pull requests by status** table, displaying the same data in different formats. These distinguish the number of PRs into Fix, Backlog, and Dependency upgrade categories, segmented by Open, Merged, and Closed status types. Merge rate is presented as a percentage for each row.
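The merge-rate percentage shown per row can be illustrated with the following sketch. It assumes merge rate is merged PRs over all PRs in the row (open + merged + closed); this is an inferred definition, not Snyk's actual implementation:

```python
def merge_rate(open_prs: int, merged: int, closed: int) -> float:
    """Merge rate as a percentage of all Snyk-generated PRs in a row.

    Assumption: total = open + merged + closed; rounded to one decimal.
    """
    total = open_prs + merged + closed
    return 0.0 if total == 0 else round(100 * merged / total, 1)
```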
#### Open vs Fixed issues
Open vs Fixed issues graph and table
The Open vs Fixed issues in Snyk PRs graph and table displays the number of open and fixed issues based on severity.
#### Snyk Generated Pull Requests by repository
Projects/Orgs/Repository table for PRs of different status
The **Projects/Org/Repository** table displays the number of Total, Open, Merged, and Closed PRs for each Organization and repository relationship. Merge rate is presented as a percentage for each row.
Select a repository name to open a modal containing additional metrics for that specific repository.
Repository breakdown by PR type and PR status
The repository breakdown details the number of PRs segmented by PR type and PR status. Merge rate is presented as a percentage for each row. It also lists the Projects within that repository, with the number of issues categorized by severity.
## Asset Dashboard
The Asset Dashboard provides a comprehensive overview of your application and security controls. It displays essential data such as the status and trends of open issues, control coverage, and repository metadata.
The Asset Dashboard is a central hub for managing and reviewing assets, making tracking inventory size easier over time and understanding the interaction between different asset types.
While Snyk Inventory enables the discovery and management of your assets that should be secured, the Snyk Asset Dashboard allows you to go beyond the details and better understand the main building blocks of your inventory.\
\
The Asset Dashboard brings all the asset data that is available in your inventory and helps to answer various questions, such as:
* Does my AppSec program meet the coverage requirements for business-critical assets and strategic applications?
* Are the assets being classified properly according to their criticality?
* Do you know which repositories belong to which application or code owners? Are newly introduced repositories being updated with that data?
* What are the main programming languages and package managers that are used in repositories that have been worked on recently?
### Filters
The filters are located at the top left of the page, with the following filtering options: **Asset Class**, **Asset type**, and **Add filter**. The filter selection applies to all available data widgets.
Here are the available filters:
| Filter | Description |
| -------------------- | --------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- |
| Asset Class | The business criticality of an asset (A - most critical to D - least critical). |
| Asset type | The type of an asset (Container image, Package, Repository). Most data widgets already present certain asset types by default. |
| \*Application | The list of the applications for which you have configured the application context catalog in Snyk Essentials. |
| \*Catalog name | The name of your application context catalog. |
| \*Category | The category of a repository asset. For example, `service` or `library`. |
| Discovered | The period when the asset was discovered. |
| Last Seen | The period when the asset was last imported from the integration. |
| \*Lifecycle | The lifecycle state of the application context catalog component. For example `production`, `experimental`, `deprecated`. |
| \*Owner | The team that owns the repository for which the application context catalog was configured. |
| Repository Freshness | The last commit date in the repository: <br>**Active**: Had commits in the last 3 months. <br>**Inactive**: The last commits were made in the last 3 - 6 months. <br>**Dormant**: No commits in the last 6 months. <br>**N/A**: There are no commits detected by Snyk Essentials. |
| Source | The integration that imported the asset. |
| Tags | The asset tags. For more details about tagging assets using a policy, see the [Tagging policy](https://docs.snyk.io/manage-risk/policies/assets-policies/use-cases-for-policies/tagging-policy) page. |
| \*Title | The name of the component for which the application context catalog was configured. |
**\***All filters marked with `*` are visible only if you configured the [application context](https://docs.snyk.io/developer-tools/scm-integrations/application-context-for-scm-integrations) catalog for your SCM integrations.
### Repository coverage widget
The repository coverage widget provides an overview of the percentage of scanned repositories compared to the total number of available repositories, using integrated Snyk or third-party security products.
Hover over any column to see how the coverage percentage is calculated.
Repository Coverage
### Asset class breakdown
The asset class breakdown widget surfaces the distribution of repositories and container images by [asset class](https://docs.snyk.io/manage-assets/assets-inventory-components#class). Reviewing this widget allows you to determine the percentage of business-critical assets in your inventory and drill down to see the actual assets.
{% hint style="info" %}
**Tips**
* Having the context of the asset class is crucial for prioritizing assets. It is recommended to categorize your inventory by implementing [classification policies](https://docs.snyk.io/manage-risk/policies/assets-policies/use-cases-for-policies/classification-policy) to proactively classify existing and newly introduced assets.
* Using the filters enables narrowing down the asset class distribution within specific applications or code owners, as well as focusing on active repositories or a set of assets based on the asset tags.
{% endhint %}
Asset Class Breakdown
### Top 10 technologies breakdown
The top 10 technologies widget identifies the leading programming languages and frameworks used in repositories. Using the available filters enables you to determine the most commonly used technologies in active or business-critical repositories. Moreover, you can investigate specific applications or code owners.
{% hint style="info" %}
**Tips**
* The technology data is available in the [asset tags](https://docs.snyk.io/manage-assets/assets-inventory-components#tags).
* Click a presented technology to open the inventory page in a new browser tab. This will allow you to review the related repositories in detail.
{% endhint %}
### Top 10 package managers breakdown
The top 10 package managers widget allows you to identify the leading package managers in your inventory. The quantities represent assets of the package type. A [package asset](https://docs.snyk.io/manage-assets/assets-inventory-layouts#packages) is defined as a software package or library that is managed by a package management system.
### Repository freshness
The repository freshness widget displays the distribution of repositories according to the last commit date:
* **Active**: Had commits in the last 3 months.
* **Inactive**: The last commits were made in the last 3 - 6 months.
* **Dormant**: No commits in the last 6 months.
* **N/A**: Commits data is unavailable.
You can use this widget to surface the quantity of repositories that are more or less maintained in various contexts, such as specific applications.
{% hint style="info" %}
**Tips**
You can use the asset class filter to identify business-critical assets that are not being maintained. Click a specific slice to open the inventory page in a new browser tab where you can browse and learn more about those assets.
{% endhint %}
Repository freshness
### Application context availability
The application context availability widget allows you to discover gaps in the context of assets. The available columns include:
* **Application Context** - displays the analyzed context attribute.
* **Unique Values** - shows how many unique instances exist for an attribute. For example, you can check how many unique applications or code owners are available for any of the listed attributes.
* **Availability in Repos** - indicates the completeness of a certain attribute across the repositories.
{% hint style="info" %}
**Tips**
* Before reviewing this widget, ensure that the results are cleaned up by filtering out the "dummy" attribute values, such as "unknown", "-", and so on.\
You can clean up the values by selecting only the relevant values.
* Filtering by asset class allows you to identify business-critical repositories without a known code owner or associated application.
* Filtering by the "active" value of the repository freshness filter allows you to discover context gaps in repositories that are actively being developed.
* Reviewing the unique values allows you to spot gaps in context. For example, you may realize that the number of unique code owners does not match the number of teams.
{% endhint %}
Application Context Availability
### Asset source breakdown
The asset source breakdown widget visualizes the quantities of detected assets from various sources. A source can be a platform where the asset is being managed directly (such as an SCM, container registry, and so on) or a platform that enriches the assets (such as security products and ASTs).
{% hint style="info" %}
**Tips**
* The widget displays the net quantities of detected assets for each source. If an asset is detected in more than one source, it will be counted once for each detected source.
* When asset inventory quantities seem incomplete or exceed expectations, this widget will help you discover which integrations should be examined and potentially configured differently.
{% endhint %}
Asset source breakdown
## Risk exposure report
This report gives you a single, consolidated view of your security risks. It allows you to quickly understand your risk exposure, track your progress in reducing it, and pinpoint high-risk areas.
The Risk Exposure Report helps AppSec teams make quicker, more informed decisions. Rather than reviewing multiple reports, it provides a clear overview of the security landscape, allowing you to:
* Make faster decisions by quickly identifying your biggest security challenges and where to focus your attention.
* Prioritize effectively by using data to guide your mitigation efforts toward the areas that contribute the most risk.
* Show progress by tracking the impact of your team on reducing risk over time with easy-to-understand visualizations.
### Severity source

Choose your preferred severity source to automatically update the selected severity throughout the report:
* **Snyk**: uses Snyk's proprietary CVSS calculations and other factors, including the relative importance assigned by the Linux distribution.
* **NVD CVSS**: leverages severity scores from the National Vulnerability Database (NVD).
* **Non-SCA severities**: for non-SCA issues (for example, Code and IaC), Snyk calculates High, Medium, and Low severity levels for specific code vulnerabilities and uses the Common Configuration Scoring System (CCSS) for IaC severity determinations.
The report includes two main sections to provide a comprehensive view of your risk landscape:
### **Risk exposure Trends**
This section provides a visual overview of your issues over time. You can view these trends by:
* **Severity**: See the distribution of issues across different severity levels.
* **Introduction Category**: Understand how issues are being introduced into your Projects.
* **Asset Class**: Group issues by the type of asset they affect.
### **Risk exposure Breakdown**
This detailed table breaks down issues and impacted assets. You can dynamically group the data to fit your needs by selecting from the following dimensions: group, organization, project, introduction category, and asset class.
The table is sorted by default to surface the total number of critical and high-severity issues, helping you focus on the most urgent risks first.
You can also export data to PDF or CSV and drill down into issues for more detail.
## Saved Views
The Saved Views feature enables collaboration based on shared, consistent, and customizable reports. This feature is available at Organization and Group level, in the **Reports** menu. It allows you to customize and save filter settings for your reports, which you can then reuse.
To make it easier to share the view outside of the Snyk platform, the URL of a saved view remains the same after it's created, regardless of any changes you make to it.
### Prerequisites
To create, edit, and remove a saved view, you must have **Edit reports** permission. Saved views are not private. After being created, Saved Views are visible to all users with **View reports** permission. Only Organization and Group Admins can assign these permissions. For more information, see [User role management](https://docs.snyk.io/snyk-platform-administration/user-roles/user-role-management).
To assign report permissions:
1. In the Snyk Web UI, navigate to your Group and Organization.
2. At the Group level, navigate to **Members** > **Manage Roles** > **Group Admin** and enable the following permissions:
* **View reports:** to view Snyk Reports and to view the saved views that were created by others
* **Edit reports:** to create saved views.
3. At the Organization level, enable the **View Organization reports** and **Edit Organization reports** permissions.
### Create a view
To create a new view:
1. In the Snyk Web UI, navigate to your Group or Organization.
2. Navigate to the **Reports** menu and select a report from the **Change Report** dropdown.
3. Select the **Standard view** filter and click **Create new view**.
Create new view button in the Standard view filter
4. Fill in the name of the view and click **Create view**.
{% hint style="info" %}
The name of a saved view can contain a maximum of 60 characters and must be unique from other saved views for the same report.
{% endhint %}
### Update a view
To update a saved view:
1. In the Snyk Web UI, navigate to your Group or Organization.
2. Navigate to the **Reports** menu and select the report that contains the saved view you want to update.
3. From the **Standard view** filter, select and load the view you want to update.
4. Make any necessary changes to the report view.
5. Save the changes by clicking **Save** next to the Saved Views dropdown. This overwrites the existing view.
### Rename, delete, or copy the URL of a view
If you hover over the name of a saved view and click the three dots that appear, the following options are available:
* **Copy URL**: to copy the URL of the saved view
* **Set as Group default view:** to set a view as default for your Group. You can then remove it as the default by clicking **Remove as Group default view**.
* **Rename:** to rename the saved view
* **Delete**: to delete the saved view.
Options available for saved views
### Example
Snyk offers built-in reports that you can customize based on associated report filters. A wide range of filters are available, some with multiple values. In such cases, you can save the state of the report using a saved view.
For example, your **Issues Detail** report shows a large number of issues and thus is difficult to manage.
An Issues Detail report
You can add a **Computed Fixability** filter to your report to show only issues that are computed as **Fixable.**
Finally, you can add an **Exploit Maturity** filter to show only issues with a specific risk score.
Filtered Issues Detail report
You can then save this filtered view by clicking **Save**, adding a name to your view, and then clicking **Create view**.
Create view window
You can then share the report with your development team by copying and sending the view URL.
## PR Check Report
{% hint style="info" %}
**Release status**
The PR Check report is in Early Access and available only with Enterprise plans.
{% endhint %}
This report combines adoption, performance, and reliability into a single view of PR scanning health, and it provides visibility into the adoption and performance of PR scanning across your different repositories.
You can use this report to determine where Snyk PR checks are implemented, where adoption could be increased, and which types of failures most frequently impact developer workflows. Together, these insights guide teams towards better PR scan coverage, more stable runs, and an improved developer experience across your whole Organization.
Filters and CSV downloads allow users to focus on specific groups or configurations for deeper analysis.
### PR check performance and status
You can visualize PR check performance through trend charts and summary tiles that display the successful, failed, and errored checks.
The charts break down performance across time and PR Check status. A secondary view distinguishes between Snyk Code and Snyk Open Source, allowing you to understand which product area contributes most to success or error patterns.
Use this report as an overview of platform reliability and scanning performance.
PR check performance and status report
### Error PR checks by error message
This view allows you to see the most common causes of PR check errors. This helps teams identify recurring issues and prioritize fixes that improve developer experience. The report highlights technical limitations or setup problems for quick resolution.
List of error PR checks displayed by error message
### PR scanning adoption
This view highlights PR scanning adoption across teams. You can view adoption by Group, Organization, or repository by using the dropdown controls.
Each view displays the following key indicators:
* SCM integrations with PR scanning
* Repositories using Snyk Code
* Repositories using Snyk Open Source
The layout displays where PR scanning is active and where further enablement could extend coverage across repositories. The long-term goal is to achieve full coverage, ensuring that 100% of repositories and integrations are included in PR scanning.
PR scanning adoption view by Group
### PR scanning performance
This view displays how PR activity across different Groups, Organizations, or repositories translates into scan coverage and failure rates.
You can go into detail by using the dropdown menu. The table summarizes total PRs and those with failed checks, and highlights where scanning activity is concentrated and where security risks can appear more frequently.
It is common that teams or repositories with higher PR volumes generate more checks. This view helps to surface patterns in failure distribution.
PR scanning performance view by Group
---
# Source: https://docs.snyk.io/snyk-api/using-specific-snyk-apis/webhooks-apis/guides-to-webhooks/how-to-use-snyk-webhooks-to-connect-snyk-to-slack-with-aws-lambda/aws-lambda-setup-set-up-the-trigger/with-api-gateway/aws-api-gateway-add-the-post-method-to-connect-snyk-to-slack.md
# AWS API Gateway: add the POST method to connect Snyk to Slack
The payload Slack receives will contain a message, so create a POST method that receives the message, verifies it is a valid message, and then sends it on to the AWS Lambda function.
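For context, the Lambda function this method forwards to might look like the following sketch. The `project` and `newIssues` field names follow Snyk's webhook payload for project snapshot events, but treat them, and the omitted Slack forwarding step, as assumptions to verify against the events you actually receive:

```python
import json

def slack_message(payload: dict) -> dict:
    """Build a minimal Slack incoming-webhook message from a Snyk
    webhook payload. Field names (`project`, `newIssues`) are
    assumptions based on Snyk's project snapshot events."""
    project = payload.get("project", {}).get("name", "unknown project")
    new_issues = payload.get("newIssues", [])
    return {"text": f"Snyk reported {len(new_issues)} new issue(s) in {project}"}

def handler(event, context):
    """Lambda entry point. `event` has the shape produced by the
    mapping template configured on this page: method, body, headers."""
    message = slack_message(event.get("body", {}))
    # POST `message` as JSON to your Slack incoming-webhook URL here,
    # for example with urllib.request; omitted in this sketch.
    return {"statusCode": 200, "body": json.dumps(message)}
```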
Follow these steps to add the POST Method:
1. Navigate to the AWS API Gateway you have created.
2. Click **Resources**.
3. To create the method, navigate to **Actions** -> **Create Method** -> **Post**.
4. Configure the AWS API Gateway to work with the Lambda function you created by adding the Gateway in the adjacent Lambda function box:\
Choose the **Lambda Function Integration type**.\
Select **Default Timeout**.
AWS Lambda function box
5. In **Resources**, click the new **POST** method.
6. Click **Integration Request** (top right on the AWS Gateway POST method execution screen).
AWS Gateway POST method execution
7. Scroll to the bottom and add a **Mapping Template** with the **application/json** Content-Type. Add the following code to the template:
```
{
  "method": "$context.httpMethod",
  "body" : $input.json('$'),
  "headers": {
    #foreach($param in $input.params().header.keySet())
    "$param": "$util.escapeJavaScript($input.params().header.get($param))"
    #if($foreach.hasNext),#end #end
  }
}
```
8. Verify the resulting display reflects these entries.
AWS API Gateway POST request Mapping Template with code
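Applied to an incoming request, the mapping template above produces an event shaped roughly like the following (a trimmed, hypothetical Snyk webhook request; real payloads and header sets are larger):

```json
{
  "method": "POST",
  "body": { "project": { "name": "demo-app" }, "newIssues": [] },
  "headers": {
    "x-hub-signature": "sha256=...",
    "Content-Type": "application/json"
  }
}
```

The Lambda function reads `event.headers['x-hub-signature']` and `event.body` from this structure.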
---
# Source: https://docs.snyk.io/snyk-api/using-specific-snyk-apis/webhooks-apis/guides-to-webhooks/how-to-use-snyk-webhooks-to-connect-snyk-to-slack-with-aws-lambda/aws-lambda-setup-set-up-the-trigger/with-api-gateway/aws-api-gateway-deploy-the-post-method.md
# AWS API Gateway: deploy the POST method
Deploy the API with the configured POST method so the AWS Lambda function can start receiving the information.
Follow these steps to deploy the POST method:
1. Go to the **Resources** tab.
2. Click **POST**.
3. On the **Actions** tab, click **Deploy API**.
AWS API Gateway POST method Resources, Action tab with Deploy API selected
4. Select the **Deployment stage** to which you want to deploy the new API, in this case, the **default** stage.
AWS Gateway Deploy API to default stage
5. Navigate back to your Lambda function, and in the Lambda trigger configuration, verify that you see a new API endpoint.
6. Copy the API endpoint from the API Gateway boxes (obscured) for use in setting up the Snyk webhook.
AWS Lambda function trigger configuration showing new endpoint
7. Now that the API endpoint is saved, set up the Snyk Webhook.
---
# Source: https://docs.snyk.io/developer-tools/snyk-cli/scan-and-maintain-projects-using-the-cli/snyk-cli-for-iac/test-your-iac-files/aws-cdk-files.md
# AWS CDK files
With Snyk Infrastructure as Code, you can test your configuration files with the CLI. You can scan the [Amazon Web Services Cloud Development Kit (AWS CDK)](https://aws.amazon.com/cdk/) with the Snyk CLI by generating a CloudFormation file using the CDK CLI.
Follow these steps to scan a CDK application:
**Navigate** to the directory that contains the application stack for which you want to generate the CloudFormation.
**Generate** the CloudFormation file.
```
cdk synth
```
This is displayed in your terminal as YAML output, and a JSON template is created in the `cdk.out` directory.
**Scan** the JSON output using the following Snyk IaC CLI command, replacing `cdk.out/*.json` with the path to the generated template for the stack you want to scan:
```
snyk iac test cdk.out/*.json
```
---
# Source: https://docs.snyk.io/integrations/event-forwarding/aws-cloudtrail-lake.md
# AWS CloudTrail Lake
{% hint style="info" %}
**Feature availability**\
The AWS CloudTrail Lake integration is available only with Snyk Enterprise plans. For more information, see [plans and pricing](https://snyk.io/plans/).
{% endhint %}
The AWS CloudTrail Lake integration allows you to forward [Snyk audit logs](https://docs.snyk.io/snyk-platform-administration/user-management-with-the-api/retrieve-audit-logs-of-user-initiated-activity-by-api-for-an-org-or-group) to AWS CloudTrail Lake, which lets you run SQL-based queries on your logs and retain them for up to seven (7) years.
This integration can be configured to forward audit logs for a single Snyk Organization, or for a Snyk Group and all of its child Organizations. In either case, there are two steps required to set up the integration:
1. Add a Snyk integration in AWS CloudTrail Lake.
2. Configure the integration in Snyk.
{% hint style="info" %}
This integration sends logs beginning when you enable it. Logs generated before enabling the integration are not sent but may be available from the API endpoint [Search Organization audit logs](https://docs.snyk.io/snyk-api/reference/audit-logs#orgs-org_id-audit_logs-search).
{% endhint %}
## Group-level versus Organization-level audit logs
Audit logs are captured when Snyk users perform actions on the Snyk platform, such as making changes to settings, adding other users, or accessing protected APIs. When you are setting up this integration, it is important to understand how audit logs are captured, based on how a customer's Snyk account is set up:
* For customers using Snyk with a single Snyk Organization (or with multiple disconnected Organizations), all audit logs are captured within the scope of the single Organization.
* For customers who have a Snyk Group with child Organizations, actions such as adding new Organizations to the group or adding users to the group are audited at the Group level, and are not typically associated with an Organization.
This integration supports both use cases:
1. Integrate CloudTrail Lake with a single Snyk Organization
1. All audit logs associated directly with that Organization will be sent to CloudTrail Lake.
2. If the Organization has a parent Group, actions taken on that Group are not sent to CloudTrail Lake.
3. If the Organization has members who are also members of other Organizations and Groups, actions taken by those members will only be sent to CloudTrail Lake if they are directly associated with the Organization.
2. Integrate CloudTrail Lake with a Snyk Group and all of its child Organizations
1. All audit logs associated with the Group or any of its child Organizations will be sent to CloudTrail Lake.
2. When new Organizations are added to the Group, audit logs for those Organizations will be sent automatically to CloudTrail Lake.
## Add a Snyk integration in AWS CloudTrail Lake
To get started setting up a CloudTrail Lake integration, whether for a group or a single Organization, follow the setup [instructions](https://docs.aws.amazon.com/awscloudtrail/latest/userguide/query-event-data-store-integration.html) in the AWS CloudTrail Lake documentation, choosing Snyk as the integration type.
Choose Snyk, Add integration for Snyk
During the setup, you must supply an **External ID** for the integration. The value for this ID depends on whether you are setting up the integration for a single Snyk Organization, or for a Snyk Group that includes all child Organizations.
### External ID for a Single Snyk Organization
If you are creating this integration for a single Snyk Organization, you will use your Snyk **Organization ID** as the **External ID.** You can find your Organization ID under Snyk **Organization Settings**.
Organization ID on Snyk Organization settings page
Copy the value in the **Organization ID** field to the **External ID** field in the AWS CloudTrail Lake integration setup and continue following the instructions in the AWS CloudTrail Lake documentation.
### External ID for a Snyk group
If you are setting up this integration for a Snyk Group, which automatically includes all child Organizations, you will use your Snyk **Group ID** as the **External ID**. You can find your Group ID by clicking the name of your Snyk Group in the Snyk dashboard and then navigating to the **Settings** page.
Group settings page
Copy the value in the **Group ID** field to the **External ID** field in the AWS CloudTrail Lake integration setup and continue following the instructions in the AWS CloudTrail Lake documentation.
### CloudTrail Lake Channel ARN
When you are finished creating the Snyk integration in AWS CloudTrail Lake, copy the **Channel ARN** that is displayed on the integration page. You will need this for the next step.
## Configure the integration in Snyk (single Organization)
After creating the integration in **AWS CloudTrail Lake**, you can complete the setup in the Snyk dashboard.
To do this, go to [the Snyk integrations page](https://app.snyk.io/integrations), navigate to **Cloud events**, and click the **AWS CloudTrail Lake** tile:
CloudTrail Lake tile on Snyk integrations page
Enter a **name** for this integration, your **AWS Account ID**, and the **Channel ARN** from the previous step.
Integration name, AWS Account ID, Channel ARN
After this step is complete, Snyk immediately begins forwarding audit logs to AWS CloudTrail Lake. You can click **View settings** or go to the [AWS CloudTrail Lake settings](https://app.snyk.io/manage/integrations/aws-cloudtrail) page to view and manage the integration.
## Snyk App authorization
If this is the first time you have set up an AWS CloudTrail Lake integration for your Organization, you will be prompted to complete the Snyk App authorization flow.
Snyk App authorization
After completing the authorization flow you will be redirected to the settings page for the integration.
## Configure the integration in Snyk (Snyk Group and child Organizations)
{% hint style="info" %}
Configuring and managing this integration for a group is only supported by the Snyk REST API.
{% endhint %}
To complete the setup of the integration for a Snyk Group, you must use the API endpoint [Create a group registration](https://apidocs.snyk.io/experimental?version=2023-05-29%7Eexperimental#post-/groups/-group_id-/cloud_events/group_registrations).
You can use this sample request as a starting point:
```sh
curl --location --request POST 'https://api.snyk.io/rest/groups/<GROUP_ID>/cloud_events/group_registrations?version=2023-01-25~experimental' \
--header 'Content-Type: application/vnd.api+json' \
--header 'Authorization: token <API_TOKEN>' \
--data-raw '{
  "data": {
    "type": "group_registration",
    "attributes": {
      "type": "aws-cloudtrail",
      "name": "<NAME>",
      "config": {
        "account_id": "<ACCOUNT_ID>",
        "channel_arn": "<CHANNEL_ARN>"
      }
    }
  }
}'
```
Be sure to replace each indicated placeholder value in the example appropriately:
* `<GROUP_ID>` - the Snyk **Group ID** you used in the previous step as the **External ID**
* `<API_TOKEN>` - your personal Snyk API token, which you can find in the Snyk dashboard under **Account settings**
* `<NAME>` - a name for this integration
* `<ACCOUNT_ID>` - the AWS account ID for the AWS account from the previous step
* `<CHANNEL_ARN>` - the **Channel ARN** generated in the previous step when you added the Snyk integration in the CloudTrail Lake console
If the call is successful, the API response will include an `id` for the registration. You can use this ID to manage and delete the integration later.
## Remove an AWS CloudTrail Lake integration (single Organization)
Navigate to the [AWS CloudTrail Lake settings](https://app.snyk.io/manage/integrations/aws-cloudtrail) page and select the name of the integration you want to remove.
Select AWS CloudTrail Lake integration to remove
Select **Remove integration** and confirm that you want to remove the integration.
Remove integration button
This action removes Snyk’s configuration for this integration, which will prevent any further audit logs from being sent to AWS CloudTrail Lake. This does not remove the Snyk integration in AWS CloudTrail Lake. To do this, navigate to AWS CloudTrail Lake and delete the Snyk integration from the **Integration** list.
## Remove an AWS CloudTrail Lake integration (Snyk Group and child Organizations)
Configuring and managing this integration for a Group is supported only by the Snyk API. You can remove an integration using the endpoint [Delete a group registration](https://apidocs.snyk.io/experimental?version=2023-05-29%7Eexperimental#delete-/groups/-group_id-/cloud_events/group_registrations/-group_registration_id-). For tips on how to use the API, see the section [about configuring a group-level integration](#configure-the-integration-in-snyk-snyk-group-and-child-organizations).
{% hint style="info" %}
To delete a Group-level integration, retrieve the integration ID. This is the same ID that is returned by the API when you create a Group-level integration. You can also get all currently configured Group integrations with the endpoint [List all group registrations](https://apidocs.snyk.io/experimental?version=2023-05-29%7Eexperimental#get-/groups/-group_id-/cloud_events/group_registrations).
{% endhint %}
## Query Snyk audit logs in AWS CloudTrail Lake
When your Snyk audit logs are being forwarded to AWS CloudTrail Lake, you can access them with the AWS CloudTrail Lake Query functionality. You can use this example query to get started:
```sql
select
  eventtime,
  eventdata.useridentity,
  eventdata.eventname,
  eventdata.additionaleventdata
from <EVENT_DATA_STORE_ID>
order by eventtime desc
limit 10
```
Replace `<EVENT_DATA_STORE_ID>` with the ID of the event data store that is associated with the Snyk integration in AWS CloudTrail Lake.
## Understanding the log data
There are three (3) key fields to note when using the Snyk audit log data in AWS CloudTrail Lake.
`eventdata.useridentity`
The event `useridentity` contains a field called `principalid`, which represents the Snyk user ID for the user associated with the audit event. You can use Snyk API v1 endpoint [Get organization level audit logs](https://docs.snyk.io/snyk-api/reference/audit-logs#orgs-org_id-audit_logs-search) to match the Snyk user ID with a user in your Organization.
`eventdata.eventname`
This represents the type of audit event, for example, `api.access` or `org.cloud_config.settings.edit`, and can be used to group or filter events.
`eventdata.additionaleventdata`
This field contains a raw JSON payload with more detailed information about the audit event. The content of the payload depends on the type of the event. For example, an API access event will include the accessed URL, while a settings change event will include before and after values for the changed setting.
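As a starting point for grouping events, this sketch counts recent events by type; replace `<EVENT_DATA_STORE_ID>` with your event data store ID:

```sql
select
  eventdata.eventname,
  count(*) as event_count
from <EVENT_DATA_STORE_ID>
group by eventdata.eventname
order by event_count desc
```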
---
# Source: https://docs.snyk.io/developer-tools/snyk-ci-cd-integrations/aws-codepipeline-integration-by-adding-a-snyk-scan-stage.md
# AWS CodePipeline integration with CodeBuild
This guide outlines the steps for setting up a [Snyk Open Source](https://snyk.io/product/open-source-security-management/) security scanning workflow for AWS CodePipeline using [AWS CodeBuild](https://aws.amazon.com/codebuild/). By using the Snyk CLI and the built-in capabilities of CodeBuild, you can build a configurable solution for running Snyk software composition analysis (SCA) scans in your CI/CD pipeline.
## Prerequisites
* An active AWS account with CodeBuild and CodePipeline services enabled
* A Snyk account with the Snyk CLI configured
* Familiarity with CodeBuild project configuration and environment variables
* Understanding of your existing CodePipeline stages and their desired interaction with Snyk
## Setup steps
### Set up CodeBuild
* Create a new CodeBuild project in your AWS account.
* Choose a compatible [base image](https://docs.aws.amazon.com/codebuild/latest/userguide/build-env-ref-available.html) for your project based on your programming language and dependencies.
* Review how to [authenticate the Snyk CLI with your account](https://docs.snyk.io/developer-tools/snyk-cli/authenticate-to-use-the-cli) and consider using an environment variable to store sensitive information such as your Snyk CLI token.
{% hint style="info" %}
The default Service role in AWS CodeBuild includes an IAM permission that allows the CodeBuild project to pull any secret from the AWS Secrets Manager that starts with `/CodeBuild/` in the name. Refer to the [Troubleshooting](#troubleshooting) section at the end of this guide for more information.
{% endhint %}
* Configure build commands:
* [Install the Snyk CLI](https://docs.snyk.io/developer-tools/snyk-cli/install-or-update-the-snyk-cli) using the commands appropriate for your operating system.
* Define a build command that executes the Snyk scan using the CLI.
* Define a build command that sends a snapshot of the project to Snyk for continuous monitoring (optional).
* Review the example `buildspec.yaml` that follows for more details:
```yaml
# buildspec.yaml
version: 0.2

phases:
  build:
    commands:
      # install the latest Snyk CLI from GitHub Releases
      - latest_version=$(curl -Is "https://github.com/snyk/cli/releases/latest" | grep "^location" | sed 's#.*tag/##g' | tr -d "\r")
      - snyk_cli_dl_linux="https://github.com/snyk/cli/releases/download/${latest_version}/snyk-linux"
      - curl -Lo /usr/local/bin/snyk $snyk_cli_dl_linux
      - chmod +x /usr/local/bin/snyk
      # authenticate the Snyk CLI
      - snyk auth $SNYK_TOKEN
      # perform a Snyk SCA scan; continue if vulnerabilities are found
      - snyk test || true
      # upload a snapshot of the project to Snyk for continuous monitoring
      - snyk monitor
```
### Set up CodePipeline
For some [Open Source](https://snyk.io/product/open-source-security-management/) Projects, you must build the Project before testing it with the Snyk CLI. Review the Snyk [documentation](https://docs.snyk.io/developer-tools/snyk-cli/scan-and-maintain-projects-using-the-cli/snyk-cli-for-open-source/open-source-projects-that-must-be-built-before-testing-with-the-snyk-cli) to determine whether Snyk requires your Project to be built before running an Open Source scan; then follow the instructions in the corresponding section below:
#### Snyk requires a built Project
* Edit your existing CodePipeline or create a new one.
* Create a new stage to build your Project, or edit the existing build stage.
* Add the commands from the example `buildspec.yaml` to your build stage so that the Snyk scan occurs immediately after the Project is built.
{% hint style="info" %}
The Snyk Open Source scan must be in the same CodeBuild action as the build process to ensure that Snyk has access to the full build workspace.
{% endhint %}
#### Snyk does not require a built Project
* Edit your existing CodePipeline pipeline or create a new one.
* Add a new build stage after your source code acquisition stage.
* Select your newly created CodeBuild Project for this stage.
* Select SourceArtifact under Input artifacts to allow Snyk to scan the source code directly.
### Result handling
Using the [Snyk CLI](https://docs.snyk.io/developer-tools/snyk-cli/commands) in CodeBuild gives you full access to its functionality and options. To get started with basic result handling, you can follow these tips:
* The `snyk test` command produces a non-zero exit code when vulnerabilities are found. Consider adding `|| true` to the end of the command to circumvent this behavior.
* The [snyk-to-html](https://github.com/snyk/snyk-to-html) tool can be used to produce an HTML report of scan results by running a command similar to `snyk test --json | snyk-to-html -o snyk-results.html` . For more information, see the [`snyk-to-html`](https://docs.snyk.io/developer-tools/snyk-cli/scan-and-maintain-projects-using-the-cli/cli-tools/snyk-to-html) documentation.
* Consider the following CLI options for common usage patterns:
* [`--org=<ORG_ID>`](https://docs.snyk.io/snyk-cli/commands/test#org-less-than-org_id-greater-than) - Specify the `<ORG_ID>` to run Snyk commands tied to a specific Snyk Organization.
* [`--severity-threshold=<low|medium|high|critical>`](https://docs.snyk.io/snyk-cli/commands/test#severity-threshold-less-than-low-or-medium-or-high-or-critical-greater-than) - Report vulnerabilities only at the specified level or higher.
* [`--all-projects`](https://docs.snyk.io/snyk-cli/commands/test#all-projects) - Auto-detect all Projects in the working directory.
* [`--project-name=<PROJECT_NAME>`](https://docs.snyk.io/snyk-cli/commands/monitor#project-name-less-than-project_name-greater-than) - Specify a custom Snyk Project name for the `snyk monitor` command.
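As a sketch, the scan commands in the example `buildspec.yaml` above could be adjusted to combine these options; the Project name used here is illustrative:

```yaml
# report only high and critical issues, across all detected Projects
- snyk test --all-projects --severity-threshold=high || true
# monitor under a custom Snyk Project name
- snyk monitor --project-name=my-service
```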
### Test and validate
* Trigger a manual build in your CodePipeline to test the new CodeBuild integration.
* Verify that the Snyk scan executes successfully and outputs results as expected.
* Ensure your subsequent pipeline stages handle the scan output appropriately.
### Deployment
* When testing is complete, consider deploying the updated CodePipeline.
* Monitor your pipeline for successful Snyk scan execution and address any integration issues.
### Next Steps
Refer to the [Snyk CLI](https://docs.snyk.io/developer-tools/snyk-cli) documentation to incorporate additional security scans into your CI/CD pipeline.
## Conclusion
By following these steps and considerations, you can successfully integrate Snyk security scanning into your AWS CodePipeline pipelines.
## Troubleshooting
Question: How do I store the Snyk token in AWS Secrets Manager and use it in AWS CodeBuild?
If you use an AWS Secrets Manager environment variable, store your token in AWS Secrets Manager as plain text, and ensure that your AWS CodeBuild service role has the `secretsmanager:GetSecretValue` permission in IAM. Set the `value` of the environment variable in AWS CodeBuild to the `Secret name` in AWS Secrets Manager.
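For example, a `buildspec.yaml` can map the secret into an environment variable using the `env`/`secrets-manager` mapping; the secret name `/CodeBuild/snyk-token` here is hypothetical:

```yaml
env:
  secrets-manager:
    # CodeBuild retrieves the entire plain-text secret value into SNYK_TOKEN
    SNYK_TOKEN: /CodeBuild/snyk-token
```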
---
# Source: https://docs.snyk.io/scan-with-snyk/snyk-iac/cloud-platform-integrations/aws-integration/aws-integration-api.md
# AWS Integration: API
Before you can onboard an AWS account to Snyk via the API, you need access to the AWS account and associated credentials with permissions to create a read-only Identity and Access Management (IAM) role. See the prerequisites on the [AWS integration](https://docs.snyk.io/scan-with-snyk/snyk-iac/cloud-platform-integrations/aws-integration) page.
Follow these steps to onboard an AWS account via the API:
1. [Download an infrastructure as code (IaC) template](https://docs.snyk.io/scan-with-snyk/snyk-iac/cloud-platform-integrations/aws-integration/aws-integration-api/step-1-download-iam-role-iac-template-api) to give Snyk permissions to scan your account.
2. [Create an AWS IAM role](https://docs.snyk.io/scan-with-snyk/snyk-iac/cloud-platform-integrations/aws-integration/aws-integration-api/step-2-create-the-snyk-iam-role-api): using the template you downloaded.
3. [Create and scan a Cloud Environment.](https://docs.snyk.io/scan-with-snyk/snyk-iac/cloud-platform-integrations/aws-integration/aws-integration-api/step-3-create-and-scan-a-cloud-environment-api)
When you have completed the steps, you will be able to do the following:
* View the cloud configuration issues Snyk finds. See [Manage cloud issues](https://docs.snyk.io/scan-with-snyk/snyk-iac/getting-started-with-cloud-scans/manage-cloud-issues).
* Prioritize your vulnerabilities with cloud context.
---
# Source: https://docs.snyk.io/scan-with-snyk/snyk-iac/cloud-platform-integrations/aws-integration/aws-integration-web-ui.md
# AWS Integration: Web UI
Before you can onboard an AWS account via the Web UI, you need access to the AWS account and associated credentials with permissions to create a read-only Identity and Access Management (IAM) role. See the prerequisites on the [AWS integration](https://docs.snyk.io/scan-with-snyk/snyk-iac/cloud-platform-integrations/aws-integration) page.
Follow these steps to onboard an AWS account via the Web UI:
1. [Download an infrastructure as code (IaC) template](https://docs.snyk.io/scan-with-snyk/snyk-iac/cloud-platform-integrations/aws-integration/aws-integration-web-ui/step-1-download-iam-role-iac-template-web-ui) to give Snyk permissions to scan your account.
2. [Create an AWS IAM role](https://docs.snyk.io/scan-with-snyk/snyk-iac/cloud-platform-integrations/aws-integration/aws-integration-web-ui/step-2-create-the-snyk-iam-role) using the template you downloaded.
3. [Create and scan a Cloud Environment](https://docs.snyk.io/scan-with-snyk/snyk-iac/cloud-platform-integrations/aws-integration/aws-integration-web-ui/step-3-create-and-scan-a-cloud-environment-web-ui).
When you have completed the steps, you will be able to do the following:
* View the cloud configuration issues Snyk finds. See [Manage cloud issues](https://docs.snyk.io/scan-with-snyk/snyk-iac/getting-started-with-cloud-scans/manage-cloud-issues).
* Prioritize your vulnerabilities with cloud context.
---
# Source: https://docs.snyk.io/scan-with-snyk/snyk-iac/cloud-platform-integrations/aws-integration.md
# AWS integration
Snyk integrates with your [Amazon Web Services (AWS)](https://aws.amazon.com/) account to find issues in your cloud configurations and to generate cloud context to help you prioritize your vulnerabilities.
You can onboard an AWS account to Snyk using the following methods:
* [Snyk Web UI](https://docs.snyk.io/scan-with-snyk/snyk-iac/cloud-platform-integrations/aws-integration/aws-integration-web-ui)
* [Snyk API](https://docs.snyk.io/scan-with-snyk/snyk-iac/cloud-platform-integrations/aws-integration/aws-integration-api)
To set up an AWS integration, you need the following:
* A Snyk Enterprise [plan](https://snyk.io/plans/)
* A new Snyk Organization, with appropriate feature flags assigned by your Snyk contact
* A Snyk Group Administrator or Organization Administrator [role](https://docs.snyk.io/snyk-platform-administration/user-roles/pre-defined-roles)
* Access to an [AWS](https://aws.amazon.com/) account and associated credentials with permissions to create a read-only IAM role
* Access to the [Terraform CLI](https://www.terraform.io/downloads), [AWS CLI](https://aws.amazon.com/cli/), or [AWS Management Console](https://console.aws.amazon.com) to create the IAM role for Snyk via Terraform or AWS CloudFormation
* If you are using Terraform or the AWS CLI, ensure you configure it with your AWS credentials. See the instructions for [Terraform](https://registry.terraform.io/providers/hashicorp/aws/latest/docs#authentication-and-configuration) or the [AWS CLI](https://docs.aws.amazon.com/cli/latest/userguide/cli-configure-quickstart.html)
* API only: An Organization-level [service account](https://docs.snyk.io/implementation-and-setup/enterprise-setup/service-accounts) with an Org Admin role to use the Snyk API
* API only: An API client such as [curl](https://curl.se/), [HTTPie](https://httpie.io/), or [Postman](https://www.postman.com/)
---
# Source: https://docs.snyk.io/snyk-api/using-specific-snyk-apis/webhooks-apis/guides-to-webhooks/how-to-use-snyk-webhooks-to-connect-snyk-to-slack-with-aws-lambda/aws-lambda-setup-add-security-through-an-environment-variable.md
# AWS Lambda setup: add security through an environment variable
For security reasons, the script that you created uses an environment variable, `hmac_verification`, with a shared secret to validate that the payload is coming from Snyk and has not been tampered with.
Follow these steps to add security through an environment variable:
1. Go to the **Configuration** tab in the AWS Lambda function.
2. Click **Environment variables**.
3. Add the new environment variable.
4. Use `hmac_verification` as your key.
5. Enter your preferred secret as the key value. Store this secret somewhere safe; you will need it again later.
AWS Lambda function add environment variable interface
6. For added security, consider using Secrets Manager or Parameter Store for the shared secret.
---
# Source: https://docs.snyk.io/snyk-api/using-specific-snyk-apis/webhooks-apis/guides-to-webhooks/how-to-use-snyk-webhooks-to-connect-snyk-to-slack-with-aws-lambda/aws-lambda-setup-create-lambda-function-to-connect-snyk-to-slack.md
# AWS Lambda setup: create Lambda function to connect Snyk to Slack
AWS Lambda functions are used to connect Snyk to Slack because these functions are an inexpensive and efficient way of running code triggered by events, for example when there is a new Snyk vulnerability.
**Note:** If publishing the Lambda function through API Gateway, both must be configured in the same region. You can check this on the top right of the AWS Console.
Start by creating a zip file containing the code for the function and the necessary dependencies.
1. Save these two code blocks as `package.json` and `index.js`:
1. `package.json` (modify to fit any other dependencies needed for your code; the required dependencies are **axios** and **crypto**)
```json
{
  "name": "snyk-webhook-handler",
  "version": "1.0.0",
  "description": "Snyk to Slack Webhook Integration",
  "main": "index.js",
  "scripts": {
    "test": "echo \"Error: no test specified\" && exit 1"
  },
  "author": "",
  "license": "ISC",
  "dependencies": {
    "axios": "^1.1.3",
    "crypto": "^1.0.1"
  }
}
```
2. index.js
```javascript
const crypto = require('crypto')
const axios = require('axios')
let slackWebhookUrl = '' // adjust
//customised messaging to Slack with issue information, modify as needed
async function messageSlack(
message,
snykProjectUrl,
snykProjectName,
snykIssuePackage,
snykIssueUrl,
snykIssueId,
severity,
snykIssuePriority
) {
//strings modified to avoid Axios/Slack errors
snykProjectUrl = snykProjectUrl.replace(/['"]+/g, '')
snykProjectName = snykProjectName.replace(/['"]+/g, '')
snykIssueUrl = snykIssueUrl.replace(/['"]+/g, '')
snykIssueId = snykIssueId.replace(/['"]+/g, '')
snykIssuePackage = snykIssuePackage.replace(/['"]+/g, '')
severity = severity.replace(/['"]+/g, '')
//construct message
let payload = {
text: `${message}`,
blocks: [
{
type: 'header',
text: {
type: 'plain_text',
text: `${message}`,
},
},
{
type: 'section',
text: {
type: 'mrkdwn',
text: `Snyk has found a new vulnerability in the project:\n*<${snykProjectUrl}|${snykProjectName}>*`,
},
},
{
type: 'divider',
},
{
type: 'section',
fields: [
{
type: 'mrkdwn',
text: `*Package name:*\n${snykIssuePackage}`,
},
{
type: 'mrkdwn',
text: `*Vulnerability:*\n<${snykIssueUrl}|${snykIssueId}>`,
},
{
type: 'mrkdwn',
text: `*Severity:*\n${severity}`,
},
{
type: 'mrkdwn',
text: `*Priority Score:*\n${snykIssuePriority}`,
},
],
},
{
type: 'actions',
elements: [
{
type: 'button',
text: {
type: 'plain_text',
text: 'View in Snyk',
},
style: 'primary',
url: snykProjectUrl,
value: 'browseUrl',
},
],
},
],
}
//send message
const res = await axios.post(slackWebhookUrl, payload)
const data = res.data
}
exports.handler = async (event) => {
// Securing integrity of the payload; this can be moved to another Lambda function and called separately for authentication
let response
const {hmac_verification, severity_threshold} = process.env
const hmac = crypto.createHmac('sha256', hmac_verification)
const buffer = JSON.stringify(event.body)
hmac.update(buffer, 'utf8')
const stored_signature = `sha256=${hmac.digest('hex')}`
let sent_signature = event.headers['x-hub-signature']
if (stored_signature !== sent_signature) {
console.log('Integrity of request compromised, aborting')
response = {
statusCode: 403,
body: JSON.stringify('Bad request'),
}
return response
}
// If integrity is ok, verify that the webhook actually contains the project object, iterate and filter
// (event.body is the parsed request body; buffer above is its JSON string form used for the HMAC check)
const body = event.body
if (buffer.indexOf('project') !== -1 && buffer.indexOf('newIssues') !== -1) {
// Iterate through new issues
var len = body['newIssues'] ? body['newIssues'].length : 0
for (let x = 0; x < len; x++) {
// Get Severity
let severity = JSON.stringify(body['newIssues'][x]['issueData']['severity'])
// Filter
if (severity.includes('high') || severity.includes('critical')) {
let snykProjectName = JSON.stringify(body['project'].name)
let snykProjectUrl = JSON.stringify(body['project'].browseUrl)
let snykIssueUrl = JSON.stringify(body['newIssues'][x]['issueData'].url)
let snykIssueId = JSON.stringify(body['newIssues'][x].id)
let snykIssuePackage = JSON.stringify(body['newIssues'][x].pkgName)
let snykIssuePriority = JSON.stringify(body['newIssues'][x]['priority'].score)
let message = 'New Snyk Vulnerability'
// Send the result to Slack
await messageSlack(
message,
snykProjectUrl,
snykProjectName,
snykIssuePackage,
snykIssueUrl,
snykIssueId,
severity,
snykIssuePriority
)
}
}
}
//do nothing, or modify for any preferable action
else {
console.log('Valid webhook, but project missing or empty')
}
//respond to Snyk
response = {
statusCode: 200,
body: JSON.stringify('Success'),
}
return response
}
```
2. Use the following commands in your terminal:\
\- `cd /path/to/snyk/folder` (to navigate inside the folder where you stored the two files)\
\- `npm install` (to install the dependencies listed in `package.json`)\
\- `zip -r ../snyk.zip .` (run from inside the folder, to zip its contents into a file that can be uploaded to AWS Lambda; `index.js` must be at the root of the archive)
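For reference, this is a trimmed, hypothetical example of the webhook body shape the handler reads; real Snyk payloads contain many more fields:

```json
{
  "project": {
    "name": "demo-app",
    "browseUrl": "https://app.snyk.io/org/example/project/0000"
  },
  "newIssues": [
    {
      "id": "SNYK-JS-EXAMPLE-1234",
      "pkgName": "example-package",
      "priority": { "score": 750 },
      "issueData": {
        "severity": "high",
        "url": "https://security.snyk.io/vuln/SNYK-JS-EXAMPLE-1234"
      }
    }
  ]
}
```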
To create an AWS Lambda function, follow these steps:
1. Go to the AWS Console.
2. Navigate to Lambda.
3. Click **Create function**.
4. Choose **Node.js 16.x** for the **Runtime**.
5. Choose **X86\_64** for the **Architecture**.
6. If publishing the Lambda function through API Gateway, add or create a role with the default policy **AmazonAPIGatewayInvokeFullAccess** to interact with the AWS API Gateway.
7. Verify that the AWS Console screen shows these entries:
AWS Console with entries to create a Lambda function
8. Click **Create Function** and, when the function is ready, click **Upload from .zip file** and upload the `snyk.zip` file you created.
9. Verify that the code you entered is displayed in the Code Source.
AWS code source display
10. In the code, modify the `slackWebhookUrl` to match your Slack webhook URL.
11. For more information on the script you have pasted, go to [Configure the AWS Lambda Script](https://docs.snyk.io/snyk-api/using-specific-snyk-apis/webhooks-apis/guides-to-webhooks/how-to-use-snyk-webhooks-to-connect-snyk-to-slack-with-aws-lambda/configure-the-aws-lambda-script).
---
# Source: https://docs.snyk.io/snyk-api/using-specific-snyk-apis/webhooks-apis/guides-to-webhooks/how-to-use-snyk-webhooks-to-connect-snyk-to-slack-with-aws-lambda/aws-lambda-setup-set-up-the-trigger/with-api-gateway/aws-lambda-setup-set-up-the-trigger.md
# Source: https://docs.snyk.io/snyk-api/using-specific-snyk-apis/webhooks-apis/guides-to-webhooks/how-to-use-snyk-webhooks-to-connect-snyk-to-slack-with-aws-lambda/aws-lambda-setup-set-up-the-trigger.md
# AWS Lambda setup: expose a public URL
For Snyk to be able to send webhooks to the Lambda function, you need a public URL exposing the function. You have two options: [API Gateway](https://docs.snyk.io/snyk-api/using-specific-snyk-apis/webhooks-apis/guides-to-webhooks/how-to-use-snyk-webhooks-to-connect-snyk-to-slack-with-aws-lambda/aws-lambda-setup-set-up-the-trigger/with-api-gateway) or a [Lambda function URL](https://docs.snyk.io/snyk-api/using-specific-snyk-apis/webhooks-apis/guides-to-webhooks/how-to-use-snyk-webhooks-to-connect-snyk-to-slack-with-aws-lambda/aws-lambda-setup-set-up-the-trigger/with-a-lambda-function-url). Choose the option that best suits your environment and needs.
---
# Source: https://docs.snyk.io/scan-with-snyk/snyk-iac/detect-manually-created-resources/supported-resources/aws-resources.md
# AWS resources
Snyk IaC unmanaged resource scanning supports the following resources for AWS:
| **Resource** |
| ----------------------------------------- |
| aws\_s3\_bucket |
| aws\_s3\_bucket\_analytics\_configuration |
| aws\_s3\_bucket\_inventory |
| aws\_s3\_bucket\_metric |
| aws\_s3\_bucket\_notification |
| aws\_s3\_bucket\_policy |
| aws\_s3\_bucket\_public\_access\_block |
| aws\_instance |
| aws\_key\_pair |
| aws\_ami |
| aws\_ebs\_snapshot |
| aws\_ebs\_volume |
| aws\_eip |
| aws\_eip\_association |
| aws\_lambda\_function |
| aws\_lambda\_event\_source\_mapping |
| aws\_db\_instance |
| aws\_db\_subnet\_group |
| aws\_route53\_record |
| aws\_route53\_zone |
| aws\_route53\_health\_check |
| aws\_iam\_access\_key |
| aws\_iam\_policy |
| aws\_iam\_policy\_attachment |
| aws\_iam\_role |
| aws\_iam\_role\_policy |
| aws\_iam\_role\_policy\_attachment |
| aws\_iam\_user |
| aws\_iam\_user\_policy |
| aws\_iam\_user\_policy\_attachment |
| aws\_iam\_group\_policy |
| aws\_default\_subnet |
| aws\_subnet |
| aws\_default\_vpc |
| aws\_vpc |
| aws\_default\_security\_group |
| aws\_security\_group |
| aws\_security\_group\_rule |
| aws\_route\_table |
| aws\_default\_route\_table |
| aws\_route |
| aws\_route\_table\_association |
| aws\_nat\_gateway |
| aws\_internet\_gateway |
| aws\_sqs\_queue |
| aws\_sqs\_queue\_policy |
| aws\_sns\_topic |
| aws\_sns\_topic\_policy |
| aws\_sns\_topic\_subscription |
| aws\_dynamodb\_table |
| aws\_cloudfront\_distribution |
| aws\_ecr\_repository |
| aws\_kms\_key |
| aws\_kms\_alias |
| aws\_rds\_cluster |
| aws\_cloudformation\_stack |
| aws\_api\_gateway\_rest\_api |
| aws\_api\_gateway\_account |
| aws\_api\_gateway\_api\_key |
| aws\_api\_gateway\_authorizer |
| aws\_api\_gateway\_stage |
| aws\_api\_gateway\_resource |
| aws\_api\_gateway\_domain\_name |
| aws\_api\_gateway\_vpc\_link |
| aws\_api\_gateway\_request\_validator |
| aws\_api\_gateway\_rest\_api\_policy |
| aws\_api\_gateway\_base\_path\_mapping |
| aws\_api\_gateway\_method |
| aws\_api\_gateway\_model |
| aws\_api\_gateway\_method\_response |
| aws\_api\_gateway\_gateway\_response |
| aws\_api\_gateway\_method\_settings |
| aws\_api\_gateway\_integration |
| aws\_api\_gateway\_integration\_response |
| aws\_apigatewayv2\_api |
| aws\_apigatewayv2\_authorizer |
| aws\_apigatewayv2\_deployment |
| aws\_apigatewayv2\_route |
| aws\_apigatewayv2\_vpc\_link |
| aws\_apigatewayv2\_integration |
| aws\_apigatewayv2\_integration\_response |
| aws\_apigatewayv2\_model |
| aws\_apigatewayv2\_stage |
| aws\_apigatewayv2\_route\_response |
| aws\_apigatewayv2\_api\_mapping |
| aws\_apigatewayv2\_domain\_name |
| aws\_appautoscaling\_target |
| aws\_default\_network\_acl |
| aws\_network\_acl |
| aws\_network\_acl\_rule |
| aws\_rds\_cluster\_instance |
| aws\_appautoscaling\_policy |
| aws\_appautoscaling\_scheduled\_action |
| aws\_launch\_template |
| aws\_launch\_configuration |
| aws\_ebs\_encryption\_by\_default |
| aws\_lb |
| aws\_alb |
| aws\_elb |
| aws\_elasticache\_cluster |
---
# Source: https://docs.snyk.io/integrations/event-forwarding/aws-security-hub.md
# AWS Security Hub
The [AWS Security Hub](https://aws.amazon.com/security-hub/) integration sends Snyk issues to Security Hub, allowing you to centralize your security reporting, build custom alerting, and trigger automation. After it is configured, the integration automatically uploads Snyk issues to Security Hub as security findings. When issues are updated or new remediations become available, the corresponding Security Hub findings are automatically updated.
There are two steps required to configure the integration:
1. Configure Security Hub to accept findings from Snyk in the Security Hub console.
2. Configure Snyk to send findings to Security Hub in the Snyk dashboard.
## Configuring Security Hub to accept Snyk findings
Go to the Security Hub console for the AWS account and region where you want to receive Snyk findings. Navigate to the **Integrations** section and search for **Snyk**. On the **Snyk** integration tile, click **Accept findings** and follow the prompts.
Search for Snyk integration
After this step is done, you can continue setting up the integration in the Snyk dashboard.
## Configuring Snyk to send findings to Security Hub
Navigate to [the Snyk integrations page](https://app.snyk.io/integrations) and search for **Security Hub** or navigate to the **Cloud events** section. Click on the **Security Hub** tile to start creating a new integration.
Create new Security Hub integration
Enter a **name** for the integration, along with the **AWS Account ID** and **AWS Region** where you enabled the Snyk partner integration in step one.
Enter integration details
After this step is complete, Snyk begins sending new issue events to Security Hub.
{% hint style="info" %}
Issues on existing Projects will not be sent to Security Hub unless those issues are updated. To backfill issues from existing projects, you can delete and re-import them.
{% endhint %}
## Snyk App authorization
If this is the first time you have set up an AWS Security Hub integration for your Organization, you will be prompted to complete the Snyk App authorization flow.
Snyk App authorization
After completing the authorization flow you will be redirected to the settings page for the integration.
## Managing and deleting a Security Hub integration
Navigate to the [Security Hub integration settings page](https://app.snyk.io/manage/integrations/aws-securityhub) in the Snyk dashboard and click on the name of the integration you want to manage.
Select integration to manage
Clicking on the name of an integration opens the settings page for that integration, where you can view and update configuration information for the integration.
To delete an integration, scroll to the bottom of the integration settings page and click the **Remove integration** button.
Remove integration
After the integration is deleted, Snyk will no longer send issues to Security Hub. Issues that have already been sent to Security Hub will remain there until they are archived.
---
# Source: https://docs.snyk.io/implementation-and-setup/enterprise-setup/single-sign-on-sso-for-authentication-to-snyk/configure-self-serve-single-sign-on-sso/azure-ad-enterprise-application-setup.md
# Entra ID Enterprise application setup
This example shows setting up an Entra ID (formerly Azure AD) Enterprise Application and connecting this to Snyk to facilitate SSO. To configure your Azure Enterprise Application to use SSO with Snyk, first obtain an entity ID and a reply URL (Assertion Consumer Service URL) from Snyk.
1. From the dropdown at the top left select **GROUP OVERVIEW** and then the cog icon (top right corner) to get to your group settings.
Select group overview
2. Click on **SSO** and copy the values under **Entity ID** and **ACS URL** or leave the browser tab open for easy access.
Group Settings: SSO
3. Navigate to Azure and open Entra ID.
Entra ID Default Directory
4. Click **Add** then **Enterprise application**.
Add Enterprise application
5. Choose **Create your own application**.
Create your own application
6. Name the application appropriately, for example, **Snyk-SSO**, making sure that **Integrate any other application you don't find in the gallery (Non-gallery)** is selected and then click **Create**.
Application name and integration
7. For the new app, select **Set up single sign on** and **Get started**.
Set up single sign-on, Get started
8. Select **SAML** as the SSO method.
Select SAML
9. Click **Edit** under **Basic SAML configuration**.
Edit basic SAML configuration
10. Add the Identity (Entity ID) and reply URL (Assertion Consumer Service URL) you obtained from Snyk and click **Save**; then close the edit window.
Entity ID and Assertion Consumer Service URL
11. Scroll to find the login URL needed to finish the configuration in Snyk. Copy it and paste it into the SSO settings in the Snyk portal.
Login URL
Sign in URL in Snyk portal
12. Return to Entra ID and click **Download** next to **Certificate (Base64)**.
Download SAML Certificate (Base 64)
13. Open the downloaded certificate in your preferred text editor, copy the text and paste it into the Snyk **X509 signing certificate** field, and add the relevant domains that are supported by this SSO connection.\
Finally, verify if an **IdP-initiated workflow** should be enabled and then click **Create Auth0 connection** if you are creating a completely new connection or **Save changes** if you are editing an existing connection.
Enter certificate and domains supported, set connection
14. Decide how new users should be treated when signing in and choose the option you would like to use: **Group member**, **Org collaborator**, or **Org admin**. Finally, modify the **profile attributes** if your settings in Azure deviate from the default; then click **Save changes** and verify you can log in, either with the direct URL at the top of step 3 or by going to the [generic SSO login](https://app.snyk.io/login/sso).\
\
If you are not receiving profile values as expected, you may need to add email, name, and username as additional claims within Azure SSO settings and then map those accordingly in the Snyk SSO **Profile attributes** section.
Azure claim settings
Profile attributes section
If you wish to add signature verification of the incoming Snyk request:
1. Download the **Signing certificate** at step 1 of the Snyk SSO settings.
2. Use the following openssl command to convert it to `.cer` format: `openssl x509 -outform DER -in snyk.pem -out snyk.cer`
3. At the bottom of the **SAML Certificates** settings of your SSO app in Entra ID, click **Edit** next to **Verification certificates**.
4. Check **Require verification certificates** and upload the certificate from the output of the above openssl command and click **Save**.
---
# Source: https://docs.snyk.io/developer-tools/scm-integrations/group-level-integrations/azure-devops-for-snyk-essentials.md
# Azure DevOps for Snyk Essentials
The Integrations page shows all active integrations, including data that is automatically synced from your existing Snyk Organizations, and provides access to the Integration Hub.
## Pulled entities
#### Prerequisites
To configure a Group-level integration, you must be a Group Admin or have a custom role that includes the `Edit Snyk Essentials` permissions under the [Group-level permissions](https://docs.snyk.io/snyk-platform-administration/user-roles/pre-defined-roles#group-level-permissions).
Repository - the pulled entity retrieved by Snyk Essentials.
## Integrate using Snyk Essentials
1. Profile name (`mandatory`): Input your integration profile name.
2. Organizations (`mandatory`): Input the names of all the relevant Azure DevOps organizations.
3. Access Token (`mandatory`): Create and add your Azure DevOps PAT by following the instructions in the [Generate a Personal access token from your Azure DevOps settings](#generate-a-personal-access-token-from-your-azure-devops-settings) section.
* API URL (`mandatory`): The API URL, for example, [`https://dev.azure.com/`](https://dev.azure.com/). You can use a custom URL that is publicly accessible.
4. Broker Token (`mandatory`): Create and add your Broker token if you use Snyk Broker.
* Generate your Broker token by following the instructions from the [Obtain your Broker token for Snyk Broker](https://docs.snyk.io/implementation-and-setup/enterprise-setup/snyk-broker/classic-broker/prepare-snyk-broker-for-deployment/obtain-the-tokens-required-to-set-up-snyk-broker) page.
* Copy and paste the Broker token on the integration setup menu from the Integration Hub.
5. Add Backstage Catalog (`optional`): If you want to add your Backstage catalog, follow the instructions from the [Backstage file for SCM Integrations](https://docs.snyk.io/developer-tools/scm-integrations/application-context-for-scm-integrations) page.
{% hint style="warning" %}
The following PAT token permissions requirements are for Snyk Essentials integrations. For SCM integration, see the [Azure Repositories (TFS) permissions requirements](https://docs.snyk.io/developer-tools/user-permissions-and-access-scopes#azure-repositories-tfs-permission-requirements) on the Snyk SCM integrations pages.
{% endhint %}
## Generate a Personal access token from your Azure DevOps settings
{% hint style="warning" %}
The user account that owns the PAT needs `Basic` access level on the Azure organization.
{% endhint %}
1. Open Azure DevOps and click the **Settings** menu for your profile.
2. Click **Personal access tokens** and then **New token**.
3. Select the following scopes:
* Permissions
* **Code** - read
* **Project and Team** - read
* **Analytics** - read
* **Member Entitlement Management** - read
* Organization - Select **All accessible organizations** or a specific organization.
4. Set the expiration to 12 months.
5. Copy the generated personal access token and share it through a secured vault.
## API version
You can use the [Azure DevOps REST API v6](https://learn.microsoft.com/en-us/rest/api/azure/devops/core/?view=azure-devops-rest-6.0) documentation to access information about the API.
---
# Source: https://docs.snyk.io/developer-tools/snyk-cli/scan-and-maintain-projects-using-the-cli/cli-tools/snyk-scm-contributors-count/scripts-for-scm-contributors-count/azure-devops.md
# Azure DevOps
* [Flow and Tech](https://docs.snyk.io/developer-tools/snyk-cli/scan-and-maintain-projects-using-the-cli/cli-tools/snyk-scm-contributors-count/scripts-for-scm-contributors-count/azure-devops/azure-flow-and-tech)
* [Examples](https://docs.snyk.io/developer-tools/snyk-cli/scan-and-maintain-projects-using-the-cli/cli-tools/snyk-scm-contributors-count/scripts-for-scm-contributors-count/azure-devops/azure-examples)
---
# Source: https://docs.snyk.io/developer-tools/snyk-cli/scan-and-maintain-projects-using-the-cli/cli-tools/snyk-scm-contributors-count/scripts-for-scm-contributors-count/azure-devops/azure-examples.md
# Azure - Examples
The following options are available for the `snyk-scm-contributors-count azure devops` command:
```
--version Show version number [boolean]
--help Show help [boolean]
--token Azure DevOps token [required]
--org Your Org name in Azure DevOps, for example, https://dev.azure.com/{OrgName} [required]
--projectKeys [Optional] Azure DevOps project key/name to count
contributors for
--repo [Optional] Specific repo to count only for
--exclusionFilePath [Optional] Exclusion list filepath
--json [Optional] JSON output, required when using the "consolidateResults" command
--skipSnykMonitoredRepos [Optional] Skip Snyk monitored repos and count contributors for all repos
--importConfDir [Optional] Generate an import file with the unmonitored repos: A path to a valid folder for the generated import files
--importFileRepoType [Optional] To be used with the importConfDir flag: Specify the type of repos to be added to the import file. Options: all/private/public. Default: all
```
## Before running the command
1. Export SNYK\_TOKEN (if you want to get the contributors only for repos that are already monitored by Snyk):
* Make sure that your token has Group level access or use a service account's token that has Group level access. To learn more about how to create a service account, refer to [Service accounts](https://docs.snyk.io/implementation-and-setup/enterprise-setup/service-accounts).
* Copy the token value.
* Export the token in your environment:
```
export SNYK_TOKEN=
```
2. Get your Azure DevOps Token and Org:
* Create a Token if one does not exist, using this [guide](https://docs.microsoft.com/en-us/azure/devops/organizations/accounts/use-personal-access-tokens-to-authenticate?view=azure-devops\&tabs=preview-page).
**Note**: Make sure your token has read access to the repos.
* Find your Org name in Azure listed on the left pane on the [Azure DevOps site](https://dev.azure.com).
## Running the command
Consider the following levels of usage and options:
### Usage levels
* To get commits for all projects and their repos under your Org in Azure, provide only the Azure token and Azure Org:
```
snyk-scm-contributors-count azure-devops --token AZURE-TOKEN --org AZURE-ORG
```
* To get commits for some projects and their repos under your Org in Azure, provide the Azure token, Azure Org, and the project key/s separated by a comma:
```
snyk-scm-contributors-count azure-devops --token AZURE-TOKEN --org AZURE-ORG --projectKeys Key1,Key2...
```
* To get commits for a specific repo under your Org in Azure, provide the Azure token, Azure Org, a project key, and a repo name:
```
snyk-scm-contributors-count azure-devops --token AZURE-TOKEN --org AZURE-ORG --projectKeys Key1 --repo Repo1
```
### Options
* To get all the commits from Azure regardless of the repos that are already monitored by Snyk, add the `--skipSnykMonitoredRepos` flag.
You might have repos in Azure that are not monitored in Snyk; use this flag to skip checking for Snyk monitored repos and go directly to Azure to fetch the commits.
```
snyk-scm-contributors-count azure-devops --token AZURE-TOKEN --org AZURE-ORG --skipSnykMonitoredRepos
```
* To exclude some contributors from being counted in the commits, add an exclusion file with the emails to ignore (separated by a new line) and apply the `--exclusionFilePath` with the path to that file:
```
snyk-scm-contributors-count azure-devops --token AZURE-TOKEN --org AZURE-ORG --projectKeys Key1,Key2 --exclusionFilePath PATH_TO_FILE
```
* To set the output to json format: add the `--json` flag:
```
snyk-scm-contributors-count azure-devops --token AZURE-TOKEN --org AZURE-ORG --projectKeys Key1 --repo Repo1 --json
```
* To create an import file for your unmonitored repos: add the `--importConfDir` flag with a valid (writable) path to a folder in which the import files will be stored, and add the `--importFileRepoType` flag (optional) with the repo types to add to the file (`all`/`private`/`public`, defaults to `all`). Note that these flags **cannot** be set with the `--repo` flag.
```
snyk-scm-contributors-count azure-devops --token AZURE-TOKEN --org AZURE-ORG --importConfDir ValidPathToWritableFolder --importFileRepoType private/public/all
```
For more details about these flags, refer to the [Creating and using the import page](https://docs.snyk.io/developer-tools/snyk-cli/scan-and-maintain-projects-using-the-cli/cli-tools/snyk-scm-contributors-count/creating-and-using-the-import-file).
* To run in debug mode for verbose output, prefix the command with `DEBUG=snyk*`:
```
DEBUG=snyk* snyk-scm-contributors-count azure-devops --token AZURE-TOKEN --org AZURE-ORG --projectKeys Key1 --repo Repo1 --exclusionFilePath PATH_TO_FILE --skipSnykMonitoredRepos --json
```
---
# Source: https://docs.snyk.io/developer-tools/snyk-cli/scan-and-maintain-projects-using-the-cli/cli-tools/snyk-scm-contributors-count/scripts-for-scm-contributors-count/azure-devops/azure-flow-and-tech.md
# Azure - Flow and Tech
## Flow
1. Fetch the monitored projects from Snyk (if the `skipSnykMonitoredRepos` flag was **not set** and the `SNYK_TOKEN` was exported).
2. Fetch `one`/`some`/`all` the projects that the credentials have access to from SCM and create a projects list.
3. Fetch `one`/`all` repos under the fetched/provided Projects.
4. Remove the repos that are not monitored by Snyk (if the `skipSnykMonitoredRepos` flag was **not set** and the `SNYK_TOKEN` was exported) and create a Repo list.
5. Create an import file for unmonitored repos to use for easily importing repos into your Snyk account (if the `importConfDir` flag was set).
6. Fetch the commits for the fetched/provided repo/s and create a Contributors list.
7. Count the commits for the repo/s by the contributors.
8. Remove the contributors that were specified in the exclusion file (if the `exclusionFilePath` flag was set and a valid path to a text file was provided).
9. Print the results.
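Steps 6 through 8 above amount to tallying commits per contributor and dropping excluded emails. The following is an illustrative sketch, not the tool's actual implementation; the `author.email` shape follows the Azure DevOps commits endpoint:

```javascript
// Illustrative sketch of steps 6-8: count commits per contributor email,
// skipping any emails listed in the exclusion file.
function countContributors(commits, excludedEmails = []) {
  const excluded = new Set(excludedEmails.map((email) => email.toLowerCase()))
  const counts = new Map()
  for (const commit of commits) {
    const email = commit.author.email.toLowerCase()
    if (excluded.has(email)) continue
    counts.set(email, (counts.get(email) || 0) + 1)
  }
  return counts
}
```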
## Azure API endpoints used
* To get the Projects from Azure: `{Org}/_apis/projects`
* To get the list of the repo/s that correlate with the fetched/provided project list: `{Project}/_apis/git/repositories`
* To get the commits for the fetched/provided repo/s list: `{Project}/_apis/git/repositories/{Repo}/commits?$searchCriteria.fromDate={ThreeMonthsDate}`
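The first endpoint above can be called with a personal access token (PAT) sent as the password half of HTTP Basic authentication. A hedged sketch (the helper names are illustrative; the global `fetch` requires Node.js 18+):

```javascript
// Illustrative sketch: list Azure DevOps project names with a PAT.
// Azure DevOps accepts a PAT as the Basic-auth password with an
// empty username.
function basicAuthHeader(pat) {
  return 'Basic ' + Buffer.from(':' + pat).toString('base64')
}

async function listProjects(org, pat) {
  const url = `https://dev.azure.com/${org}/_apis/projects?api-version=6.0`
  const res = await fetch(url, {
    headers: { Authorization: basicAuthHeader(pat) },
  })
  if (!res.ok) throw new Error(`Azure DevOps API error: ${res.status}`)
  const body = await res.json()
  return body.value.map((project) => project.name)
}
```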
---
# Source: https://docs.snyk.io/scan-with-snyk/snyk-iac/cloud-platform-integrations/azure-integration-for-cloud-configurations/azure-integration-api.md
# Azure Integration: API
To onboard an Azure subscription to Snyk via the API:
1. [Download an infrastructure as code (IaC) template or Bash script:](https://docs.snyk.io/scan-with-snyk/snyk-iac/cloud-platform-integrations/azure-integration-for-cloud-configurations/azure-integration-api/step-1-download-azure-app-registration-iac-template-or-script-api) to give Snyk permissions to scan your subscription.
2. [Create an Entra ID app registration:](https://docs.snyk.io/scan-with-snyk/snyk-iac/cloud-platform-integrations/azure-integration-for-cloud-configurations/azure-integration-api/step-2-create-the-entra-id-app-registration-api) using the template or script you downloaded.
3. [Create and scan a Cloud Environment.](https://docs.snyk.io/scan-with-snyk/snyk-iac/cloud-platform-integrations/azure-integration-for-cloud-configurations/azure-integration-api/step-3-create-and-scan-a-cloud-environment-for-azure-api)
When you have completed the steps, you will be able to do the following:
* View the cloud configuration issues Snyk finds. See [Manage cloud issues](https://docs.snyk.io/scan-with-snyk/snyk-iac/getting-started-with-cloud-scans/manage-cloud-issues).
* Prioritize your vulnerabilities with cloud context.
---
# Source: https://docs.snyk.io/scan-with-snyk/snyk-iac/cloud-platform-integrations/azure-integration-for-cloud-configurations.md
# Azure integration for cloud configurations
Snyk integrates with your [Microsoft Azure](https://azure.microsoft.com/en-us/) subscription to find issues in your cloud configurations and generate cloud context to help you prioritize your vulnerabilities.
You can onboard an Azure subscription to Snyk using the following methods:
* [Snyk Web UI](https://docs.snyk.io/scan-with-snyk/snyk-iac/cloud-platform-integrations/azure-integration-for-cloud-configurations/azure-integration-web-ui)
* [Snyk API](https://docs.snyk.io/scan-with-snyk/snyk-iac/cloud-platform-integrations/azure-integration-for-cloud-configurations/azure-integration-api)
## Prerequisites for Azure integration for cloud configurations
To add an Azure integration for cloud configurations, you need the following:
* A Snyk Business or Enterprise [plan](https://snyk.io/plans/)
* A new Snyk Organization with appropriate feature flags assigned by your Snyk contact
* A Snyk Group Administrator or Organization Administrator [role](https://docs.snyk.io/snyk-platform-administration/user-roles/pre-defined-roles)
* Access to a [Microsoft Azure](https://azure.microsoft.com/en-us/) subscription and associated credentials with permissions to create the following resources:
* [An Active Directory (AD) application registration](https://learn.microsoft.com/en-us/azure/active-directory/develop/app-objects-and-service-principals#application-registration)
* [A federated identity credential](https://learn.microsoft.com/en-us/azure/active-directory/develop/workload-identity-federation)\
If you are using Terraform to create this [resource](https://registry.terraform.io/providers/hashicorp/azuread/latest/docs/resources/application_federated_identity_credential#api-permissions), your user must have either the Application Administrator or Global Administrator directory role.
* [A service principal](https://learn.microsoft.com/en-us/azure/active-directory/develop/app-objects-and-service-principals#service-principal-object) with read-only permissions\
If you are using Terraform to create this [resource](https://registry.terraform.io/providers/hashicorp/azuread/latest/docs/resources/service_principal), your user must have either the Application Administrator or Global Administrator directory role.
* Access to the [Terraform CLI](https://www.terraform.io/downloads) or Azure CLI ([locally](https://learn.microsoft.com/en-us/cli/azure/) or via the [Cloud Shell](https://portal.azure.com/#home)) to create the above resources via Terraform or Bash script\
If you are using Terraform or the Azure CLI locally, ensure you configure it with your Azure credentials. See the instructions for [Terraform](https://registry.terraform.io/providers/hashicorp/azuread/latest/docs#authenticating-to-azure-active-directory) or the [Azure CLI](https://learn.microsoft.com/en-us/cli/azure/authenticate-azure-cli).
* API only: An Organization-level [service account](https://docs.snyk.io/implementation-and-setup/enterprise-setup/service-accounts) with an Org Admin role to use the Snyk API
* API only: An API client such as [curl](https://curl.se/), [HTTPie](https://httpie.io/), or [Postman](https://www.postman.com/)
* API only, optional: [jq](https://stedolan.github.io/jq/), to unescape JSON containing the Terraform template or Bash script
---
# Source: https://docs.snyk.io/scan-with-snyk/snyk-iac/cloud-platform-integrations/azure-integration-for-cloud-configurations/azure-integration-web-ui.md
# Azure Integration: Web UI
Follow these steps to onboard an Azure subscription to Snyk via the Web UI:
1. [Download an infrastructure as code (IaC) template or Bash script:](https://docs.snyk.io/scan-with-snyk/snyk-iac/cloud-platform-integrations/azure-integration-for-cloud-configurations/azure-integration-web-ui/step-1-download-azure-app-registration-iac-template-or-script-web-ui) to give Snyk permissions to scan your subscription.
2. [Create an Entra ID app registration:](https://docs.snyk.io/scan-with-snyk/snyk-iac/cloud-platform-integrations/azure-integration-for-cloud-configurations/azure-integration-web-ui/step-2-create-the-entra-id-app-registration) using the template or script you downloaded.
3. [Create and scan a Cloud Environment.](https://docs.snyk.io/scan-with-snyk/snyk-iac/cloud-platform-integrations/azure-integration-for-cloud-configurations/azure-integration-web-ui/step-3-create-and-scan-a-cloud-environment-for-azure-web-ui)
When you have completed the steps, you will be able to do the following:
* View the cloud configuration issues Snyk finds. See [Manage cloud issues](https://docs.snyk.io/scan-with-snyk/snyk-iac/getting-started-with-cloud-scans/manage-cloud-issues).
* Prioritize your vulnerabilities with cloud context.
---
# Source: https://docs.snyk.io/developer-tools/snyk-ci-cd-integrations/azure-pipelines-integration.md
# Azure Pipelines integration using the Snyk Security Scan task
Snyk enables security across the Microsoft Azure ecosystem, including Azure Pipelines, by automatically finding and fixing application and container vulnerabilities.
The Snyk Security Scan task is available for all languages supported by Snyk and Azure DevOps.
{% hint style="info" %}
The Snyk Security Scan task supports Snyk Open Source, Snyk Container, and Snyk Code. If you plan to include other products in your pipeline, use the [Snyk CLI](https://docs.snyk.io/developer-tools/snyk-cli).
{% endhint %}
Ready-to-use tasks for Azure Pipelines can be inserted quickly and directly from the Azure interface, enabling you to customize and automate your pipelines with no extra coding. Among the tasks included is the Snyk task.
You can include the Snyk task in your pipeline to test for security vulnerabilities and open-source license issues as part of your routine work. In this way, you can test and monitor your application dependencies and container images for security vulnerabilities. When the testing is done you can review and work with results directly from the Azure Pipelines output, as well as from the Snyk interface.
For setup and use details, see the following pages:
* [How the Snyk Security Scan task works](https://docs.snyk.io/developer-tools/snyk-ci-cd-integrations/azure-pipelines-integration/how-the-snyk-security-scan-task-works)
* [Install the Snyk extension for your Azure pipelines](https://docs.snyk.io/developer-tools/snyk-ci-cd-integrations/azure-pipelines-integration/install-the-snyk-extension-for-your-azure-pipelines)
* [Add the Snyk Security Task to your pipelines](https://docs.snyk.io/developer-tools/snyk-ci-cd-integrations/azure-pipelines-integration/add-the-snyk-security-task-to-your-pipelines)
* [Snyk Security Scan task parameters and values](https://docs.snyk.io/developer-tools/snyk-ci-cd-integrations/azure-pipelines-integration/snyk-security-scan-task-parameters-and-values)
* [Custom API endpoints](https://docs.snyk.io/developer-tools/snyk-ci-cd-integrations/azure-pipelines-integration/regional-api-endpoints)
* [Example of a Snyk task to test a node.js (npm)-based application](https://docs.snyk.io/developer-tools/snyk-ci-cd-integrations/azure-pipelines-integration/example-of-a-snyk-task-to-test-a-node.js-npm-based-application)
* [Simple example of a Snyk task to test an application](https://docs.snyk.io/developer-tools/snyk-ci-cd-integrations/azure-pipelines-integration/simple-example-of-a-snyk-task-to-test-an-application)
* [Example of a Snyk task for a container image pipeline](https://docs.snyk.io/developer-tools/snyk-ci-cd-integrations/azure-pipelines-integration/example-of-a-snyk-task-for-a-container-image-pipeline)
* [Simple example of a Snyk task to test a container image](https://docs.snyk.io/developer-tools/snyk-ci-cd-integrations/azure-pipelines-integration/simple-example-of-a-snyk-task-to-test-a-container-image)
* [Example of a Snyk test to test application code](https://docs.snyk.io/developer-tools/snyk-ci-cd-integrations/azure-pipelines-integration/example-of-a-snyk-task-to-test-application-code)
* [Simple example of a Snyk task to run code test](https://docs.snyk.io/developer-tools/snyk-ci-cd-integrations/azure-pipelines-integration/simple-example-of-a-snyk-task-to-run-a-code-test)
---
# Source: https://docs.snyk.io/implementation-and-setup/enterprise-setup/snyk-broker/classic-broker/install-and-configure-snyk-broker/azure-repos-prerequisites-and-steps-to-install-and-configure-broker/azure-repos-environment-variables-for-snyk-broker.md
# Azure Repos - environment variables for Snyk Broker
The following environment variables are required to configure the Broker Client for Azure Repos:
* `BROKER_TOKEN` - the Snyk Broker token, obtained from your Azure Repos integration settings view (app.snyk.io).
* `BROKER_SERVER_URL` - the URL of the Broker server for the region in which your data is hosted. For the commands and URLs to use, see [Broker URLs](https://docs.snyk.io/snyk-data-and-governance/regional-hosting-and-data-residency#broker-server-urls).
* `AZURE_REPOS_TOKEN` - an Azure Repos personal access token. Refer to this [Guide](https://docs.microsoft.com/en-us/azure/devops/organizations/accounts/use-personal-access-tokens-to-authenticate?view=azure-devops\&tabs=preview-page) for how to get or create the token. Required scopes: ensure Custom defined is selected and under Code select `Read & write`.
* `AZURE_REPOS_ORG` - the Organization name, which can be found on your Organization Overview page in Azure. For Azure Repos on-prem, the typical organization name is `tfs/Main`. If you have more than one Azure organization, you must deploy a Broker for each one.
* `AZURE_REPOS_HOST` - the hostname of your Azure Repos Server deployment, such as `your.azure-server.domain.com`.
* `PORT` - the local port at which the Broker client accepts connections. Default is 8000.
* `BROKER_CLIENT_URL` - the full URL of the Broker client as it will be accessible to your Azure Repos webhooks, such as `http://broker.url.example:8000`. This URL is required to access features such as PR Fixes or PR Checks.
* This must include `http://` and the port number.
* To configure the client with HTTPS, [additional settings are required](https://docs.snyk.io/implementation-and-setup/enterprise-setup/snyk-broker/https-for-broker-client-with-docker).
* `ACCEPT_IAC` - by default, some file types used by Infrastructure-as-Code (IaC) are not enabled. To grant the Broker access to IaC files in your repository, for example, Terraform, you can add an environment variable, `ACCEPT_IAC`, with any combination of `tf,yaml,yml,json,tpl`.
* `ACCEPT_CODE` - by default, Snyk Code will not load code snippets. To enable code snippets you can add an environment variable, `ACCEPT_CODE=true`.
* `ACCEPT_ESSENTIALS` - Enable Snyk Essentials to identify your application assets, monitor them, and prioritize the risks. You can enable it by adding an environment variable `ACCEPT_ESSENTIALS=true`.
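As a quick sketch, the variables above can be exported in a shell before starting the Broker client. All values here are placeholders, not working credentials:

```shell
# Placeholder values only - substitute your own deployment details.
export BROKER_TOKEN="your-snyk-broker-token"           # from the Azure Repos integration settings
export BROKER_SERVER_URL="https://broker.snyk.io"      # Broker server URL for your region
export AZURE_REPOS_TOKEN="your-azure-pat"              # PAT with Code Read & write scope
export AZURE_REPOS_ORG="your-azure-org"                # one Broker per Azure organization
export AZURE_REPOS_HOST="your.azure-server.domain.com"
export PORT=8000                                       # local port the Broker client listens on
# Must include http:// and the port number so Azure Repos webhooks can reach it.
export BROKER_CLIENT_URL="http://broker.url.example:${PORT}"
echo "$BROKER_CLIENT_URL"
```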
---
# Source: https://docs.snyk.io/implementation-and-setup/enterprise-setup/snyk-broker/classic-broker/install-and-configure-snyk-broker/azure-repos-prerequisites-and-steps-to-install-and-configure-broker/azure-repos-install-and-configure-and-configure-using-helm.md
# Azure Repos - install and configure using Helm
{% hint style="info" %}
**Feature availability**
The Snyk Azure Repos integration is available only for Azure DevOps/TFS 2020 or later.
{% endhint %}
Before installing, review the [prerequisites](https://docs.snyk.io/implementation-and-setup/enterprise-setup/snyk-broker/classic-broker/install-and-configure-snyk-broker/azure-repos-prerequisites-and-steps-to-install-and-configure-broker) and the general instructions for installation using [Helm](https://docs.snyk.io/implementation-and-setup/enterprise-setup/snyk-broker/classic-broker/install-and-configure-snyk-broker/install-and-configure-broker-using-helm).
To use this chart, you must first add the Snyk Broker Helm Chart by adding the repo:
`helm repo add snyk-broker https://snyk.github.io/snyk-broker-helm/`
Then, run the following commands to install the Broker and customize the environment variables. Refer to [Azure Repos - environment variables for Snyk Broker](https://docs.snyk.io/implementation-and-setup/enterprise-setup/snyk-broker/classic-broker/install-and-configure-snyk-broker/azure-repos-prerequisites-and-steps-to-install-and-configure-broker/azure-repos-environment-variables-for-snyk-broker) for definitions of the environment variables.
For the variable `azureReposHost` the value does not include `https://`. If you have more than one Azure organization, you must deploy a Broker for each one.
Snyk Essentials is set by default to `false`. Enable it by setting the flag to `true`.
{% hint style="info" %}
**Multi-tenant settings for regions**\
When installing, you must add a command in your script to set the `brokerServerUrl`. This is the URL of the Broker server for the region where your data is hosted. For the commands and URLs to use, see [Broker URLs](https://docs.snyk.io/snyk-data-and-governance/regional-hosting-and-data-residency#broker-server-urls).
{% endhint %}
```
helm install snyk-broker-chart snyk-broker/snyk-broker \
--set scmType=azure-repos \
--set brokerToken= \
--set brokerServerUrl= \
--set azureReposToken= \
--set azureReposOrg= \
--set azureReposHost= \
--set brokerClientUrl= \
--set enableEssentials=true \
-n snyk-broker --create-namespace
```
You can pass any environment variable of your choice in the Helm command. For details, see [Custom additional options for Broker Helm Chart](https://docs.snyk.io/implementation-and-setup/enterprise-setup/snyk-broker/classic-broker/install-and-configure-snyk-broker/advanced-configuration-for-helm-chart-installation/custom-additional-options-for-broker-helm-chart-installation). Follow the instructions for [Advanced configuration for Helm Chart installation](https://docs.snyk.io/implementation-and-setup/enterprise-setup/snyk-broker/classic-broker/install-and-configure-snyk-broker/advanced-configuration-for-helm-chart-installation) to make configuration changes as needed.
You can verify that the Broker is running by looking at the settings for your brokered integration in the Snyk Web UI to see a confirmation message that you are connected. You can start importing Projects once you are connected.
---
# Source: https://docs.snyk.io/implementation-and-setup/enterprise-setup/snyk-broker/classic-broker/install-and-configure-snyk-broker/azure-repos-prerequisites-and-steps-to-install-and-configure-broker/azure-repos-install-and-configure-using-docker.md
# Azure Repos - install and configure using Docker
{% hint style="info" %}
**Feature availability**
The Snyk Azure Repos integration is available only for Azure DevOps/TFS 2020 or later.
{% endhint %}
Before installing, review the [prerequisites](https://docs.snyk.io/implementation-and-setup/enterprise-setup/snyk-broker/classic-broker/install-and-configure-snyk-broker/azure-repos-prerequisites-and-steps-to-install-and-configure-broker) and the general instructions for installation using [Docker](https://docs.snyk.io/implementation-and-setup/enterprise-setup/snyk-broker/classic-broker/install-and-configure-snyk-broker/install-and-configure-broker-using-docker).
This integration is useful to ensure a secure connection with your on-premise or cloud Azure Repos deployment.
## Configure Broker to be used with Azure Repos
To use the Broker Client with [Azure](https://azure.microsoft.com/en-us/services/devops/), run `docker pull snyk/broker:azure-repos`. Refer to [Azure Repos - environment variables for Snyk Broker](https://docs.snyk.io/implementation-and-setup/enterprise-setup/snyk-broker/classic-broker/install-and-configure-snyk-broker/azure-repos-prerequisites-and-steps-to-install-and-configure-broker/azure-repos-environment-variables-for-snyk-broker) for definitions of the environment variables.
If necessary, go to the [Advanced configuration page](https://docs.snyk.io/implementation-and-setup/enterprise-setup/snyk-broker/classic-broker/install-and-configure-snyk-broker/advanced-configuration-for-snyk-broker-docker-installation) and make any configuration changes needed, such as providing the CA (Certificate Authority) to the Broker Client configuration if the Azure Repos instance is using a private certificate, and setting up [proxy support](https://docs.snyk.io/implementation-and-setup/enterprise-setup/snyk-broker/classic-broker/install-and-configure-snyk-broker/advanced-configuration-for-snyk-broker-docker-installation/proxy-support-with-docker).
## Docker run command to set up a Broker Client for Azure Repos
Copy the following command to set up a fully configured Broker Client to analyze Open Source, IaC, Container, Code files, and Snyk Essentials information for one Azure organization. Enable [Snyk Essentials](https://docs.snyk.io/scan-with-snyk/snyk-essentials) to identify your application assets, monitor them, and prioritize the risks.
{% hint style="info" %}
**Multi-tenant settings for regions**\
When installing, you must add a command in your script to set the `BROKER_SERVER_URL`. This is the URL of the Broker server for the region where your data is hosted. For the commands and URLs to use, see [Broker URLs](https://docs.snyk.io/snyk-data-and-governance/regional-hosting-and-data-residency#broker-server-urls).
{% endhint %}
Note that if you have more than one Azure organization, you must deploy a Broker for each one. Snyk Essentials is set by default to `false`. Enable it by setting the flag to `true`.
```bash
docker run --restart=always \
-p 8000:8000 \
-e BROKER_TOKEN= \
-e BROKER_SERVER_URL= \
-e AZURE_REPOS_TOKEN= \
-e AZURE_REPOS_ORG= \
-e AZURE_REPOS_HOST= \
-e PORT=8000 \
-e BROKER_CLIENT_URL= \
-e ACCEPT_IAC=tf,yaml,yml,json,tpl \
-e ACCEPT_CODE=true \
-e ACCEPT_ESSENTIALS=true \
snyk/broker:azure-repos
```
## Start the Broker Client container and verify the connection with Azure Repos
Paste the Broker Client configuration to start the Broker Client container.
Once the container is up, the Azure Repos Integrations page shows the connection to Azure Repos, and you can **Add Projects**.
## Basic troubleshooting for Broker with Azure Repos
* Run `docker logs <container id>` to look for any errors, where `<container id>` is the Azure Repos Broker container ID.
* Ensure relevant ports are exposed to Azure Repos.
* Make sure that file permissions for the local path to the `accept.json` file, as well as the `accept.json` file itself, are correct and accessible.
---
# Source: https://docs.snyk.io/implementation-and-setup/enterprise-setup/snyk-broker/classic-broker/install-and-configure-snyk-broker/azure-repos-prerequisites-and-steps-to-install-and-configure-broker.md
# Azure Repos - prerequisites and steps to install and configure Broker
{% hint style="info" %}
**Feature availability**
The Snyk Azure Repos integration is available only for Azure DevOps/TFS 2020 or later.
{% endhint %}
Before installing, review the general instructions for the installation method you plan to use, [Helm](https://docs.snyk.io/implementation-and-setup/enterprise-setup/snyk-broker/classic-broker/install-and-configure-snyk-broker/install-and-configure-broker-using-helm) or [Docker](https://docs.snyk.io/implementation-and-setup/enterprise-setup/snyk-broker/classic-broker/install-and-configure-snyk-broker/install-and-configure-broker-using-docker).
Before installing the Snyk Azure Repos Broker, ask your Snyk account team to provide you with a Broker token.
You must have Docker or a way to run Docker Linux containers. Some Docker deployments for Windows run only Windows containers. Ensure that your deployment is capable of running Linux containers.
Continue with the steps to install using [Docker](https://docs.snyk.io/implementation-and-setup/enterprise-setup/snyk-broker/classic-broker/install-and-configure-snyk-broker/azure-repos-prerequisites-and-steps-to-install-and-configure-broker/azure-repos-install-and-configure-using-docker) or [Helm](https://docs.snyk.io/implementation-and-setup/enterprise-setup/snyk-broker/classic-broker/install-and-configure-snyk-broker/azure-repos-prerequisites-and-steps-to-install-and-configure-broker/azure-repos-install-and-configure-and-configure-using-helm).
---
# Source: https://docs.snyk.io/developer-tools/scm-integrations/organization-level-integrations/azure-repositories-tfs.md
# Azure Repositories (TFS)
{% hint style="info" %}
**Feature availability**\
Integration with Azure DevOps Server 2020 and above, also known as TFS, is available only with Enterprise plans. For more information, see [plans and pricing](https://snyk.io/plans/).
Snyk supports only Git. Snyk does not currently support integration with Team Foundation Version Control (TFVC).
{% endhint %}
### Prerequisites for Azure Repositories (TFS) integration
* [Snyk Organization Admin](https://docs.snyk.io/snyk-platform-administration/user-roles/pre-defined-roles) user role.
* An Azure project. If you do not have a project yet, create one in [Azure DevOps](https://docs.microsoft.com/en-us/azure/devops/user-guide/sign-up-invite-teammates?view=azure-devops) or set one up in an [on-premise Azure DevOps](https://docs.microsoft.com/en-us/azure/devops/organizations/projects/create-project?view=azure-devops) instance.
* The required Personal Access Token (PAT) access scopes. For details on the permissions required, see [Azure Repositories (TFS) permission requirements](https://docs.snyk.io/developer-tools/user-permissions-and-access-scopes#azure-repositories-tfs-permission-requirements).
### Azure Repositories (TFS) integration features
Snyk integrates with your Microsoft Azure Repository to let you import Projects and monitor the source code for your repositories. Snyk tests the Projects you have imported for known security vulnerabilities in the dependencies, testing at a frequency you control.
The Azure Repository integration lets you:
* Continuously perform security scanning across all the integrated repositories
* Detect vulnerabilities in your open-source components
* Provide automated fixes and upgrades
After the integration is configured, Snyk does the following:
1. Evaluates the items you selected and imports the ones that have the relevant manifest files in their root folder and all the subfolders at any level.
2. Communicates directly with your repository for each test it runs, using the permissions associated with your PAT, to determine exactly which code is currently pushed and which dependencies are being used. Each dependency is tested against the Snyk vulnerability database to see if it contains any known vulnerabilities.
3. Notifies you by email or a dedicated Slack channel if vulnerabilities are found according to the preferences you configured, so that you can take immediate action to fix the issues.
### How to set up the Azure Repositories (TFS) integration
The process to connect Snyk with your Azure repositories includes the following steps:
1. Generate a unique Azure DevOps personal access token (PAT) for Snyk, based on a username and password combination, and configured with the specific permissions Snyk needs to access your Azure repositories. For more information, see [Configure a Personal Access Token (PAT)](#configure-a-personal-access-token-pat).
2. [Enable the integration through the Snyk Web UI](#integrate-using-the-snyk-web-ui).
3. [Select the Projects and repositories](#add-projects-to-snyk-for-azure-repos) you want to associate with Snyk for testing and monitoring.\
You can also enter custom file locations for any manifest files that are not located in the root folders of your repositories.
### Configure a Personal Access Token (PAT)
Generate and copy a unique PAT to use for Snyk. For more information on the PAT access scope requirements to enable in Azure, see [Azure Repositories (TFS) permission requirements](https://docs.snyk.io/developer-tools/user-permissions-and-access-scopes#azure-repositories-tfs-permission-requirements).
### Integrate using the Snyk Web UI
1. Log in to [your Snyk account](https://app.snyk.io) and navigate to **Integrations**.
2. On the **Azure Repos** tile, click the settings icon to open **Organization Settings** > **Integrations** > **Azure Repos** > **Account credentials**.
3. Pay special attention to the instructions given on the **Account credentials** page. Depending on your plan, you may need to enter just the Azure DevOps Organization, or you may need to enter the entire URL.
* **Set your organization**: Enter the slug for your Organization only.\
For example, enter `your-azure-devops-org`
* **Set your host**: Enter the entire URL.\
For example, enter `https://dev.azure.com/your-azure-devops-org`\
Alternatively, you may enter a custom URL that is publicly reachable.
4. Click **Save**, and then enter the PAT that you generated.
5. Click **Save**.\
Snyk tests the connection values and the page reloads, displaying the Azure Repos integration information. A message to confirm that the details were updated appears at the top of the screen.
{% hint style="info" %}
If the connection to Azure fails, a notification will appear under the **Azure Repos** card title.
{% endhint %}
### Add Projects to Snyk for Azure Repos
Snyk tests and monitors Azure Repos by evaluating root folders and custom file locations for the [languages that Snyk supports](https://docs.snyk.io/supported-languages/supported-languages-package-managers-and-frameworks).
To add a default Project:
1. In Snyk, navigate to **Projects** > **Add projects**.
2. Choose the relevant repository or tool from which to import your projects.\
The available repositories for the integration you chose are displayed in a new window.
3. Select the repositories that you want Snyk to monitor for security and license issues.
4. To import all the repos for a specific Organization, select the checkbox for that **Organization**.
5. Click **Add selected repositories**.\
Snyk scans the entire file tree for dependency files and imports them to Snyk as Projects.
### Adding custom file locations and excluding folders
#### Add a custom file location
Use this procedure to add an Azure Repository dependency from a non-default path.
1. In Snyk, navigate to **Projects** > **Add projects** > **Azure repos** > **Settings**.
2. Open the **Add custom file location (optional)** list and **select a repository...** to configure a custom path.
3. In the text field, enter the relative path to the manifest file location.
{% hint style="warning" %}
The relative path field is case-sensitive.
{% endhint %}
#### Exclude folders from import
The optional **Exclude folders** field is case-sensitive. The pattern you enter is applied to all the Azure repositories.
### Confirming the repository import
After repositories are imported, a confirmation appears in green at the top of the screen. The selected files are marked with a unique icon and named by Organization and repository. You can filter to view only those Projects by selecting the Azure Repos filter option.
{% hint style="info" %}
The Azure Repository integration works like the other [Snyk SCM integrations](https://docs.snyk.io/developer-tools/scm-integrations/organization-level-integrations). To continue to monitor, fix, and manage your Projects, see the [Projects](https://docs.snyk.io/snyk-platform-administration/snyk-projects) documentation.
{% endhint %}
---
# Source: https://docs.snyk.io/scan-with-snyk/snyk-iac/detect-manually-created-resources/supported-resources/azure-resources.md
# Azure resources
Snyk IaC unmanaged resource scanning supports the following resources for Azure:
| **Resource** |
| ------------------------------------ |
| azurerm\_storage\_account |
| azurerm\_storage\_container |
| azurerm\_virtual\_network |
| azurerm\_route\_table |
| azurerm\_route |
| azurerm\_resource\_group |
| azurerm\_subnet |
| azurerm\_container\_registry |
| azurerm\_firewall |
| azurerm\_postgresql\_server |
| azurerm\_postgresql\_database |
| azurerm\_public\_ip |
| azurerm\_network\_security\_group |
| azurerm\_lb |
| azurerm\_lb\_rule |
| azurerm\_private\_dns\_zone |
| azurerm\_private\_dns\_a\_record |
| azurerm\_private\_dns\_aaaa\_record |
| azurerm\_private\_dns\_cname\_record |
| azurerm\_private\_dns\_ptr\_record |
| azurerm\_private\_dns\_mx\_record |
| azurerm\_private\_dns\_txt\_record |
| azurerm\_private\_dns\_srv\_record |
| azurerm\_image |
| azurerm\_ssh\_public\_key |
---
# Source: https://docs.snyk.io/implementation-and-setup/enterprise-setup/snyk-broker/classic-broker/install-and-configure-snyk-broker/advanced-configuration-for-snyk-broker-docker-installation/backend-requests-with-an-internal-certificate-for-docker.md
# Backend requests with an internal certificate for Docker
By default, the Broker Client establishes HTTPS connections to the backend system: GitHub, Bitbucket, Jira, or others. If your backend system is serving an internal certificate (signed by your own certificate authority (CA)), you can provide the CA certificate to the Broker Client.
For example, if your CA certificate is at `./private/ca.cert.pem`, provide it to the Docker container by mounting the folder and using the `NODE_EXTRA_CA_CERTS` environment variable. See the following example for Bitbucket:
```
docker run --restart=always \
-p 8000:8000 \
-e BROKER_TOKEN=secret-broker-token \
-e BITBUCKET_USERNAME=username \
-e BITBUCKET_PASSWORD=password \
-e BITBUCKET=your.bitbucket-server.domain.com \
-e BITBUCKET_API=your.bitbucket-server.domain.com/rest/api/1.0 \
-e PORT=8000 \
-e NODE_EXTRA_CA_CERTS=/private/ca.cert.pem \
-v /local/path/to/private:/private \
snyk/broker:bitbucket-server
```
Beginning with [Broker version 4.166.0 (2023-10-10)](https://github.com/snyk/broker/releases/tag/v4.166.0), the custom CA cert instruction is `NODE_EXTRA_CA_CERTS` and this must be set as shown in order to use a custom CA. The `CA_CERT` environment variable is no longer in use for this purpose.
Note that this completely replaces the default CA Certificate List for any requests made to your backend system, so this must be the complete chain required by the certificate used by the backend system.
It must be `PEM`-formatted; `DER` is not supported. Supported certificate types are:
* `TRUSTED CERTIFICATE`
* `X509 CERTIFICATE`
* `CERTIFICATE`
An example follows.
```
-----BEGIN CERTIFICATE-----
-----END CERTIFICATE-----
-----BEGIN CERTIFICATE-----
-----END CERTIFICATE-----
-----BEGIN CERTIFICATE-----
-----END CERTIFICATE-----
```
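One quick way to sanity-check that a PEM bundle is complete is to confirm that every `BEGIN CERTIFICATE` marker has a matching `END CERTIFICATE` marker; a truncated chain will not pair up. A minimal sketch, where the file name and certificate bodies are illustrative:

```shell
# Sketch: write a sample two-certificate bundle, then confirm the
# BEGIN/END markers pair up.
cat > sample-ca-chain.pem <<'EOF'
-----BEGIN CERTIFICATE-----
...intermediate CA certificate body...
-----END CERTIFICATE-----
-----BEGIN CERTIFICATE-----
...root CA certificate body...
-----END CERTIFICATE-----
EOF
# Count each marker; the two counts must be equal for a well-formed bundle.
BEGINS=$(grep -c -- '-----BEGIN CERTIFICATE-----' sample-ca-chain.pem)
ENDS=$(grep -c -- '-----END CERTIFICATE-----' sample-ca-chain.pem)
echo "begin=$BEGINS end=$ENDS"
```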
---
# Source: https://docs.snyk.io/developer-tools/scm-integrations/application-context-for-scm-integrations/backstage-file-in-asset-inventory-use-case.md
# Backstage file in Asset Inventory - use case
After you finish configuring the [Backstage catalog](https://docs.snyk.io/developer-tools/scm-integrations/application-context-for-scm-integrations/..#backstage-file-for-scm-integrations), Snyk Essentials starts enriching your repository assets (the [All Assets](https://docs.snyk.io/manage-assets/manage-assets#inventory-menu) tab in the Inventory layout) with the data found in the Backstage `catalog-info.yaml` file.
Use the Backstage catalog to enrich the repository assets and to define the component entity. In this context, a component is a software component, such as a service, repository, website, or library.
Components have several attributes, some of which are optional:
* `spec.type` (mandatory) - represents the classification of the repository.
* `spec.owner` (mandatory) - represents the team owning the repository.
* `spec.lifecycle` (optional) - represents the lifecycle state of the component, for example `production`, `experimental`, `deprecated`.
* `spec.system` (optional) - represents a group of components that serve the same purpose. This concept is referred to as “Application”.
* `metadata.name` (mandatory) - represents the name of the component.
* `metadata.title` (optional) - a display name for the component, used instead of `metadata.name`.
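Put together, a minimal `catalog-info.yaml` using these attributes might look like the following sketch; all names are illustrative:

```yaml
apiVersion: backstage.io/v1alpha1
kind: Component
metadata:
  name: super-duper-component     # mandatory: the component name
  title: Super Duper Component    # optional: display name
spec:
  type: service                   # mandatory: classification of the repository
  owner: super-duper-team         # mandatory: owning team
  lifecycle: production           # lifecycle state
  system: super-duper-app         # optional: grouping, surfaced as "Application"
```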
The Backstage data is dynamic and may change over time:
* If new commits or updates are made to the `catalog-info.yaml` file, then Snyk Essentials updates the asset attributes for that specific repository asset.
* If the `catalog-info.yaml` file is removed from the repository, then Snyk Essentials deletes the asset attributes from that specific repository asset.
{% hint style="info" %}
You can use quotes (`""`) to escape keys that contain periods (`.`), for example `"example.com".owner`.
{% endhint %}
## Inventory menu and the backstage file
Only the attributes you selected in the Integration configuration menu are displayed as filters in the Inventory menu. For example, if you selected the Category attribute, it appears in the filters list.
## Asset Summary tab and the backstage file
The Asset Summary tab shows the six Backstage attributes configured on the Integrations page only if you chose to integrate with Backstage.
## Asset Attributes tab and the backstage file
In the Asset Attributes tab, only the selected attributes are added as metadata to the repository asset, as in the following example:
```
{
  "name": "spring.goof",
  "repositoryURL": "https://github.com/snyk/spring.goof.git",
  "context": [
    {
      "name": "super-duper-component",
      "title": "Super Duper Component",
      "application": "super-duper-app",
      "lifecycle": "production",
      "owner": "super-duper-team",
      "category": "service",
      "source": "Backstage"
    }
  ]
}
```
## Policies filter and the backstage file
In the policy builder, you can find only the attributes you previously selected when configuring the Backstage catalog file.
The following list describes all possible Backstage attributes that you can choose from when you configure the Backstage catalog file.
* **Application** - represents a group of components that serve the same purpose.
* **Owner** - specifies the team owning the repository.
* **Catalog name** - the metadata name.
* **Title** - a name to display for the entity. It is an alternative to the metadata name, for when the catalog name is hard to read.
* **Category** - represents the classification of the repository. The Organization can choose any name or text.
* **Lifecycle** - specifies the lifecycle state of the component, for example production, experimental, deprecated.
---
# Source: https://docs.snyk.io/implementation-and-setup/enterprise-setup/snyk-broker/universal-broker/basic-steps-to-install-and-configure-universal-broker.md
# Basic steps to install and configure Universal Broker
{% hint style="info" %}
You can learn more about Universal Broker in the dedicated Snyk Learn course. Explore the advantages, configuration, architecture, and much more with [Snyk Learn: Universal Broker](https://learn.snyk.io/lesson/universal-broker/?ecosystem=general).
{% endhint %}
Follow these steps to install and configure your Universal Broker using the `snyk-broker-config` CLI tool. The tool guides you through the steps and indicates important points in the workflows.
## Install and set up the snyk-broker-config CLI tool
To install the tool, use `npm i -g snyk-broker-config`.
{% hint style="info" %}
The `snyk-broker-config` CLI tool is your primary guide for setting up and managing Snyk Broker connections. It is designed to walk you step-by-step through the process, asking for the required information for each integration type. This adaptive method guarantees that you receive the latest parameter requirements directly from the tool.
{% endhint %}
The basic process for configuring a new Universal Broker deployment is as follows:
1. Install the `snyk-broker-config` tool by running `npm i -g snyk-broker-config`
2. Create a Universal Broker new connection by following the create workflow `snyk-broker-config workflows connections create`
3. Integrate your new Universal Broker Connection by following the integrate workflow `snyk-broker-config workflows connections integrate`
### How to use the CLI for parameter discovery
Even if you are not running the full setup, the `snyk-broker-config` CLI tool can help you understand the required parameters. Use the interactive setup or the command line help.
* Interactive setup: Run `snyk-broker-config workflows connections create` and follow the prompts. The CLI asks for all required and optional parameters based on the integration type you select, such as GitLab, Artifactory, or Bitbucket Server.
* Command line help: Use the `--help` flag for any `snyk-broker-config` command to see available options and parameters. For instance, to see parameters specifically relevant for creating a GitLab connection type, use the following command:
```bash
snyk-broker-config workflows connections create --type gitlab --help
```
### Example of a connection workflow
A typical workflow for adding a new Broker connection using the CLI involves these steps:
1. Install the CLI:
```bash
npm install -g snyk-broker-config
```
2. Start the interactive workflow:
```bash
snyk-broker-config workflows connections create
```
The CLI will then guide you through the process, prompting for:
* Your Snyk API Token, required for authentication if you did not already export it as an environment variable.
* The Snyk Organization ID where the Broker connection is used.
* The specific type of integration you want to connect, such as `gitlab`, `artifactory`, `bitbucket-server`.
* All required and optional parameters, such as URLs, tokens, usernames, or passwords, dynamically identified for your chosen integration type. Follow the on-screen instructions carefully.
The interactive workflow is the most straightforward way to ensure all necessary parameters are correctly provided for your Broker connection.
## Create your first connection
The create workflow returns an install ID, a client ID, and a client secret, which are all needed to interact with the Snyk platform.
* After you install, start the Universal Broker Create Connection workflow.
```
> snyk-broker-config workflows connections create
Using https://api.snyk.io (or https://api.REGION.snyk.io)
Universal Broker Create Connection workflow
Enter your Snyk Token
```
* Type your Snyk API token and press Enter.
```
✓ Valid Snyk Token.
✓ Tenant Admin role confirmed.
Have you installed the Broker App against an Org? (Y/N)
```
* Type N and press Enter.
```
Enter Org ID to install Broker App. Must be in Tenant .
(Must be a valid uuid).
```
* Paste the Snyk Broker Admin Organization ID created in the prerequisites and press Enter.
The Broker App facilitates the secure connection and communication with the Broker server through OAuth.
```
App installed. Please store the following credentials securely:
- client id:
- ClientSecret:
You will need them to run your Broker Client.
Have you saved these credentials? (Y/N)
```
The tool displays the credentials for the Broker App just installed. Be sure to store these safely like any other credentials. This is the only time the client secret will be displayed. If you lose these credentials, you must either delete and recreate the Snyk Broker Admin Organization and start over, or use the API to reinstall Universal Broker manually.
* When you have saved your credentials, type Y and press Enter.
```
Helpful tip ! Set TENANT_ID, INSTALL_ID as environment values to avoid pasting
the values in for every command.
Now using Tenant ID and Install ID .
Do you want to create a new Deployment? (Y/N)
```
Snyk recommends that you set the `INSTALL_ID` as an environment variable to improve usability:
* `export INSTALL_ID=zzzz` (Linux/Mac)
* `set INSTALL_ID=zzzz` (Windows)
## Input connection parameters
This step includes creating the credential references needed for your connections. Each deployment is limited to a maximum of 25 connections.
* In response to the prompt, type Y and press Enter.
```
Which Connection type do you want to create?
acr
artifactory
…
> github
…
```
* Select the connection type you want to create.
This example shows creating a GitHub connection. Creating all the other types of connection follows the same process. Each deployment is limited to 25 connections.
```
Enter a human-friendly name for your Connection.
```
* Enter a connection name to help you identify the connection, for example, github-connection-for-team-x.
```
Enter a human-friendly name for your Connection.
broker_client_url: Broker client url. Must be url.
```
* Enter your `broker_client_url`. Snyk recommends using the default value. You can enter a different value; a non-default value is required for container integrations.
```
broker_client_url: Broker client url. Must be url.
github_token (Sensitive): No existing Credential Reference for this Connection type.
CreateNew
Env Var Name (e.g., MY_GITHUB_TOKEN). (Must be a valid envvar).
```
* Create the credential reference (not the actual credential value). Enter the name of the environment variable which will contain the actual credential value when the Broker client is running, for example, MY\_GITHUB\_TOKEN.
* Optionally, you can enter a comment to help you keep track of this connection.
```
Env Var Name (e.g. MY_GITHUB_TOKEN). (Must be a valid envvar).
Comment this is a demo broker connection.
```
When you run the Broker client container in a subsequent step, you must add the `-e MY_GITHUB_TOKEN=` option with the credential value. In a production setup, these values are mounted from a secrets vault.
```
Connection created with ID . Ready to configure integrations
to use this connection.
Connection Create workflow completed.
```
The connection is now created.
## Validate your connection (optional)
You can use the following workflow to display details about the connection.
```
> snyk-broker-config workflows connections get
```
* If you are prompted about whether the Broker app is installed, enter Y and then paste the install ID you saved previously.
Exporting the INSTALL\_ID in your terminal session avoids this prompt in the future. The deployment details follow.
```
Now using Deployment .
Selected Connection ID .
Ready to configure integrations to use this connection.
```
Details of the connection follow:
* `connection ID`
* `connection type` (`broker_connection`)
* `attributes`: `deployment_id`, `identifier`, `name`, and the primary and secondary secrets, each with `status`, `encrypted`, `expires_at`, and `nonce`
* `configuration required`: the `broker-client-url` and `github_token` values
* `type`: `github`
```
Connection Detail Workflow completed.
```
## Integrate your connection with an Organization that will use the Universal Broker
```
> snyk-broker-config workflows connections integrate
Enter the OrgID you want to integrate.. (Must be a valid uuid).
```
* Enter the ID of the Organization where you want to use the newly created Broker connection.
```
Enter the OrgID you want to integrate.. (Must be a valid uuid).
Connection (type:github) integrated with .
Connection Integration Workflow completed
```
Your Organization is now integrated with your new Broker connection.
## Run the Broker client
```
docker run -d --restart=always \
-p 8000:8000 \
-e DEPLOYMENT_ID= \
-e CLIENT_ID= \
-e CLIENT_SECRET= \
-e MY_GITHUB_TOKEN= \
-e PORT=8000 \
snyk/broker:universal
```
When the Broker client has started, the connection is ready to use, in this case, to import repositories.
* To verify that your connection is configured, check that the integration tile on your **Organization Settings** > **Integrations** page is marked **Configured**.
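You can also sanity-check the running container from the host. The sketch below assumes the Broker client exposes a `/healthcheck` endpoint on the configured `PORT`, as the classic Broker client does; adjust the port if you mapped a different one:

```shell
# Compose the healthcheck URL for the port mapped in the docker run command above.
# Assumption: the Broker client serves /healthcheck, as the classic Broker client does.
PORT=8000
HEALTH_URL="http://localhost:$PORT/healthcheck"
echo "$HEALTH_URL"

# Uncomment to probe the endpoint once the container is up:
# curl -sf "$HEALTH_URL" && echo "Broker client is healthy"
```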
## Integrate your connection with more Organizations
To integrate your connection with another Organization, run the `integrate` command again and enter the new Organization ID. Repeat as needed to connect with multiple Organizations.
```
> snyk-broker-config workflows connections integrate
```
---
# Source: https://docs.snyk.io/developer-tools/scm-integrations/organization-level-integrations/bitbucket-cloud-app.md
# Bitbucket Cloud App
The Bitbucket Cloud App is positioned to be the default Bitbucket Cloud integration.
The Bitbucket Cloud App integration lets you connect your Snyk Organization to a Bitbucket Cloud Workspace and get all Snyk's core SCM integration features:
* Continuously perform security scanning across all the integrated repositories
* Detect vulnerabilities in your Open Source components
* Provide automated fixes and upgrades
* Provide developer teams with first-party visibility for security issues directly in the Bitbucket interface
{% hint style="info" %}
Snyk recommends using the Bitbucket Cloud App integration for smoother integration and to ensure long-term support.
If you are using the [Bitbucket Cloud API token integration](https://docs.snyk.io/developer-tools/scm-integrations/organization-level-integrations/bitbucket-cloud), see [Migrate a Bitbucket Cloud integration](https://docs.snyk.io/developer-tools/scm-integrations/bitbucket-cloud#migrate-to-the-bitbucket-cloud-app) for more information.
{% endhint %}
### Setting up a Bitbucket Cloud App
To give Snyk access to your Bitbucket account, you need to install the Snyk App on your Bitbucket Cloud workspace.
{% hint style="info" %}
To install the Snyk App on your Bitbucket Cloud workspace, you must have **Admin** permissions for the Workspace in Bitbucket.
{% endhint %}
1. In Snyk, navigate to **Integrations (Source control),** then **Bitbucket Cloud App** tile, and click **Connect** to install the Snyk Bitbucket Cloud App on your Bitbucket Cloud workspace.
2. In the new Bitbucket tab, select the relevant workspace to connect to your Snyk Organization from the list and [**Grant access** to let Snyk](https://docs.snyk.io/developer-tools/user-permissions-and-access-scopes#bitbucket-cloud-app-scopes):
* Read your account information
* Read and modify your repositories and their pull requests
* Read and modify your repositories' webhooks
Allow access for Snyk to Bitbucket Cloud
3. Grant access to your Snyk Organization when you're prompted.
Allow Bitbucket Cloud access to your Snyk Organization
After you allow access to the Snyk Organization, the Snyk **Organization Settings** page opens and confirms that the Bitbucket Cloud App is connected.
After Snyk is integrated with Bitbucket Cloud, you can see the new Snyk security tab on the repository page and import and explore the issues and vulnerabilities for your repository Projects directly in Bitbucket.
Bitbucket security insights with Snyk Bitbucket Cloud App
Watch this short video to see how to set up **Snyk security** in Bitbucket Cloud.
{% embed url="" %}
Set up Snyk security in Bitbucket Cloud
{% endembed %}
### Installing the Snyk App from Bitbucket Cloud
If you need to, you can also install the Snyk Bitbucket Cloud App integration while you are in Bitbucket Cloud.
In one of your Bitbucket Cloud workspaces, navigate to the **Security** tab in one of your repositories, click **Try now**, and follow the procedure.
Install the Snyk Bitbucket Cloud App from Bitbucket
### Adding Bitbucket repositories to Snyk
After you connect Snyk to your Bitbucket Cloud account, you can select repositories for Snyk to monitor.
1. In Snyk, navigate to **Integrations**, then the **Bitbucket Cloud App** card, and click it to start importing repositories to Snyk.
2. Choose the repositories you want to import to Snyk and click **Add selected repositories**.
After you add them, Snyk scans the selected repositories for dependency files (`package.json`, `pom.xml`, and so on) in the entire directory tree and imports them to Snyk as Projects.
The imported Projects appear on your **Projects** page and are continuously checked for vulnerabilities.
### Bitbucket integration features
After the integration is in place, you can use capabilities such as:
* [Project-level security reports](#project-level-security-reports)
* [Pull request testing](#pull-request-tests)
* [First-party interface in Bitbucket Cloud](#first-party-interface-in-bitbucket-cloud)
#### Project-level security reports
Snyk produces advanced [security reports](https://docs.snyk.io/manage-risk/reporting/legacy-reports/legacy-reports-overview) that let you explore the vulnerabilities found in your repositories, and fix them immediately by opening a fix pull request directly to your repository, with the required upgrades or patches.
The example that follows shows a Project-level security report.
Project-level security report
Snyk scans your Projects on either a daily or a weekly basis. When new vulnerabilities are found, Snyk notifies you by email and by opening [automated pull requests](https://docs.snyk.io/scan-with-snyk/pull-requests/snyk-pull-or-merge-requests/create-automatic-prs-for-new-fixes-fix-prs) with fixes for your repositories.
The example that follows shows a fix pull request opened by Snyk.
Fix pull request opened by Snyk
To review and adjust the automatic fix pull request settings:
1. In Snyk, go to **Organization settings** > **Integrations** > **Source control** > **Bitbucket Cloud App**, and click **Edit Settings**.
2. Scroll to the **Automatic fix PRs** section and configure the relevant options.
Automatic fix PR settings
{% hint style="info" %}
Unlike manual pull requests opened from the Bitbucket interface, Snyk pull requests are *not* automatically assigned to the default reviewer set in your Bitbucket Cloud account.
For more information, see [Automated pull request creation for new fixes](https://docs.snyk.io/scan-with-snyk/pull-requests/snyk-pull-or-merge-requests/create-automatic-prs-for-new-fixes-fix-prs).
{% endhint %}
#### Pull request tests
Snyk tests any newly created pull request in your repositories for security vulnerabilities and sends a build check to Bitbucket Cloud. You can see directly from Bitbucket Cloud whether or not the pull request introduces new security issues.
The example that follows shows a Snyk pull request build check on the Bitbucket Cloud **Pull Request** page.
Bitbucket Cloud pull request page showing Snyk pull request
To review and adjust the pull request test settings, follow these steps:
1. In Snyk, go to **Organization settings** > **Integrations** > **Source control** > **Bitbucket Cloud App**, and click **Edit Settings**.
2. Scroll to **Default Snyk test for pull requests > Open Source Security & Licenses**, and configure the relevant options. See [Configure PR Checks](https://docs.snyk.io/scan-with-snyk/pull-requests/pull-request-checks/configure-pull-request-checks) for more details.
#### First-party interface in Bitbucket Cloud
When you install the Snyk Bitbucket Cloud App integration in your Bitbucket workspace, the members of your workspace can import repositories and see the security issues related to their repositories in a dedicated Snyk security tab in Bitbucket Cloud.
Snyk Security tab in Bitbucket Cloud
{% hint style="warning" %}
The first-party interface currently supports only the [Snyk Open Source](https://docs.snyk.io/scan-with-snyk/snyk-open-source) and [Snyk Container](https://docs.snyk.io/scan-with-snyk/snyk-container) products. Issues from other Snyk products do not show up on this page.
{% endhint %}
You can **associate a first-party interface with a different Snyk account or Organization**.
During the first-time Bitbucket Cloud App onboarding process, the first-party interface is associated with a specific Snyk Organization. This is the Snyk Organization to which Bitbucket users will import repositories and for which they will view Snyk issues.
To change the Snyk Organization after onboarding, go to the workspace settings > **Security for Bitbucket Cloud** > **Integration Settings** and click **Connect via a different Snyk user/organization**.
Create integration settings for a different Organization
The installation process begins again, and you can choose the relevant Snyk Organization.
### Required permission scopes for the Bitbucket Cloud App integration
For detailed information on the permissions required for this integration, see [Bitbucket permission requirements](https://docs.snyk.io/developer-tools/user-permissions-and-access-scopes#bitbucket-cloud-app-scopes).
### Disabling the Bitbucket Cloud App integration
{% hint style="warning" %}
When you disconnect Snyk from your repository Projects, your credentials are removed from Snyk, and any integration-specific Projects that Snyk is monitoring are deactivated in Snyk.
If you choose to re-enable this integration later, you must re-enter your credentials and activate your Projects.
{% endhint %}
To disable this integration, in **Organization settings** > **Integrations** > **Source Control** > **Bitbucket Cloud App:**
1. In your list of integrations, select the Bitbucket Cloud App integration you want to deactivate and click **Edit settings** to open a page with the current status of your integration.\
\
The page includes sections that are specific to each integration, where you can manage your credentials, API key, Service Principal, or connection details.
2. Scroll to the **Disconnect** section and click **Remove** to remove the integration.
{% hint style="info" %}
Disconnecting the integration from the Snyk side does not uninstall the app from your workspace in Bitbucket Cloud. To uninstall the Bitbucket app, navigate to your workspace settings in Bitbucket.org > **Installed apps** and remove the **Snyk Security for Bitbucket Cloud** app.
{% endhint %}
---
# Source: https://docs.snyk.io/developer-tools/snyk-cli/scan-and-maintain-projects-using-the-cli/cli-tools/snyk-scm-contributors-count/scripts-for-scm-contributors-count/bitbucket-cloud/bitbucket-cloud-examples.md
# Bitbucket Cloud - Examples
The following options are available for the `snyk-scm-contributors-count bitbucket-cloud` command:
```
--version Show version number [boolean]
--help Show help [boolean]
--user Bitbucket cloud username [required]
--password Bitbucket cloud app password [required]
--workspaces [Optional] Bitbucket cloud workspace name/uuid to count contributors for
--repo [Optional] Specific repo to count only for
--exclusionFilePath [Optional] Exclusion list filepath
--json [Optional] JSON output, required when using the "consolidateResults" command
--skipSnykMonitoredRepos [Optional] Skip Snyk monitored repos and count contributors for all repos
--importConfDir [Optional] Generate an import file with the unmonitored repos: A path to a valid folder for the generated import files
--importFileRepoType [Optional] To be used with the importConfDir flag: Specify the type of repos to be added to the import file. Options: all/private/public. Default: all
```
## Before running the command
1. Export SNYK\_TOKEN (if you want to get the contributors only for repos that are already monitored by Snyk):
* Make sure that your token has Group level access or use a service account's token that has Group level access. To learn more about how to create a service account, refer to [Service accounts](https://docs.snyk.io/implementation-and-setup/enterprise-setup/service-accounts).
* Copy the token value.
* Export the token in your environment:
```
export SNYK_TOKEN=
```
2. Get your Bitbucket Cloud username (**not email**) and [app password](https://developer.atlassian.com/cloud/bitbucket/rest/intro/#authentication).
**Note**: Make sure your credentials have read access to the repos.
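As a convenience, you can add a small pre-flight check to your wrapper script so a missing token is caught before the command runs. This is an optional sketch, not part of the tool itself:

```shell
# Warn early if SNYK_TOKEN is missing: without it the tool cannot filter
# by Snyk-monitored repos (see step 1 above).
if [ -n "${SNYK_TOKEN:-}" ]; then
  echo "SNYK_TOKEN is set."
else
  echo "SNYK_TOKEN is not set; export it or use --skipSnykMonitoredRepos." >&2
fi
```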
## Running the command
Consider the following levels of usage and options:
### Usage levels
* To get commits for all workspaces and their repos in Bitbucket Cloud, provide the Bitbucket Cloud user and app password:
```
snyk-scm-contributors-count bitbucket-cloud --user USERNAME --password APP_PASSWORD
```
* To get commits for some workspaces and their repos in Bitbucket Cloud, provide the Bitbucket Cloud user, Bitbucket Cloud app password, and a comma-separated list of workspaces:
```
snyk-scm-contributors-count bitbucket-cloud --user USERNAME --password APP_PASSWORD --workspaces Workspace1,Workspace2...
```
* To get commits for a specific repo in Bitbucket Cloud, provide the Bitbucket Cloud user, Bitbucket Cloud app password, a workspace, and a repo name:
```
snyk-scm-contributors-count bitbucket-cloud --user USERNAME --password APP_PASSWORD --workspaces Workspace1 --repo Repo1
```
### Options
* To get all the commits from Bitbucket Cloud regardless of the repos that are already monitored by Snyk, add the `--skipSnykMonitoredRepos` flag.\
You might have repos in Bitbucket Cloud that are not monitored in Snyk; use this flag to skip checking for Snyk monitored repos and go directly to Bitbucket Cloud to fetch the commits.
```
snyk-scm-contributors-count bitbucket-cloud --user USERNAME --password APP_PASSWORD --skipSnykMonitoredRepos
```
* To exclude some contributors from being counted in the commits, add an exclusion file with the emails to ignore (separated by a new line), and apply the `--exclusionFilePath` flag with the path to that file:
```
snyk-scm-contributors-count bitbucket-cloud --user USERNAME --password APP_PASSWORD --workspaces Workspace1,Workspace2 --exclusionFilePath PATH_TO_FILE
```
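The exclusion file itself is plain text with one email address per line. For example, you could create it like this (the addresses are placeholders):

```shell
# Create an exclusion file: one contributor email per line.
cat > exclusions.txt <<'EOF'
build-bot@example.com
ci-service@example.com
EOF

# Pass its path via --exclusionFilePath, for example:
# snyk-scm-contributors-count bitbucket-cloud --user USERNAME --password APP_PASSWORD \
#   --exclusionFilePath exclusions.txt
```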
* To set the output to JSON format, add the `--json` flag:
```
snyk-scm-contributors-count bitbucket-cloud --user USERNAME --password APP_PASSWORD --workspaces Workspace1 --repo Repo1 --json
```
* To create an import file for your unmonitored repos, add the `--importConfDir` flag with a valid (writable) path to a folder in which the import files will be stored, and add the optional `--importFileRepoType` flag with the repo types to add to the file (`all`/`private`/`public`; defaults to `all`). Note that these flags **cannot** be set with the `--repo` flag.
```
snyk-scm-contributors-count bitbucket-cloud --user USERNAME --password APP_PASSWORD --importConfDir ValidPathToFolder --importFileRepoType private/public/all
```
For more information about these flags, refer to [Creating and using the import file](https://docs.snyk.io/developer-tools/snyk-cli/scan-and-maintain-projects-using-the-cli/cli-tools/snyk-scm-contributors-count/creating-and-using-the-import-file).
* To run in debug mode for verbose output, prefix with `DEBUG=snyk*`:
```
DEBUG=snyk* snyk-scm-contributors-count bitbucket-cloud --user USERNAME --password APP_PASSWORD --workspaces Workspace1 --repo Repo1 --exclusionFilePath PATH_TO_FILE --skipSnykMonitoredRepos --json
```
---
# Source: https://docs.snyk.io/developer-tools/snyk-cli/scan-and-maintain-projects-using-the-cli/cli-tools/snyk-scm-contributors-count/scripts-for-scm-contributors-count/bitbucket-cloud/bitbucket-cloud-flow-and-tech.md
# Bitbucket Cloud - Flow and Tech
## Flow
1. Fetch the monitored projects from Snyk (if the `skipSnykMonitoredRepos` flag was **not set** and the `SNYK_TOKEN` was exported).
2. Fetch `one`/`some`/`all` the workspaces that the credentials have access to from SCM and create a workspaces list.
3. Fetch `one`/`all` repos under the fetched/provided workspaces.
4. Remove the repos that are not monitored by Snyk (if the `skipSnykMonitoredRepos` flag was **not set** and the `SNYK_TOKEN` was exported) and create a Repo list.
5. Create an import file for unmonitored repos to use for easily importing repos into your Snyk account (if the `importConfDir` flag was set).
6. Fetch the commits for the fetched/provided repo/s and create a Contributors list.
7. Count the commits for the repo/s by the contributors.
8. Remove the contributors that were specified in the exclusion file (if the `exclusionFilePath` flag was set and a valid path to a text file was provided).
9. Print the results.
## Bitbucket Cloud API endpoints used
* To get the repositories from BB Cloud if a workspace was **not** provided: `/api/2.0/repositories`
* To get the repositories from BB Cloud if one or more workspaces **were** provided: `/api/2.0/repositories/{Workspace}`
* To get the commits for the fetched/provided repo list: `/api/2.0/repositories/{Workspace}/{Repo}/commits`
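To see which URLs the tool calls, you can compose the same endpoints yourself. This sketch uses placeholder workspace and repo names and leaves the authenticated request commented out:

```shell
# Compose the Bitbucket Cloud endpoints listed above (Workspace1/Repo1 are placeholders).
BASE="https://bitbucket.org"
WORKSPACE="Workspace1"
REPO="Repo1"

REPOS_URL="$BASE/api/2.0/repositories/$WORKSPACE"
COMMITS_URL="$BASE/api/2.0/repositories/$WORKSPACE/$REPO/commits"
echo "$REPOS_URL"
echo "$COMMITS_URL"

# Fetch commits with your Bitbucket username and app password (uncomment to run):
# curl -s -u "USERNAME:APP_PASSWORD" "$COMMITS_URL"
```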
---
# Source: https://docs.snyk.io/developer-tools/scm-integrations/organization-level-integrations/bitbucket-cloud.md
# Bitbucket Cloud
{% hint style="info" %}
Snyk recommends installing or migrating to the [Bitbucket Cloud Application](https://docs.snyk.io/developer-tools/scm-integrations/organization-level-integrations/bitbucket-cloud-app) for smoother integration and to ensure long-term support.
{% endhint %}
The Bitbucket Cloud API token integration lets you:
* Continuously perform security scanning across all the integrated repositories
* Detect vulnerabilities in your open-source components
* Provide automated fixes and upgrades
### How to set up the Bitbucket Cloud Integration
{% hint style="info" %}
Admin permissions are required; however, Snyk's access is ultimately limited by the [permissions assigned to the API Token](https://support.atlassian.com/bitbucket-cloud/docs/create-an-api-token/).\
\
To improve security, the use of app passwords in Bitbucket Cloud is transitioning to API tokens. Existing integrations that use app passwords will continue to function temporarily until 9 June 2026.\
To ensure continued support and functionality, update your Bitbucket Cloud integration in Snyk to use an API token.
{% endhint %}
1. To give Snyk access to your Bitbucket account, set up a dedicated service account in Bitbucket with admin permissions. See the [Bitbucket documentation ](https://support.atlassian.com/bitbucket-cloud/docs/grant-access-to-a-workspace/)to learn more about adding users to a workspace.\
The newly created user must have **Admin** permissions to all the repositories you need to monitor with Snyk.
2. In Snyk, go to the **Integrations** page, open the **Bitbucket Cloud** card, and configure the **Account credentials**.
3. In Bitbucket, under **Personal settings**, select **Atlassian account settings** > **Security** > **Create and manage API tokens**.
4. Follow the Bitbucket procedure to create an API token with the following permissions:
* read:user:bitbucket
* read:workspace:bitbucket
* read:repository:bitbucket
See the [Bitbucket documentation ](https://support.atlassian.com/bitbucket-cloud/docs/create-an-api-token/)for more details about the procedure.
5. Enter the email and the [API key for the Bitbucket account](https://developer.atlassian.com/cloud/bitbucket/rest/intro/#api-tokens) you created, and **save** your changes.\
You can find your email under the Bitbucket **Personal settings.**\
Snyk connects to your Bitbucket Cloud account. When the connection succeeds, the confirmation message **Bitbucket Cloud settings successfully updated** appears.
### How to add Bitbucket repositories to Snyk
After you connect Snyk to your Bitbucket Cloud account, you can select repositories for Snyk to monitor.
1. In Snyk, go to **Integrations** > **Bitbucket Cloud** card, and click **Add your Bitbucket Cloud repositories to Snyk** to start importing repositories to Snyk.
2. Choose the repositories you want to import to Snyk and click **Add selected repositories**.
After you add the selected repositories, Snyk scans them for dependency files (`package.json`, `pom.xml`, and so on) in the entire directory tree and imports them to Snyk as Projects.
The imported projects appear on your **Projects** page and are continuously checked for vulnerabilities.
### Bitbucket integration features
After the integration is in place, you will be able to use capabilities such as:
* [Project-level security reports](#project-level-security-reports)
* [Project monitoring and automatic fix pull requests](#project-monitoring-and-automatic-fix-pull-requests)
* [Pull request testing](#pull-request-tests)
#### Project-level security reports
Snyk produces advanced [security reports](https://docs.snyk.io/manage-risk/reporting/legacy-reports/legacy-reports-overview) that let you explore the vulnerabilities found in your repositories and fix them immediately by opening a fix pull request directly to your repository, with the required upgrades or patches.
The example that follows shows a Project-level security report.
An example of a Project-level security report
#### Project monitoring and automatic fix Pull Requests
Snyk scans your Projects on either a daily or a weekly basis. When new vulnerabilities are found, Snyk notifies you by email and by opening [automated pull requests](https://docs.snyk.io/scan-with-snyk/pull-requests/snyk-pull-or-merge-requests/create-automatic-prs-for-new-fixes-fix-prs) with fixes for your repositories.
The example that follows shows a fix Pull Request opened by Snyk.
Example of an automatic fix Pull Request opened by Snyk
To review and adjust the automatic fix pull request settings:
1. In Snyk, go to **Organization settings** > **Integrations** > **Source control** > **Bitbucket Cloud**, and click **Edit Settings**.
2. Scroll to the **Automatic fix PRs** section and configure the relevant options.
Configure Automatic fix PRs
{% hint style="info" %}
Unlike manual pull requests opened from the Bitbucket interface, Snyk pull requests are *not* automatically assigned to the default reviewer set in your Bitbucket Cloud account.
For more information, see [Snyk automated pull requests](https://docs.snyk.io/scan-with-snyk/pull-requests/snyk-pull-or-merge-requests/create-automatic-prs-for-new-fixes-fix-prs).
{% endhint %}
#### Pull request tests
Snyk tests any newly created pull request in your repositories for security vulnerabilities and sends a build check to Bitbucket Cloud. You can see directly from Bitbucket Cloud whether or not the pull request introduces new security issues.
The example that follows shows a Snyk pull request build check on the Bitbucket Cloud **Pull Request** page.
Example of a Snyk pull request build check on the Bitbucket Cloud Pull Request page
To review and adjust the pull request tests settings:
1. In Snyk, go to **Organization settings** > **Integrations** > **Source control** > **Bitbucket Cloud**, and click **Edit Settings**.
2. Scroll to **Default Snyk test for pull requests** > **Open Source Security & Licenses**, and configure the relevant options.
Configuring the options for pull request Open Source Security & Licenses
### Required permission scope for the Bitbucket Cloud integration
{% hint style="warning" %}
Bitbucket Cloud has replaced app passwords with API tokens.\
Existing credentials continue to work until Bitbucket Cloud fully deprecates them ([details here](https://www.atlassian.com/blog/bitbucket/bitbucket-cloud-transitions-to-api-tokens-enhancing-security-with-app-password-deprecation)).\
New integrations must use API tokens.
{% endhint %}
All the operations, whether triggered manually or automatically, are performed through a Bitbucket Cloud [service account](https://docs.snyk.io/implementation-and-setup/enterprise-setup/service-accounts) that has its API token configured in the **Integration settings**.
For Snyk to perform the required operations on monitored repositories, such as reading manifest files on a frequent basis and opening fix or upgrade PRs, the integrated Bitbucket Cloud service account needs **Admin** permissions on the imported repositories.
For detailed information on the permission scopes required, see [Bitbucket permission requirements](https://docs.snyk.io/developer-tools/user-permissions-and-access-scopes#bitbucket-cloud-and-bitbucket-data-center-server-scopes).
### How to disconnect Snyk from Bitbucket Cloud
{% hint style="warning" %}
When you disconnect Snyk from your repository Projects, your credentials are removed from Snyk, and any integration-specific Projects that Snyk is monitoring are deactivated in Snyk.\
If you choose to re-enable this integration, you must re-enter your credentials and activate your Projects.
{% endhint %}
To disconnect this integration, in **Organization settings** > **Integrations:**
1. In your list of integrations, select the Bitbucket integration you want to deactivate and click **Edit settings** to open a page with the current status of your integration.\
\
The page includes sections that are specific to each integration, where you can manage your credentials, API key, Service Principal, or connection details.
2. Scroll to the relevant section and click **Disconnect.**
### Migrate to the Bitbucket Cloud App
This section describes how to migrate your existing [Bitbucket Cloud API token integration](https://docs.snyk.io/developer-tools/scm-integrations/organization-level-integrations/bitbucket-cloud), displayed in Snyk as Bitbucket Cloud, to the [Bitbucket Cloud App](https://docs.snyk.io/developer-tools/scm-integrations/organization-level-integrations/bitbucket-cloud-app) integration.
To migrate to the new app integration, you must remove all the previously imported Projects from Snyk, delete the API token and its Projects, set up the new app integration, and reimport your Projects to Snyk from the new integration.
{% hint style="info" %}
Before going through the migration process, you should note that the following Project-level information will not persist:
* Historic Project-related data, including trend numbers for fixing vulnerabilities
* Project-related metadata: ignores and tags
{% endhint %}
### Migration process
The migration process includes the following steps:
1. [Deleting the existing Projects](#delete-existing-projects) that are connected to the Bitbucket Cloud API token integration in Snyk.
2. [Disconnecting the PAT integration](#disconnect-the-pat-integration) in Snyk.
3. Removing the first-party extension for the PAT integration in Bitbucket (optional). This step is explained in the [Disconnect the PAT integration](#disconnect-the-pat-integration) section.
4. [Connecting the Bitbucket Cloud App](#set-up-the-bitbucket-cloud-app-integration) and importing Projects.
#### Delete existing Projects
Delete all the existing Projects in Snyk that were previously imported from the Legacy integration. To use the bulk delete action on the Projects page, change the grouping filter to **Group by none**. You can then select multiple Projects in the list individually, or select the checkbox at the top to **Select all visible projects**. To delete the selected Projects, click the trash icon (**Delete selected projects**).
Change the Projects filter to Group by none
Bulk delete the selected Projects
#### Disconnect the PAT integration
To disconnect the Bitbucket Cloud PAT integration, navigate to the settings page of Bitbucket Cloud integration, scroll to the relevant section, and click **Disconnect.**
Remove the Snyk tab for the PAT integration in Bitbucket Cloud (optional)
The Bitbucket Cloud integration has an optional first-party interface app.
This app can be installed on your Bitbucket Cloud workspace to enrich the PAT integration with a first-party interface (the Snyk tab).
If you have used this app, before setting up the Snyk Bitbucket Cloud App in the next step, remove the previous interface app in Bitbucket Cloud.\
This functionality is supported out-of-the-box in the Snyk App integration.\
\
Go to your **Workspace settings** page in **Bitbucket.org** > **Manage installed apps**, expand the **Snyk Security for Bitbucket Cloud** app, and click **Remove.**
Remove the first-party Snyk Legacy interface app in Bitbucket
#### Set up the Bitbucket Cloud App integration
See the [Bitbucket Cloud App integration](https://docs.snyk.io/developer-tools/scm-integrations/organization-level-integrations/bitbucket-cloud-app) topic for instructions.
---
# Source: https://docs.snyk.io/developer-tools/scm-integrations/organization-level-integrations/bitbucket-data-center-server.md
# Bitbucket Data Center/Server
The Bitbucket Data Center/Server integration allows you to continuously perform security scanning across all the integrated repositories, detect vulnerabilities in your open-source components, and use automated fixing. This integration supports Bitbucket Data Center/Server versions 4.0 and above.
For a quick reference, see the [Snyk and Bitbucket best practices cheat sheet](https://snyk.io/blog/snyk-bitbucket-best-practices-cheat-sheet/) on the Snyk blog.
### How to set up a Bitbucket DC/Server Integration
1. To give Snyk access to your Bitbucket DC/Server account, set up a dedicated service account in Bitbucket DC/Server with admin permissions.\
Visit [Bitbucket Server documentation ](https://confluence.atlassian.com/bitbucketserver/users-and-groups-776640439.html#Usersandgroups-Creatingauser)to learn more about creating users.\
Ensure the newly-created user has **Admin** permissions to all the repositories you need to monitor with Snyk.
2. In Snyk, navigate to the **Integrations** page and click on the **Bitbucket Server** card.
3. Enter your Bitbucket DC/Server URL and the username and password for the service account you created. Alternatively, you can create a [personal access token](https://confluence.atlassian.com/bitbucketserver075/personal-access-tokens-1018784848.html) and use it instead of a password.
1. If your Bitbucket DC/Server instance has Basic Auth disabled, you must use a personal access token.
2. Specify `x-access-token` for the username, and provide the personal access token instead of a password.
4. **Save** your changes.\
Snyk connects to your Bitbucket DC/Server instance.\
When the connection succeeds, a confirmation message appears on your integrations screen.
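Before saving, you can optionally sanity-check the service-account credentials from a terminal. This is a sketch, not part of the official setup; the host and token values are placeholders, and `/rest/api/1.0/projects` is the standard Bitbucket DC/Server REST API endpoint for listing projects:

```shell
# Placeholder values - replace with your Bitbucket DC/Server host and token
BITBUCKET_URL="https://your.bitbucket-server.domain.com"
BITBUCKET_PAT="your-personal-access-token"
API="${BITBUCKET_URL}/rest/api/1.0"

# A 200 response listing projects confirms the token and its permissions work.
# With Basic Auth disabled, send the personal access token as a Bearer token.
curl -fsS -H "Authorization: Bearer ${BITBUCKET_PAT}" "${API}/projects?limit=1" \
  || echo "Could not reach ${BITBUCKET_URL}; check the host, token, and permissions"
```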
### How to import Bitbucket Server repositories
To select the repositories for Snyk to monitor:
1. Click **Add your Bitbucket Server repositories to Snyk** to start importing repositories to Snyk.
2. When prompted, select the repositories to import to Snyk and click **Add selected repositories**.
After they are added, Snyk scans the selected repositories for dependency files (such as `package.json` or `pom.xml`) in the entire directory tree and imports them to Snyk as Projects.\
\
The imported Projects appear on your Snyk **Projects** page and are continuously checked for vulnerabilities.
### Bitbucket DC/Server Integration Features
After the integration is in place, you can use capabilities such as:
* [Project-level security reports](#project-level-security-reports)
* [Project monitoring and automatic fix pull requests](#project-monitoring-and-automatic-fix-pull-requests)
* [Pull request testing](#pull-request-tests)
#### Project-level security reports
Snyk produces advanced [security reports](https://docs.snyk.io/manage-risk/reporting/legacy-reports/legacy-reports-overview) that let you explore the vulnerabilities found in your repositories and fix them immediately by opening a fix pull request directly to your repository with the required upgrades or patches.
The example that follows shows a Project-level security report.
Project-level security report
#### Project monitoring and automatic fix pull requests
Snyk scans your Projects on either a daily or a weekly basis. When new vulnerabilities are found, Snyk notifies you by email and by opening automated Snyk pull requests with fixes for your repositories.
The example that follows shows a fix pull request opened by Snyk.
Pull request opened by Snyk
To review and adjust the automatic fix pull request settings:
1. In Snyk, navigate to **Organization settings** > **Integrations** > **Source control** > **Bitbucket Server**, and click **Edit Settings**.
2. Scroll to the **Automatic fix PRs** section and configure the relevant options.
Automatic fix PR settings
{% hint style="info" %}
Snyk pull requests are automatically assigned to the default reviewer set in your Bitbucket Server/Data Center account.
This differs from the Snyk Bitbucket Cloud integration, where Snyk pull requests are *not* automatically assigned to the default reviewer set in your Bitbucket Cloud account.
For more information, see [Automated pull request creation for new fixes](https://docs.snyk.io/scan-with-snyk/pull-requests/snyk-pull-or-merge-requests/create-automatic-prs-for-new-fixes-fix-prs).
{% endhint %}
#### Pull request tests
{% hint style="info" %}
Snyk Code PR Checks are only available for Bitbucket DC/Server versions 7.0 and above.
{% endhint %}
Snyk tests any newly created pull request in your repositories for security vulnerabilities and sends a build check to Bitbucket DC/Server. You can see directly from Bitbucket DC/Server whether or not the pull request introduces new security issues.
The example that follows shows a Snyk pull request build check on the Bitbucket DC/Server **Pull Request** page.

To review and adjust the pull request tests settings:
1. In Snyk, navigate to **Organization settings** > **Integrations** > **Source control** > **Bitbucket Server** and click **Edit Settings**.
2. Scroll to **Default Snyk test for pull requests** > **Open Source Security & Licenses**, and configure the relevant options.
Configure PR Checks for your integration
#### Required Builds
The Snyk integration for Bitbucket Data Center/Server supports the [Required Builds](https://confluence.atlassian.com/bitbucketserver/checks-for-merging-pull-requests-776640039.html) feature for granular control over pull requests.
You can select which Snyk security checks must pass before a merge can occur. Snyk reports distinct build statuses for different scan types (security vulnerabilities, license compliance, and code analysis), so you can configure your Bitbucket Data Center/Server repository to enforce specific security gates.
To configure:
1. Open a pull request from the imported repository to make the build statuses available in the settings list within Bitbucket.
2. In the Bitbucket **Repository settings**, navigate to **Required builds** then **Add builds**.\
You can select the protected target branches by name, pattern, or branching model, and specify any source branches that are exempt from the required builds when merging to the target branch.
### Required permission scopes for the Bitbucket DC/Server integration
Snyk performs all the Bitbucket DC/Server operations on behalf of the integrated service account.
For Snyk to perform the required operations on monitored repositories, such as reading manifest files on a frequent basis and opening fix or upgrade PRs, the integrated Bitbucket DC/Server service account needs **Admin** permissions on the imported repositories.
**Admin** permissions are also needed to set secure webhooks. Snyk relies on webhooks to perform a variety of tasks, from PR checks to commit tests on merge events and upcoming auto imports. To ensure the events come from your system and your system only, with no tampering or spoofing, Snyk secures the webhooks using the method recommended by the system it connects to. For Bitbucket Server, see [Securing your webhook](https://confluence.atlassian.com/bitbucketserver/manage-webhooks-938025878.html#Managewebhooks-webhooksecretsSecuringyourwebhook) in the Atlassian documentation.\
To do this, a secret token is generated for each secure webhook Snyk creates. Snyk setting the webhooks resolves scalability constraints, eliminates token leakage, and reduces the integration workload for you.
For detailed information on the permission scopes required, see [Bitbucket permission requirements](https://docs.snyk.io/developer-tools/user-permissions-and-access-scopes#bitbucket-cloud-and-bitbucket-data-center-server-scopes).
### **How to disconnect the Bitbucket Data Center/Server integration**
{% hint style="warning" %}
When you disconnect Snyk from your Bitbucket repository projects, your credentials are removed from Snyk, and any integration-specific projects that Snyk is monitoring are deactivated in Snyk.\
To re-enable this integration later, you must re-enter your credentials and activate your Projects.
{% endhint %}
To disable this integration, in **Organization settings** > **Integrations**, follow these steps:
1. In your list of integrations, select the Bitbucket integration you want to deactivate and click **Edit settings** to open a page with the current status of your integration.\
\
The page includes sections specific to each integration, where you can manage your credentials, API key, Service Principal, or connection details.
2. Scroll to the relevant section and click **Remove Bitbucket Server.**
### Migration from Bitbucket Server to Bitbucket Data Center
Usually, migrating from Bitbucket Server to Bitbucket Data Center requires no further action. The Snyk integration should keep working, as the Bitbucket Server and Bitbucket Data Center APIs are identical.
Action is required only when the new Bitbucket Data Center instance URL differs from the Bitbucket Server instance URL. In this case, you must reconnect the integration from the Bitbucket Server-Bitbucket Data Center integration page in Snyk.io. To reconnect, follow the steps in [How to set up a Bitbucket DC/Server Integration](#how-to-set-up-a-bitbucket-dc-server-integration).
---
# Source: https://docs.snyk.io/developer-tools/scm-integrations/group-level-integrations/bitbucket-for-snyk-essentials.md
# Bitbucket for Snyk Essentials
The Integrations page shows all active integrations, including automatically synced data from your existing Snyk Organizations, and provides access to the Integration Hub.
{% hint style="info" %}
The Bitbucket Cloud App is not supported at the Group level. The available options at the Group level are Bitbucket Cloud and Bitbucket Server.
To improve security, the use of app passwords in Bitbucket Cloud is transitioning to API tokens. Existing integrations that use app passwords will continue to function until 9 June 2026. To ensure continued support and functionality, update your Bitbucket Cloud integration in Snyk to use an API token.
{% endhint %}
{% hint style="info" %}
Bitbucket Server and Bitbucket Cloud do not support automatic language detection. You can manually add language tags to a Bitbucket Cloud repository.\
After manually setting up the languages in your Bitbucket project, Snyk can automatically detect and ingest all those languages in your Snyk Essentials application.
{% endhint %}
## Pulled entities
* Users
* Repositories
#### Prerequisites
To configure a Group-level integration, you must be a Group Admin or have a custom role that includes the `Edit Snyk Essentials` permissions under the [Group-level permissions](https://docs.snyk.io/snyk-platform-administration/user-roles/pre-defined-roles#group-level-permissions).
## Integrate using Snyk Essentials
1. Profile name (`mandatory`): Enter your integration profile name.
2. Service type (`mandatory`): Select the service type: Cloud or Server.
3. User email (`mandatory`): Enter your Atlassian account email.
4. API token (`mandatory`): Create your Bitbucket [API token](https://support.atlassian.com/bitbucket-cloud/docs/create-an-api-token/) from your Bitbucket Organization, with the following permissions:
* `read:user:bitbucket`
* `read:workspace:bitbucket`
* `read:repository:bitbucket`
5. Broker token (`mandatory`): Create and add your Broker token if you use Snyk Broker. This step applies only to Bitbucket Server instances that are not reachable through the internet.
Generate your Broker token by following the instructions on the [Obtain your Broker token for Snyk Broker](https://docs.snyk.io/implementation-and-setup/enterprise-setup/snyk-broker/classic-broker/prepare-snyk-broker-for-deployment/obtain-the-tokens-required-to-set-up-snyk-broker) page. Copy and paste the Broker token into the integration setup menu in the Integration Hub.
6. Add Backstage Catalog (`optional`). If you want to add your Backstage catalog, follow the instructions from the [Backstage file for SCM Integrations](https://docs.snyk.io/developer-tools/scm-integrations/application-context-for-scm-integrations) page.
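If you maintain a Backstage catalog, the file you point Snyk Essentials at is a standard Backstage descriptor. The following is a minimal sketch in the generic Backstage format; the component name, description, and owner are placeholders, and the linked page above describes what Snyk expects:

```yaml
# catalog-info.yaml - minimal Backstage component descriptor (placeholder values)
apiVersion: backstage.io/v1alpha1
kind: Component
metadata:
  name: example-service
  description: Example service tracked in the Backstage catalog
spec:
  type: service
  owner: team-platform
  lifecycle: production
```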
## API version
You can use the [Bitbucket REST API v2](https://developer.atlassian.com/bitbucket/api/2/reference/resource/) reference to access information about the API.
---
# Source: https://docs.snyk.io/developer-tools/snyk-ci-cd-integrations/bitbucket-pipelines-integration-using-a-snyk-pipe/bitbucket-pipelines-integration-how-it-works.md
# Bitbucket Pipelines integration: how it works
After you have added the Snyk pipe to the pipeline, each time the pipeline executes (by any trigger type), the Snyk pipe performs the following actions:
## **Scan**
1. Snyk scans app dependencies or container images for vulnerabilities and open-source license issues, and lists the vulnerabilities and issues.
2. If Snyk finds vulnerabilities, it does one of the following (based on your configuration):
* Fails the build
* Lets the build complete
## **Monitor**
Optionally, if the build completes successfully and **MONITOR** is set to **True** in the Snyk step, then Snyk saves a snapshot of the Project dependencies from the Snyk Web UI. From the Snyk Web UI, you can view the dependency tree displaying all of the issues and receive alerts for new issues found in the existing app version.
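As a sketch, a pipeline step with monitoring enabled might look like the following; `MONITOR` and `DONT_BREAK_BUILD` are Snyk pipe variables, and the values shown are illustrative:

```yaml
- step:
    name: Snyk scan and monitor
    script:
      - pipe: snyk/snyk-scan:1.0.1
        variables:
          SNYK_TOKEN: $SNYK_TOKEN
          MONITOR: "true"            # snapshot Project dependencies to the Snyk Web UI
          DONT_BREAK_BUILD: "false"  # fail the build when issues are found
```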
---
# Source: https://docs.snyk.io/developer-tools/snyk-ci-cd-integrations/bitbucket-pipelines-integration-using-a-snyk-pipe.md
# Bitbucket Pipelines integration using a Snyk pipe
Snyk integrates with Bitbucket Pipelines using a Snyk pipe, seamlessly scanning your application dependencies and Docker images for security vulnerabilities as part of the continuous integration/continuous delivery (CI/CD) workflow.
[Bitbucket Pipes](https://bitbucket.org/blog/meet-bitbucket-pipes-30-ways-to-automate-your-ci-cd-pipeline) enables users to customize and automate a Bitbucket Pipeline CI/CD workflow with a group of ready-to-use tasks that can be added inside of your pipelines by copying and pasting them from the Bitbucket interface.
With the Snyk pipe, you can quickly add Snyk scanning to your pipelines to test and monitor for vulnerabilities at different points in the CI/CD workflow, based on your configurations. Results are then displayed in the Bitbucket Pipelines output view and can also be monitored on the [Snyk Web UI](http://app.snyk.io).
## Snyk pipe information in Bitbucket
From the build directory, Bitbucket Pipelines displays a list of available pipes customized for you, similar to the list in the following screen image:

On this list, find and click **Snyk** to view the pipe, examples, parameters, and values:

## Setup and use details
For setup and use details, see the following pages:
* [Language support for Bitbucket Pipelines integration](https://docs.snyk.io/developer-tools/snyk-ci-cd-integrations/bitbucket-pipelines-integration-using-a-snyk-pipe/language-support-for-bitbucket-pipelines-integration)
* [Bitbucket Pipelines integration: how it works](https://docs.snyk.io/developer-tools/snyk-ci-cd-integrations/bitbucket-pipelines-integration-using-a-snyk-pipe/bitbucket-pipelines-integration-how-it-works)
* [Prerequisites for Bitbucket Pipelines integration](https://docs.snyk.io/developer-tools/snyk-ci-cd-integrations/bitbucket-pipelines-integration-using-a-snyk-pipe/prerequisites-for-bitbucket-pipelines-integration)
* [Configure your Bitbucket Pipelines integration](https://docs.snyk.io/developer-tools/snyk-ci-cd-integrations/bitbucket-pipelines-integration-using-a-snyk-pipe/configure-your-bitbucket-pipelines-integration)
* [How to add a Snyk pipe](https://docs.snyk.io/developer-tools/snyk-ci-cd-integrations/bitbucket-pipelines-integration-using-a-snyk-pipe/how-to-add-a-snyk-pipe)
* [Snyk pipe parameters and values (Bitbucket Cloud)](https://docs.snyk.io/developer-tools/snyk-ci-cd-integrations/bitbucket-pipelines-integration-using-a-snyk-pipe/snyk-pipe-parameters-and-values-bitbucket-cloud)
* [Snyk pipe examples for Bitbucket Cloud](https://bitbucket.org/snyk/snyk-scan/src/develop/README.md)
---
# Source: https://docs.snyk.io/developer-tools/snyk-ci-cd-integrations/snyk-images-guides-to-migration/bitbucket-pipelines-migration.md
# BitBucket Pipelines migration
## For users of `snyk/snyk-scan` < v1.0.0
## For users of `snyk/snyk-scan` >= v1.0.0
### Create your own custom image
Users can create their own custom images to use. This option is available for `snyk/snyk-scan` >= v1.0.0 only. For details, see [User-defined custom images for CLI.](https://docs.snyk.io/developer-tools/snyk-ci-cd-integrations/user-defined-custom-images-for-cli)
Creating a custom image should guarantee compatibility with your system. However, there are alternative images to which you can upgrade if creating a custom image is not possible.
### Upgrade to a supported Snyk Image
After you have validated that you are using a Snyk Image that will be removed, as outlined [for users of `snyk/snyk-scan` < v1.0.0](#users-using-snyk-snyk-scan-less-than-v1.0.0), refer to the [Snyk images migration](https://docs.snyk.io/developer-tools/snyk-ci-cd-integrations/snyk-images-guides-to-migration/snyk-images-migration) guidelines to view upgrade paths for your configuration.
{% hint style="info" %}
Remember to use pinned versions where available for better stability. For example, `snyk/snyk:dotnet-8.0` is preferable to `snyk/snyk:dotnet`.
{% endhint %}
An example follows of upgrading to a supported Snyk Image.
In the example `bitbucket-pipeline.yml` configuration that follows, a Snyk image is configured that will be removed on 12 Aug 2024:
```yaml
# Example bitbucket-pipelines.yml using `snyk/snyk:node-16` Snyk Image
# Template NodeJS build
# This template allows you to validate your NodeJS code.
# The workflow allows running tests and code linting on the default branch.
image: atlassian/default-image:latest

pipelines:
  default:
    - parallel:
        - step:
            name: Build
            caches:
              - node
            script:
              - npm install
        - step:
            name: Snyk scan
            script:
              - pipe: snyk/snyk-scan:1.0.1
                variables:
                  SNYK_TOKEN: $SNYK_TOKEN
                  LANGUAGE: "node-16" # <------ Using the `snyk/snyk:node-16` Snyk Image
                  EXTRA_ARGS: "--all-projects" # Optional
                  DEBUG: "true" # Optional
```
Following the [Snyk images migration](https://docs.snyk.io/developer-tools/snyk-ci-cd-integrations/snyk-images-guides-to-migration/snyk-images-migration) guidelines, you can upgrade to a supported Snyk Image as shown here:
```yaml
# Upgrading to supported Snyk Image `snyk/snyk:node-22`
# Template NodeJS build
# This template allows you to validate your NodeJS code.
# The workflow allows running tests and code linting on the default branch.
image: atlassian/default-image:latest

pipelines:
  default:
    - parallel:
        - step:
            name: Build
            caches:
              - node
            script:
              - npm install
        - step:
            name: Snyk scan
            script:
              - pipe: snyk/snyk-scan:1.0.1
                variables:
                  SNYK_TOKEN: $SNYK_TOKEN
                  LANGUAGE: "node-22" # <------ Upgrade to the `snyk/snyk:node-22` Snyk Image
                  EXTRA_ARGS: "--all-projects" # Optional
                  DEBUG: "true" # Optional
```
## Download and install Snyk CLI directly
If you do not want to use the Bitbucket `snyk/snyk-scan` integration, you have the option to install and use the Snyk CLI directly.
{% hint style="info" %}
If you use this option, you will be unable to use integration features such as Code Insight Results
{% endhint %}
The following example shows using the CLI directly.
In the example `bitbucket-pipeline.yml` configuration that follows, a pipeline is configured that does the following:
* Downloads the CLI
* Validates the CLI with a SHASUM check
* Runs the CLI to test the code
```yaml
image: node:18

pipelines:
  default:
    - step:
        name: Build
        caches:
          - node
        script:
          - npm install
    - step:
        name: Snyk scan
        script:
          # Download Snyk Linux CLI
          - curl https://downloads.snyk.io/cli/latest/snyk-linux -o snyk-linux
          # Download Snyk Linux CLI SHASUM
          - curl https://downloads.snyk.io/cli/latest/snyk-linux.sha256 -o snyk.sha256
          # Validate binary using SHASUM
          - sha256sum -c snyk.sha256
          # Configure CLI for execution
          - chmod +x snyk-linux
          - mv snyk-linux /usr/local/bin/snyk
          # Run Snyk CLI
          - snyk test --all-projects -d
```
---
# Source: https://docs.snyk.io/implementation-and-setup/enterprise-setup/snyk-broker/classic-broker/install-and-configure-snyk-broker/bitbucket-server-data-center-prerequisites-and-steps-to-install-and-configure-broker/bitbucket-server-data-center-environment-variables-for-snyk-broker-basic-auth.md
# Bitbucket Server/Data Center - environment variables for Snyk Broker Basic Auth
The following environment variables are required to configure the Broker client:
* `BROKER_TOKEN` - the Snyk Broker token, obtained from your Bitbucket Server integration settings view (app.snyk.io).
* `BROKER_SERVER_URL` - the URL of the Broker server for the region in which your data is hosted. For the commands and URLs to use, see [Broker URLs](https://docs.snyk.io/snyk-data-and-governance/regional-hosting-and-data-residency#broker-server-urls).
* `BITBUCKET_USERNAME` - the Bitbucket Server username.
* `BITBUCKET_PASSWORD` - the Bitbucket Server password. You can use an API token in place of a password.
* `BITBUCKET` - the hostname of your Bitbucket Server deployment, such as `your.bitbucket-server.domain.com`.
* `BITBUCKET_API` - the API endpoint of your Bitbucket Server deployment. Should be `$BITBUCKET/rest/api/1.0`.
* `BROKER_CLIENT_URL` - the full URL of the Broker client as it will be accessible to your Bitbucket Server for webhooks, such as `http://broker.url.example:8000`. This URL is required to access features such as PR Fixes or PR Checks.
* This must include `http://` and the port number.
* To configure the client with HTTPS, [additional settings are required](https://docs.snyk.io/implementation-and-setup/enterprise-setup/snyk-broker/https-for-broker-client-with-docker).
* `PORT` - the local port at which the Broker client accepts connections. Default is 8000.
* `ACCEPT_IAC` - by default, some file types used by Infrastructure-as-Code (IaC) are not enabled. To grant the Broker access to IaC files in your repository, such as Terraform files, add an environment variable `ACCEPT_IAC` with any combination of `tf,yaml,yml,json,tpl`.
* `ACCEPT_CODE` - by default, Snyk Code will not load code snippets. To enable code snippets you can add an environment variable `ACCEPT_CODE=true`.
* `ACCEPT_ESSENTIALS` - enable Snyk Essentials to identify your application assets, monitor them, and prioritize the risks. To enable Snyk Essentials, add the environment variable `ACCEPT_ESSENTIALS=true`.
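Pulled together, the variables above might be exported as follows before launching the Broker client. This is a sketch with placeholder values; note that `BITBUCKET_API` is always derived from `BITBUCKET`:

```shell
# Placeholder values for the Broker client environment
export BROKER_TOKEN="your-broker-token"
export BROKER_SERVER_URL="https://broker.snyk.io"   # region-specific; see Broker URLs
export BITBUCKET_USERNAME="svc-snyk"
export BITBUCKET_PASSWORD="your-password-or-api-token"
export BITBUCKET="your.bitbucket-server.domain.com"
export BITBUCKET_API="${BITBUCKET}/rest/api/1.0"    # always $BITBUCKET/rest/api/1.0
export BROKER_CLIENT_URL="http://broker.url.example:8000"
export PORT=8000
export ACCEPT_IAC="tf,yaml,yml,json,tpl"
export ACCEPT_CODE=true
```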
---
# Source: https://docs.snyk.io/implementation-and-setup/enterprise-setup/snyk-broker/classic-broker/install-and-configure-snyk-broker/bitbucket-server-data-center-prerequisites-and-steps-to-install-and-configure-broker/bitbucket-server-data-center-environment-variables-for-snyk-broker-personal-access-token-pat.md
# Bitbucket Server/Data Center - environment variables for Snyk Broker Personal Access Token (PAT)
The following environment variables are required to configure the Broker client:
* `BROKER_TOKEN` - the Snyk Broker token, obtained from your Bitbucket Server integration settings view (app.snyk.io).
* `BROKER_SERVER_URL` - the URL of the Broker server for the region in which your data is hosted. For the commands and URLs to use, see [Broker URLs](https://docs.snyk.io/snyk-data-and-governance/regional-hosting-and-data-residency#broker-server-urls).
* `BITBUCKET_PAT` - the Bitbucket Server Personal Access Token.
* `BITBUCKET` - the hostname of your Bitbucket Server deployment, such as `your.bitbucket-server.domain.com`.
* `BITBUCKET_API` - the API endpoint of your Bitbucket Server deployment. Should be `$BITBUCKET/rest/api/1.0`.
* `BROKER_CLIENT_URL` - the full URL of the Broker client as it will be accessible to your Bitbucket Server for webhooks, such as `http://broker.url.example:8000`. This URL is required to access features such as PR Fixes or PR Checks.
* This must include `http://` and the port number.
* To configure the client with HTTPS, [additional settings are required](https://docs.snyk.io/implementation-and-setup/enterprise-setup/snyk-broker/https-for-broker-client-with-docker).
* `PORT` - the local port at which the Broker client accepts connections. Default is 8000.
* `ACCEPT_IAC` - by default, some file types used by Infrastructure-as-Code (IaC) are not enabled. To grant the Broker access to IaC files in your repository, such as Terraform files, add an environment variable `ACCEPT_IAC` with any combination of `tf,yaml,yml,json,tpl`.
* `ACCEPT_CODE` - by default, Snyk Code will not load code snippets. To enable code snippets you can add an environment variable `ACCEPT_CODE=true`.
* `ACCEPT_ESSENTIALS` - enable Snyk Essentials to identify your application assets, monitor them, and prioritize the risks. To enable Snyk Essentials, add the environment variable `ACCEPT_ESSENTIALS=true`.
---
# Source: https://docs.snyk.io/implementation-and-setup/enterprise-setup/snyk-broker/classic-broker/install-and-configure-snyk-broker/bitbucket-server-data-center-prerequisites-and-steps-to-install-and-configure-broker/bitbucket-server-data-center-install-and-configure-using-docker.md
# Bitbucket Server/Data Center - install and configure using Docker
Before installing, review the [prerequisites](https://docs.snyk.io/implementation-and-setup/enterprise-setup/snyk-broker/classic-broker/install-and-configure-snyk-broker/bitbucket-server-data-center-prerequisites-and-steps-to-install-and-configure-broker) and the general instructions for installation using [Docker](https://docs.snyk.io/implementation-and-setup/enterprise-setup/snyk-broker/classic-broker/install-and-configure-snyk-broker/install-and-configure-broker-using-docker).
This integration is useful to ensure a secure connection with your on-premise Bitbucket deployment.
This page describes two distinct authentication schemes: using [Basic Auth](#configure-broker-to-be-used-with-bitbucket-using-basic-auth) and [using an API token](#configure-broker-to-be-used-with-bitbucket-using-an-api-token). Your Bitbucket Server settings might preclude Basic Auth usage, in which case Bearer Auth is preferred.
## Configure Broker to be used with Bitbucket using Basic Auth
The following explains how to configure Snyk Broker to use the Broker Client with a Bitbucket Server deployment.
To use the Snyk Broker Client with Bitbucket, run `docker pull snyk/broker:bitbucket-server`. Refer to [Bitbucket Server/Data Center - environment variables for Snyk Broker Basic Auth](https://docs.snyk.io/implementation-and-setup/enterprise-setup/snyk-broker/classic-broker/install-and-configure-snyk-broker/bitbucket-server-data-center-prerequisites-and-steps-to-install-and-configure-broker/bitbucket-server-data-center-environment-variables-for-snyk-broker-basic-auth) for definitions of the environment variables.
If necessary, navigate to the [Advanced configuration page](https://docs.snyk.io/implementation-and-setup/enterprise-setup/snyk-broker/classic-broker/install-and-configure-snyk-broker/advanced-configuration-for-snyk-broker-docker-installation) and make any configuration changes needed, such as providing the CA (Certificate Authority) to the Broker Client configuration if the Bitbucket instance is using a private certificate, and setting up [proxy support](https://docs.snyk.io/implementation-and-setup/enterprise-setup/snyk-broker/classic-broker/install-and-configure-snyk-broker/advanced-configuration-for-snyk-broker-docker-installation/proxy-support-with-docker).
## Docker run command to set up a Broker Client for Bitbucket using Basic Auth
Copy the following command to set up a fully configured Broker Client to analyze Open Source, IaC, Container, Code files, and Snyk Essentials information.
{% hint style="info" %}
**Multi-tenant settings for regions**\
When installing, you must add a command in your script to set the `BROKER_SERVER_URL`. This is the URL of the Broker server for the region where your data is hosted. For the commands and URLs to use, see [Broker URLs](https://docs.snyk.io/snyk-data-and-governance/regional-hosting-and-data-residency#broker-server-urls).
{% endhint %}
```bash
docker run --restart=always \
-p 8000:8000 \
-e BROKER_TOKEN= \
-e BROKER_SERVER_URL= \
-e BITBUCKET_USERNAME= \
-e BITBUCKET_PASSWORD= \
-e BITBUCKET= \
-e BITBUCKET_API= \
-e PORT=8000 \
-e BROKER_CLIENT_URL= \
-e ACCEPT_IAC=tf,yaml,yml,json,tpl \
-e ACCEPT_CODE=true \
-e ACCEPT_ESSENTIALS=true \
snyk/broker:bitbucket-server
```
{% hint style="info" %}
Snyk Essentials is set by default to **`false`**. Enable it by setting the flag to **`true`**.
{% endhint %}
## Configure Broker to be used with Bitbucket using an API token
The following explains how to configure Snyk Broker to use the Broker Client with a Bitbucket Server deployment using an API token.
To use the Snyk Broker Client with Bitbucket, run `docker pull snyk/broker:bitbucket-server-bearer-auth`. For definitions of the environment variables, refer to [Bitbucket Server/Data Center - environment variables for Snyk Broker Basic Auth](https://docs.snyk.io/implementation-and-setup/enterprise-setup/snyk-broker/classic-broker/install-and-configure-snyk-broker/bitbucket-server-data-center-prerequisites-and-steps-to-install-and-configure-broker/bitbucket-server-data-center-environment-variables-for-snyk-broker-basic-auth) and [Bitbucket Server/Data Center - environment variables for Snyk Broker Personal Access Token (PAT)](https://docs.snyk.io/implementation-and-setup/enterprise-setup/snyk-broker/classic-broker/install-and-configure-snyk-broker/bitbucket-server-data-center-prerequisites-and-steps-to-install-and-configure-broker/bitbucket-server-data-center-environment-variables-for-snyk-broker-personal-access-token-pat).
If necessary, go to the [Advanced configuration page](https://docs.snyk.io/implementation-and-setup/enterprise-setup/snyk-broker/classic-broker/install-and-configure-snyk-broker/advanced-configuration-for-snyk-broker-docker-installation) and make any configuration changes needed, such as providing the CA (Certificate Authority) to the Broker Client configuration if the Bitbucket instance is using a private certificate, and setting up [proxy support](https://docs.snyk.io/implementation-and-setup/enterprise-setup/snyk-broker/classic-broker/install-and-configure-snyk-broker/advanced-configuration-for-snyk-broker-docker-installation/proxy-support-with-docker).
## Docker run command to set up a Broker Client for Bitbucket using a PAT
Copy the following command to set up a fully configured Broker Client to analyze Open Source, IaC, Container, Code files, and Snyk Essentials information. Enable Snyk Essentials to identify your application assets, monitor them, and prioritize the risks.
```bash
docker run --restart=always \
-p 8000:8000 \
-e BROKER_TOKEN= \
-e BITBUCKET_PAT= \
-e BITBUCKET= \
-e BITBUCKET_API= \
-e PORT=8000 \
-e BROKER_CLIENT_URL= \
-e ACCEPT_IAC=tf,yaml,yml,json,tpl \
-e ACCEPT_CODE=true \
-e ACCEPT_ESSENTIALS=true \
snyk/broker:bitbucket-server-bearer-auth
```
{% hint style="info" %}
Snyk Essentials is set by default to `false`. Enable it by setting the flag to `true`.
{% endhint %}
## Start the Broker Client container and verify the connection with Bitbucket
Paste the Broker Client configuration to start the Broker Client container.
Once the container is up, the Bitbucket Integrations page shows the connection to Bitbucket, and you can `Add Projects`.
## Basic troubleshooting for Broker with Bitbucket
* Run `docker logs <container id>` to look for any errors, where `<container id>` is the Bitbucket Broker container ID.
* Ensure relevant ports are exposed to Bitbucket.
---
# Source: https://docs.snyk.io/implementation-and-setup/enterprise-setup/snyk-broker/classic-broker/install-and-configure-snyk-broker/bitbucket-server-data-center-prerequisites-and-steps-to-install-and-configure-broker/bitbucket-server-data-center-install-and-configure-using-helm.md
# Bitbucket Server/Data Center - install and configure using Helm
Before installing, review the [prerequisites](https://docs.snyk.io/implementation-and-setup/enterprise-setup/snyk-broker/classic-broker/install-and-configure-snyk-broker/bitbucket-server-data-center-prerequisites-and-steps-to-install-and-configure-broker) and the general instructions for installation using [Helm](https://docs.snyk.io/implementation-and-setup/enterprise-setup/snyk-broker/classic-broker/install-and-configure-snyk-broker/install-and-configure-broker-using-helm).
To use this chart, you must first add the Snyk Broker Helm Chart by adding the repo:
`helm repo add snyk-broker https://snyk.github.io/snyk-broker-helm/`
Then, run the following commands to install the Broker and customize the environment variables. For definitions of the environment variables, refer to [Bitbucket Server/Data Center - environment variables for Snyk Broker Basic Auth](https://docs.snyk.io/implementation-and-setup/enterprise-setup/snyk-broker/classic-broker/install-and-configure-snyk-broker/bitbucket-server-data-center-prerequisites-and-steps-to-install-and-configure-broker/bitbucket-server-data-center-environment-variables-for-snyk-broker-basic-auth) and [Bitbucket Server/Data Center - environment variables for Snyk Broker Personal Access Token (PAT)](https://docs.snyk.io/implementation-and-setup/enterprise-setup/snyk-broker/classic-broker/install-and-configure-snyk-broker/bitbucket-server-data-center-prerequisites-and-steps-to-install-and-configure-broker/bitbucket-server-data-center-environment-variables-for-snyk-broker-personal-access-token-pat).
For the `bitbucket` and `bitbucketApi` values, do not include `https://`.
Snyk Essentials is set by default to `false`. Enable it by setting the flag to `true`.
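Because `bitbucket` and `bitbucketApi` must not include a scheme, a small pre-flight check can catch a pasted-in URL before it reaches the Helm command. This is an illustrative sketch, not part of the Snyk tooling:

```bash
# Strip an accidental scheme from a host value destined for --set bitbucket=...
# Illustrative helper only; not part of the Snyk Broker Helm chart.
normalize_host() {
  printf '%s\n' "$1" | sed -E 's#^https?://##'
}
```

For example, `normalize_host "https://bitbucket.mycompany.com"` prints `bitbucket.mycompany.com`, which is the form the chart expects.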
{% hint style="info" %}
**Multi-tenant settings for regions**\
When installing, you must add a command in your script to set the `brokerServerUrl`. This is the URL of the Broker server for the region where your data is hosted. For the commands and URLs to use, see [Broker URLs](https://docs.snyk.io/snyk-data-and-governance/regional-hosting-and-data-residency#broker-server-urls).
{% endhint %}
Use the following command to configure Broker to be used with Bitbucket Server using Basic Auth:
```
helm install snyk-broker-chart snyk-broker/snyk-broker \
--set scmType=bitbucket-server \
--set brokerToken= \
--set brokerServerUrl= \
--set bitbucketUsername= \
--set bitbucketPassword= \
--set bitbucket= \
--set bitbucketApi= \
--set brokerClientUrl=: \
--set enableEssentials=true \
-n snyk-broker --create-namespace
```
Use the following command to configure Broker to be used with Bitbucket Server using Bearer Auth (Personal Access Token):
```
helm install snyk-broker-chart snyk-broker/snyk-broker \
--set scmType=bitbucket-server-bearer-auth \
--set brokerToken= \
--set bitbucketPat= \
--set bitbucket= \
--set bitbucketApi= \
--set brokerClientUrl=: \
--set enableEssentials=true \
-n snyk-broker --create-namespace
```
You can pass any environment variable of your choice in the Helm command. For details, see [Custom additional options for Broker Helm Chart](https://docs.snyk.io/implementation-and-setup/enterprise-setup/snyk-broker/classic-broker/install-and-configure-snyk-broker/advanced-configuration-for-helm-chart-installation/custom-additional-options-for-broker-helm-chart-installation). Follow the instructions for [Advanced configuration for Helm Chart installation](https://docs.snyk.io/implementation-and-setup/enterprise-setup/snyk-broker/classic-broker/install-and-configure-snyk-broker/advanced-configuration-for-helm-chart-installation) to make configuration changes as needed.
You can verify that the Broker is running by looking at the settings for your brokered integration in the Snyk Web UI to see a confirmation message that you are connected. You can start importing Projects once you are connected.
---
# Source: https://docs.snyk.io/implementation-and-setup/enterprise-setup/snyk-broker/classic-broker/install-and-configure-snyk-broker/bitbucket-server-data-center-prerequisites-and-steps-to-install-and-configure-broker.md
# Bitbucket Server/Data Center - prerequisites and steps to install and configure Broker
{% hint style="info" %}
**Feature availability**
PR Checks for Bitbucket Server integrations require Bitbucket Server version 7.4 and above, or Bitbucket Data Center version 8 or above.\
\
When using a brokered connection, Snyk Broker version 4.206 and above is required.
{% endhint %}
Review the general instructions for the installation method you plan to use, [Helm](https://docs.snyk.io/implementation-and-setup/enterprise-setup/snyk-broker/classic-broker/install-and-configure-snyk-broker/install-and-configure-broker-using-helm) or [Docker](https://docs.snyk.io/implementation-and-setup/enterprise-setup/snyk-broker/classic-broker/install-and-configure-snyk-broker/install-and-configure-broker-using-docker).
Before installing the Bitbucket Server/Data Center Broker, ensure your Snyk account team provides you with a Broker token.
Docker or an equivalent method is required to run Docker Linux containers. Some Docker setups for Windows only support Windows containers. Ensure your deployment can run Linux containers.
After you meet all the prerequisites, you can continue with the steps to install using [Docker](https://docs.snyk.io/implementation-and-setup/enterprise-setup/snyk-broker/classic-broker/install-and-configure-snyk-broker/bitbucket-server-data-center-prerequisites-and-steps-to-install-and-configure-broker/bitbucket-server-data-center-install-and-configure-using-docker) or [Helm](https://docs.snyk.io/implementation-and-setup/enterprise-setup/snyk-broker/classic-broker/install-and-configure-snyk-broker/bitbucket-server-data-center-prerequisites-and-steps-to-install-and-configure-broker/bitbucket-server-data-center-install-and-configure-using-helm).
---
# Source: https://docs.snyk.io/developer-tools/snyk-cli/scan-and-maintain-projects-using-the-cli/cli-tools/snyk-scm-contributors-count/scripts-for-scm-contributors-count/bitbucket-server/bitbucket-server-examples.md
# Bitbucket Server - Examples
The following options are available for the `snyk-scm-contributors-count bitbucket-server` command:
```
--version Show version number [boolean]
--help Show help [boolean]
--token Bitbucket server token [required]
--url Bitbucket server base url e.g. (https://bitbucket.mycompany.com) [required]
--projectKeys [Optional] Bitbucket server project key to count contributors for
--repo [Optional] Specific repo to count only for
--exclusionFilePath [Optional] Exclusion list filepath
--json                    [Optional] JSON output, required when using the "consolidateResults" command
--skipSnykMonitoredRepos [Optional] Skip Snyk monitored repos and count contributors for all repos
--importConfDir [Optional] Generate an import file with the unmonitored repos: A path to a valid folder for the generated import files
--importFileRepoType [Optional] To be used with the importConfDir flag: Specify the type of repos to be added to the import file. Options: all/private/public. Default: all
```
## Before running the command
1. Export SNYK\_TOKEN (if you want to get the contributors ONLY for repos that are already monitored by Snyk):
* Make sure that your token has Group level access or use a service account's token that has Group level access. To learn more on how to create a service account, refer to [Service accounts](https://docs.snyk.io/implementation-and-setup/enterprise-setup/service-accounts).
* Copy the token value.
* Export the token in your environment:
```
export SNYK_TOKEN=
```
2. Get your Bitbucket Server token and URL:
* Create a Token if one does not exist, using this [guide](https://www.jetbrains.com/help/youtrack/standalone/integration-with-bitbucket-server.html#enable-youtrack-integration-bbserver).
**Note**: Make sure your token has read access to the repos.
* The URL is the actual URL of your Bitbucket Server instance, for example, `https://bitbucket.mycompany.com`.
## Running the command
Consider the following levels of usage and options:
### Usage levels
* To get commits for all projects and their repos in Bitbucket Server, provide the Bitbucket Server token and url:
```
snyk-scm-contributors-count bitbucket-server --token BITBUCKET-TOKEN --url BITBUCKET-URL
```
* To get commits for some projects and their repos in Bitbucket Server, provide the Bitbucket Server token, the Bitbucket Server URL, and the project keys, separated by a comma:
```
snyk-scm-contributors-count bitbucket-server --token BITBUCKET-TOKEN --url BITBUCKET-URL --projectKeys Key1,Key2...
```
* To get commits for a specific repo in Bitbucket Server, provide your Bitbucket Server token, Bitbucket Server URL, a project, and a repo name:
```
snyk-scm-contributors-count bitbucket-server --token BITBUCKET-TOKEN --url BITBUCKET-URL --projectKeys Key1 --repo Repo1
```
### Options
* To get all the commits from Bitbucket Server regardless of the repos that are already monitored by Snyk, add the `--skipSnykMonitoredRepos` flag.\
You might have repos in Bitbucket Server that are not monitored in Snyk. Use this flag to skip checking for Snyk monitored repos and go directly to Bitbucket Server to fetch the commits.
```
snyk-scm-contributors-count bitbucket-server --token BITBUCKET-TOKEN --url BITBUCKET-URL --skipSnykMonitoredRepos
```
* To exclude some contributors from being counted in the commits, add an exclusion file with the emails to ignore (separated by a new line) and add the `--exclusionFilePath` flag with the path to that file:
```
snyk-scm-contributors-count bitbucket-server --token BITBUCKET-TOKEN --url BITBUCKET-URL --projectKeys Key1,Key2 --exclusionFilePath PATH_TO_FILE
```
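The exclusion file itself is plain text with one email address per line, for example (hypothetical addresses):

```
bot@mycompany.com
jenkins@mycompany.com
```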
* To set the output to JSON format, add the `--json` flag:
```
snyk-scm-contributors-count bitbucket-server --token BITBUCKET-TOKEN --url BITBUCKET-URL --projectKeys Key1 --repo Repo1 --json
```
* To create an import file with your unmonitored repos, add the `--importConfDir` flag with a valid (writable) path to a folder in which the import files will be stored, and optionally add the `--importFileRepoType` flag with the repo types to add to the file (`all`/`private`/`public`; defaults to `all`). Note that these flags **cannot** be set with the `--repo` flag.
```
snyk-scm-contributors-count bitbucket-server --token BITBUCKET-TOKEN --url BITBUCKET-URL --importConfDir ValidPathToFolder --importFileRepoType private/public/all
```
For more information about these flags, refer to [Creating and using the import file](https://docs.snyk.io/developer-tools/snyk-cli/scan-and-maintain-projects-using-the-cli/cli-tools/snyk-scm-contributors-count/creating-and-using-the-import-file).
* To run in debug mode for verbose output, add `DEBUG=snyk*` to the beginning of the command:
```
DEBUG=snyk* snyk-scm-contributors-count bitbucket-server --token BITBUCKET-TOKEN --url BITBUCKET-URL --projectKeys Key1 --repo Repo1 --exclusionFilePath PATH_TO_FILE --skipSnykMonitoredRepos --json
```
---
# Source: https://docs.snyk.io/developer-tools/snyk-cli/scan-and-maintain-projects-using-the-cli/cli-tools/snyk-scm-contributors-count/scripts-for-scm-contributors-count/bitbucket-server/bitbucket-server-flow-and-tech.md
# Bitbucket Server - Flow and Tech
## Flow
1. Fetch the monitored projects from Snyk (if the `skipSnykMonitoredRepos` flag was **not set** and the `SNYK_TOKEN` was exported).
2. Fetch `one`/`some`/`all` the projects that the credentials have access to from SCM and create a projects list.
3. Fetch `one`/`all` repos under the fetched/provided projects.
4. Remove the repos that are not monitored by Snyk (if the `skipSnykMonitoredRepos` flag was **not set** and the `SNYK_TOKEN` was exported) and create a Repo list.
5. Create an import file for unmonitored repos, to use for easily importing those repos into your Snyk account (if the `importConfDir` flag was set).
6. Fetch the commits for the fetched/provided repo/s and create a Contributors list.
7. Count the commits for the repo/s by the contributors.
8. Remove the contributors that were specified in the exclusion file (if the `exclusionFilePath` flag was set and a valid path to a text file was provided).
9. Print the results.
## Bitbucket Server API endpoints used
* To get the repositories from Bitbucket Server, if a project was **not** provided: `/rest/api/1.0/repos`
* To get the repositories from Bitbucket Server, if a project or projects **were** provided: `/rest/api/1.0/projects/{Project}/repos`
* To get the commits for the fetched/provided repo/s list: `/rest/api/1.0/projects/{Project}/repos/{Repo}/commits`
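The commits endpoint can be assembled from the base URL, project key, and repo slug. A minimal sketch (function name and argument names are illustrative):

```bash
# Build the Bitbucket Server commits endpoint for a given project and repo.
# Illustrative only; "base" is your Bitbucket Server base URL.
commits_endpoint() {
  base=$1 project=$2 repo=$3
  printf '%s/rest/api/1.0/projects/%s/repos/%s/commits\n' "$base" "$project" "$repo"
}
```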
---
# Source: https://docs.snyk.io/developer-tools/snyk-cli/scan-and-maintain-projects-using-the-cli/cli-tools/snyk-scm-contributors-count/scripts-for-scm-contributors-count/bitbucket-server.md
# Bitbucket Server
* [Flow and Tech](https://docs.snyk.io/developer-tools/snyk-cli/scan-and-maintain-projects-using-the-cli/cli-tools/snyk-scm-contributors-count/scripts-for-scm-contributors-count/bitbucket-server/bitbucket-server-flow-and-tech)
* [Examples](https://docs.snyk.io/developer-tools/snyk-cli/scan-and-maintain-projects-using-the-cli/cli-tools/snyk-scm-contributors-count/scripts-for-scm-contributors-count/bitbucket-server/bitbucket-server-examples)
---
# Source: https://docs.snyk.io/scan-with-snyk/snyk-code/manage-code-vulnerabilities/breakdown-of-code-analysis.md
# Breakdown of Code analysis
When you import repositories, Snyk Code automatically tests for vulnerabilities within the imported code. The vulnerabilities detected across all files in a single repository are compiled into a Snyk Project, labeled as Code Analysis. Code Analysis presents the test outcome for a specific repository, listing all discovered vulnerabilities in the repository's source code.
The Code Analysis page
## Code analysis components
This table summarizes the elements of a Code analysis Project.

| Component | Description |
| --- | --- |
| Header | Includes the details of the imported repository with a link to the repository in the Git repository, the Project name, and the Projects tabs: **Overview**, **History**, and **Settings**. |
| Project Summary Information area | Includes the dates of the repository import and the last test of the repository, the **Retest now** option for an on-demand test, the name of the user who imported the repository, the name of the Project owner, and the number of code files that were analyzed and not analyzed. See Retesting code repository. |
| Project filters | Includes a set of pre-defined criteria for filtering the displayed issues. |
| Vulnerability issues | Includes the vulnerability issues that Snyk Code discovered in the imported repository. |
| Data flow | Displays the taint flow of the issue in the code. |
| Fix analysis | Provides additional details about the discovered vulnerability type, best practices for preventing this issue, and code examples of fixes. |
| CWE | The CWE (Common Weakness Enumeration) ID of the specific vulnerability type and a link to the CWE website, where this vulnerability type is described. See Example: CWE-22: Path Traversal. |
| Open repository external link | Quick access to the integrated Git repository for immediate remediation. |
| Imported by | The Git repository username who imported the analyzed repository. |
| Project owner | Identifies the lead for the Project in your Organization for administrative purposes only; it does not impact the Snyk Project itself. This is not a required field and is left blank by default. |
| Environment | The Environment attribute describes the software's context, ranging from client-side (Frontend) to server-side (Backend), private (Internal) to public (External), across platforms like Mobile, Cloud (SaaS), On-Premises, Hosted services, and Distributed systems. |
| Business criticality | When the Risk Score feature is enabled, the Business Criticality attribute automatically influences the score, scaling it to the most significant attribute level. See Risk Score. |
| Lifecycle | The Lifecycle attribute categorizes a Project as Production, Development, or Sandbox. |
| Tags | The Snyk Project tags feature enables the addition, removal, and use of custom metadata tags to organize and filter Projects, simplifying Project management and navigation. |
| Analysis summary | The number of code files analyzed in the repository and the percentage of the analyzed files out of the total repository code files. |
| Repo breakdown | **Analyzed Files**: Includes files that have been reviewed by Snyk Code, encompassing recognized extensions and programming languages. **Unanalyzed Files**: Consists of text files yet to be analyzed because of unsupported languages or extensions. **Unknown**: Files without recognized extensions, potentially including multimedia content (such as pictures and videos), binaries, proprietary format files, or any other formats that fall outside Snyk Code's scope of interest. |
## Data flow
Data flow shows the location of the discovered issue in your source code and how it flows throughout your application. It shows the taint flow of the issue in the code, with a step-by-step visualization from the source to the sink, presenting the code lines of all the steps in the flow.
The source is the input point of the potential problem. This is a point in the application where a user or an external device can enter data, which will potentially violate the security of the application. For example, in an SQL Injection issue, the source will be a form or any other data input area filled by a user.
The sink is the operation in the code where the application executes the problem. This point must receive clean input, or it can be exploited. For example, in an SQL Injection issue, the sink will be the internal operation that instructs the DB to perform certain actions according to the received input.
Every issue discovered by Snyk Code has a data flow. If an issue has only one step, for example, in the case of hardcoded secrets, the source of the issue will be displayed on the Data flow page.
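As a toy illustration of a taint flow (hypothetical code, not Snyk output): a user-supplied filename is the source, and the unsanitized path construction that feeds a file read is the sink.

```bash
# Hypothetical vulnerable pattern: the input flows unchanged from source to sink.
resolve_profile_path() {
  user_file=$1                                   # source: attacker-controlled input
  printf '/var/data/profiles/%s\n' "$user_file"  # sink: path built without sanitization
}
```

An input of `../../etc/passwd` produces a path that escapes the intended directory, which is exactly the kind of flow the Data flow view traces step by step.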
### View Data flow
1. Log in to the Snyk Web UI and select your [Group and Organization](https://docs.snyk.io/snyk-platform-administration/groups-and-organizations).
2. Navigate to the **Projects** and select the Target folder containing your repository's Projects.
3. Open **Code analysis** Project.
4. (Optional) [Search or filter for a specific vulnerability issue](https://docs.snyk.io/scan-with-snyk/snyk-code/manage-code-vulnerabilities/..#vulnerability-issues).
5. Select a vulnerability issue and navigate to **Full details** > **Data flow**.
6. As part of the Data flow analysis, you can take the following actions:
* View the taint flow of an issue in your code from source to sink. See [Data flow analysis example](#data-flow-analysis-example).
* [Open Data flow external link in the integrated Git repository](#open-data-flow-external-link).
* Ignore the open vulnerability issue using the **Ignore** button. See [Ignore issues](https://docs.snyk.io/manage-risk/prioritize-issues-for-fixing/ignore-issues).
### Data flow analysis example
In the following Path Traversal issue, the developer has not sanitized the input. This allows an attacker to perform a path traversal attack to access any file in the file system, including sensitive data such as password files.
Data flow of a Path Traversal vulnerability issue
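A typical remediation for a Path Traversal issue like this rejects traversal sequences before the input reaches the sink. A minimal sketch (illustrative only, not a Snyk-provided fix):

```bash
# Reject filenames containing path separators or ".." before using them.
# Illustrative remediation sketch only.
sanitize_filename() {
  case "$1" in
    *..*|*/*) return 1 ;;       # reject traversal attempts
    *) printf '%s\n' "$1" ;;    # safe: emit the validated name
  esac
}
```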
### Open Data flow external link
To open the displayed source code on the Git repository, select the file name above the right panel. In this example, the file name is "routes/profileImageUrlUpload.ts".
The source code appears in the integrated Git repository, showing you exactly where to fix the vulnerability. You can make the required fix to address the vulnerability in your code.
Vulnerability shown in the external source code
## Fix analysis
Fix analysis helps you fix the vulnerability issue discovered in your code. It provides details about the vulnerability type discovered, any available best practices for preventing this issue, and code examples of fixes from the global open-source community.
To explore in-depth details about the specific vulnerability identified, you can open the CWE link to understand more about the vulnerability type. See [CWE-22 ](#example-cwe-22-path-traversal)and [CWE-601 examples](#example-cwe-601-open-redirect).
Some vulnerabilities contain links to interactive lessons on understanding, fixing, and preventing vulnerability. See [Snyk Learn](https://learn.snyk.io/).
Fix analysis page for Path Traversal vulnerability
### View Fix analysis
1. Log in to the Snyk Web UI and select your [Group and Organization](https://docs.snyk.io/snyk-platform-administration/groups-and-organizations).
2. Navigate to the **Projects** and select the Target folder containing your repository's Projects.
3. Open **Code analysis** Project.
4. (Optional) [Search or filter for a specific vulnerability issue](https://docs.snyk.io/scan-with-snyk/snyk-code/manage-code-vulnerabilities/..#vulnerability-issues).
5. Select a vulnerability issue and navigate to **Full details** > **Fix analysis**.
6. As part of the Fix analysis, you can take the following actions:
* View the discovered issue and ways to prevent it.
* Examine fix examples from the global open-source community by reviewing and browsing through code samples.
* View the code diff of the fix example that appears in the integrated Git repository, showing you how this vulnerability was fixed. See [Open Fix analysis external link in the integrated Git repository](#open-fix-analysis-external-link).
* Ignore the open vulnerability issue using the **Ignore** button. See [Ignore issues](https://docs.snyk.io/manage-risk/prioritize-issues-for-fixing/ignore-issues).
### **Open Fix analysis external link**
To open the code fix for the vulnerability on the Git repository, select the Git repository above the right panel. This will show you the differences in the Git repository code that address the issue. In this example, the Git repository name is "eclipse-vertx/vert.x".
Fix analysis of a Path Traversal vulnerability issue
The fix appears in the Git repository, showing you exactly where to fix the vulnerability. You can make the required fix to address the vulnerability.
Source code external link in Fix analysis
## Severity score factors
Snyk Code reports issues by severity levels: High, Medium, and Low. Snyk Code currently does not use the Critical severity level. The severity score is based on the following factors:
* Qualities intrinsic to a vulnerability
* Evolution of vulnerability over a lifetime
### **Exceptions**
If a vulnerability is detected in code, filename, or folder with the word `test`, it is deemed a low-severity vulnerability. This applies to all languages. The severity of CWEs may change depending on the environment.
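The `test` naming rule above can be expressed as a simple check. This is a sketch of the stated behavior, not Snyk's actual implementation (case-insensitive matching and exact word boundaries are assumptions):

```bash
# Returns success when a path or filename contains the word "test",
# in which case the issue is reported as Low severity.
is_test_path() {
  case "$1" in
    *test*) return 0 ;;
    *) return 1 ;;
  esac
}
```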
## Priority score factors
Use the Priority Score to filter and prioritize discovered issues based on their importance, risk, frequency, and availability of a Fix analysis.
A Priority Score for each issue can be between 0 and 1,000, which changes automatically if one of its factors changes. For example, if the Severity Level of an issue has increased or decreased, the Priority Score of the issue changes accordingly.
You can filter issues in the Code analysis Project by Priority Score using the **PRIORITY SCORE** slider to set the range of the scores you want to display (see [View issues by Priority Score](https://docs.snyk.io/manage-risk/prioritize-issues-for-fixing/priority-score#view-issues-by-priority-score)).
| Priority Score factor | Description |
| --- | --- |
| Severity Level | The higher the severity level, the higher the security risk of the issue. Each severity level adds a different Score to the issue. The Score can be, at most, 500 points. Currently, Snyk Code does not use the Critical severity level. |
| Fix analysis | When an issue has a real-world fix example, it is marked as easier to fix, with a higher Priority Score. The Score can be, at most, 200 points. When Fix analysis is available, it is displayed in the Full Details panel of the issue on the Fix analysis tab. |
| Issue occurrence in a Project | The number of times a specific issue appears in the Code Analysis Project. The higher the frequency, the higher the risk and the Score. The Score can be, at most, 100 points. |
| Issue occurrence in a File | The number of times an issue appears in a specific file. The higher the frequency, the higher the risk and the Score. The Score can be, at most, 100 points. |
| Community Projects | The number of times an issue was fixed in external open-source Projects that Snyk examined. The Score can be, at most, 100 points. |
| Project tags | When an issue has a Project tag, it decreases the Priority Score by 100 points. This internal tag can be one of the following: **Test** - The issue was found in a test file. **Beta** - The vulnerability type of the issue is in Beta status. These internal tags are automatically assigned by Snyk Code analysis, and they are not visible on the Web UI. |
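The factor caps above imply a maximum of 500 + 200 + 100 + 100 + 100 = 1,000 points, minus any 100-point tag deduction. The arithmetic can be sketched as follows; this is illustrative only, and the real scoring model is Snyk-internal:

```bash
# Sum the capped factor contributions and subtract the internal-tag penalty.
# Inputs are assumed to be already capped per the factor table above.
priority_score() {
  severity=$1 fix=$2 proj_occ=$3 file_occ=$4 community=$5 penalty=$6
  echo $(( severity + fix + proj_occ + file_occ + community - penalty ))
}
```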
### Quantitative factors
* Severity scores from other SAST products where information is publicly available
* Severity scores from identifying similar vulnerabilities in the Snyk Vulnerability database
### Qualitative factors
* The severity of source, direct versus indirect
* Prevalence and impact of the sink
* Security team experience and research
* Customer feedback
## Example: CWE-22: Path Traversal
For CWE-22 Path Traversal, if the vulnerability occurs in a test, it is Low severity. If not, and it comes from a direct source, it is High severity. Otherwise, it is Low severity.
Decision flow chart for Priority Score CWE-22 Path Traversal
## Example: CWE-601: Open Redirect
For CWE-601 Open Redirect, if the vulnerability occurs in a test, it is Low severity. If not, and it comes from a direct source, it is Medium severity.
Decision flow chart for Priority Score CWE-601 Open Redirect
---
# Source: https://docs.snyk.io/implementation-and-setup/enterprise-setup/snyk-broker/universal-broker/broker-client-url.md
# Broker client URL
A webhook from an SCM to Snyk allows the SCM integration to communicate with Snyk. When code is updated, whether the update is to an open-source manifest file or to code files, and the developer creates a pull request, the SCM notifies Snyk through the webhook.
The webhook can be set to reach out to Snyk directly to indicate that files have changed, and Snyk will request the files through Broker. The SCM then sends those files to Snyk, and Snyk scans for code, security, and license issues based on your integrations and the products you have, and returns a pass/fail determination for the SCM check.
Sometimes the webhook cannot communicate directly with Snyk because the infrastructure prevents the repository from reaching outside of the private cloud or data center. Then the Broker facilitates the communication, sending notification of changes to the code, sending the request for files to the SCM, and conveying the files to Snyk, which returns the results of the scan. You can then continue work: review the status, ask for more information, see the issues in the PR if you are using the inline comments capability, and view the details in the Snyk portal.
The same webhook mechanism is used to facilitate communication when Snyk creates a pull request, for example, when you see an issue in the Snyk interface and use the button to fix the vulnerability.
When you set up a connection, the webhook target endpoint is defined by the value of `broker_client_url`. The webhook can point directly to Snyk or to the Broker client container, which will relay the webhook to Snyk.
By default, SCM integrations use the regional Snyk API endpoint. For the list of URLs, see [Broker client URLs](https://docs.snyk.io/snyk-data-and-governance/regional-hosting-and-data-residency#broker-client-urls). Sometimes, however, your environment may prohibit SCM webhooks from leaving the private cloud or data center. In that case, the webhook, and thus the SCM, must point to the Snyk Broker container running in your environment, and the `broker_client_url` has to reflect the hostname and port of the Broker client.
Non-SCM integrations, notably the container registry integrations, require the Broker client URL to be the Broker client address. Snyk recommends using a DNS host name, such as `http://my.broker.client`, but you can also use IP addresses (`http://192.168.0.1`). Be sure to append the port, for example, `http://my.broker.client:8000`.
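A quick sanity check on a candidate `broker_client_url` value (scheme, host, and an explicit port) can be sketched as follows; the regex is an illustrative assumption, not Snyk validation logic:

```bash
# Succeeds when the URL has an http(s) scheme, a host, and an explicit port,
# for example http://my.broker.client:8000. Illustrative check only.
valid_broker_client_url() {
  printf '%s\n' "$1" | grep -Eq '^https?://[^/:]+:[0-9]+$'
}
```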
Using `https` for webhook calls requires additional setup in the Broker client: you must provide a TLS certificate and mount it into the container. For details, see [HTTPS for Broker client with Docker](https://docs.snyk.io/implementation-and-setup/enterprise-setup/snyk-broker/https-for-broker-client-with-docker).
---
# Source: https://docs.snyk.io/implementation-and-setup/enterprise-setup/snyk-broker/broker-inbound-and-outbound-connections-and-allowed-requests.md
# Broker inbound and outbound connections and allowed requests
This page provides details about the connections between Snyk and the Broker Client and allowed requests.
## Using inbound and outbound connections with Snyk Broker
The Broker Client runs within your internal network.
### Inbound connection from Snyk to the Broker Client
There is no direct inbound connection from Snyk to the Broker Client.
The Broker Client makes an outbound connection to the Broker Server, establishing a WebSocket connection that allows communication between the two.
Thus you do not need to allow a Snyk IP address. Instead, you can allow the Broker Client IP/port.
### Outbound connection from the Broker Client
The Broker Client initiates the outbound connection to establish the WebSocket.
After the WebSocket connection initiated by the Broker Client is established, Snyk can send inbound requests to the Broker Client through the WebSocket.
Thus you do not need to allow inbound connections to the Broker Client from Snyk-specific IP addresses or other external IP addresses.
## Allowed requests
### Approved data list for Snyk Broker
The Broker Client maintains an approved data list for inbound and outbound requests. Only data on this approved list may be requested. This narrows the access permissions to the absolute minimum required for Snyk to monitor a repository.
### Inbound requests allowed
For Snyk Open Source, the following requests are allowed:
* Snyk.io is allowed to fetch and view only dependency manifest files and the `.snyk` policy file.
* No other source code is viewed, extracted, or modified.
* You may check in additional `.snyk` files to support the Snyk patch mechanism and for any ignore instructions that are included in your vulnerability policy.
Snyk Code needs access to the entire repository.
See [How Snyk handles your data](https://docs.snyk.io/snyk-data-and-governance/how-snyk-handles-your-data) for more details.
### Outbound requests allowed
When you configure your Broker Client setup, Git repository webhooks are set to enable automatic Snyk scans, triggered when your developers submit new pull requests or merge events.
Webhook notifications are delivered to Snyk through the Broker Client only for events relevant to Snyk actions: push to branch and open pull request, and only when the event data also shows a scan has occurred. For example, for Open Source, the event data must include a dependency manifest file or a `.snyk` policy file.
### Default approved data list and `accept.json` file
On occasion, you may need to [add and configure an `accept.json`](https://docs.snyk.io/implementation-and-setup/enterprise-setup/snyk-broker/classic-broker/snyk-broker-infrastructure-as-code-detection) file in your Broker deployment. Doing this will remove the ability to apply ACCEPT rules when starting the Broker.
---
# Source: https://docs.snyk.io/manage-risk/reporting/reporting-and-bi-integrations-snowflake-data-share/build-your-first-dashboard.md
# Build your first dashboard
This guide provides example queries to build out your first AppSec dashboard, based on key performance indicators (KPIs) and relevant use cases. These are organized by use case and explained in terms of business value and implementation considerations. While the provided queries offer a starting point, Snyk encourages you to customize them to suit your specific requirements.
See queries for the following use cases:
* [Open issues backlog](#open-issues-backlog) - reveal current AppSec risk that requires attention.
* [Aging](#aging) - track the exposure window of open issues.
* [Mean Time To Resolve (MTTR)](#mttr) - analyze the remediation velocity of engineering teams.
* [Service Level Agreement (SLA)](#sla) - verify that issue remediation meets your compliance requirements.
* [IDE & CLI test rates](#developers-ide-and-cli-test-usage-and-adoption) - measure the developer adoption of AppSec testing in the development stage.
* [CI/CD Pipelines test rates](#ci-cd-pipelines-test-usage-and-adoption) - measure the adoption of AppSec testing in CI/CD pipelines.
{% hint style="warning" %}
You must update the database and schema names in the example queries provided before execution.
{% endhint %}
## Open issues backlog
### Business value
AppSec teams need to understand the current exposure to risk. To do so, various aspects of the existing issues backlog are examined:
* The number of open issues.
* The number of high or critical-severity open issues.
* If there are available fixes for open issues.
For greater context, those figures are broken down into engineering teams, applications or any meaningful business structure that will make the results more concise and actionable. The following example queries allow this examination.
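The severity-by-fixability bucketing that the example queries perform can be sketched in Python; the records and field names below are illustrative stand-ins, not the Snowflake schema:

```python
from collections import Counter

# Illustrative open-issue records; the field names only echo the idea of
# the Snowflake columns and are an assumption for this sketch.
issues = [
    {"severity": "Critical", "fixability": "Fixable"},
    {"severity": "Critical", "fixability": "No Fix Supported"},
    {"severity": "High", "fixability": "Fixable"},
    {"severity": "High", "fixability": "Partially Fixable"},
]

# Count issues per (severity, fixability) bucket, as COUNT_IF does in SQL.
backlog = Counter((i["severity"], i["fixability"]) for i in issues)

print(backlog[("Critical", "Fixable")])  # 1
print(backlog[("High", "Fixable")])      # 1
```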
### Example query - SCA
This query returns open SCA issues backlog counters, distributed by fixability and grouped by Snyk Organization.
The results are based on:
* Open high or critical issues that were found by Snyk Open Source (SCA)
* Noise cancelling:
* Only issues of monitored projects
* No deleted issues
```sql
SELECT o.DISPLAY_NAME AS organization_display_name, -- Update based on the desired aggregation
       COUNT_IF(ISSUE_SEVERITY='Critical' AND COMPUTED_FIXABILITY='Fixable') AS fixable_critical_issues,
       COUNT_IF(ISSUE_SEVERITY='High' AND COMPUTED_FIXABILITY='Fixable') AS fixable_high_issues,
       COUNT_IF(ISSUE_SEVERITY='Critical' AND COMPUTED_FIXABILITY='Partially Fixable') AS partially_fixable_critical_issues,
       COUNT_IF(ISSUE_SEVERITY='High' AND COMPUTED_FIXABILITY='Partially Fixable') AS partially_fixable_high_issues,
       COUNT_IF(ISSUE_SEVERITY='Critical' AND COMPUTED_FIXABILITY='No Fix Supported') AS unfixable_critical_issues,
       COUNT_IF(ISSUE_SEVERITY='High' AND COMPUTED_FIXABILITY='No Fix Supported') AS unfixable_high_issues
FROM SNYK.SNYK.ISSUES__V_1_0 i
INNER JOIN SNYK.SNYK.PROJECTS__V_1_0 p ON i.PROJECT_PUBLIC_ID = p.PUBLIC_ID
INNER JOIN SNYK.SNYK.ORGS__V_1_0 o ON i.ORG_PUBLIC_ID = o.PUBLIC_ID
WHERE p.IS_MONITORED = TRUE -- include only monitored projects
AND i.DELETED_AT IS NULL -- remove deleted issues
AND ISSUE_STATUS = 'Open' -- include only open issues
AND i.PRODUCT_NAME = 'Snyk Open Source' -- include only Snyk Open Source
GROUP BY o.DISPLAY_NAME -- Update based on the desired aggregation
ORDER BY fixable_critical_issues DESC, fixable_high_issues DESC,
partially_fixable_critical_issues DESC, partially_fixable_high_issues DESC; -- Update based on the desired order
```
#### **Output format:**
Output of SQL query for SCA issues backlog counters
### Example query - Code
This query returns open Snyk Code issues backlog counters, distributed by severity and grouped by Snyk Organization.
The results are based on:
* Open issues that were found by Snyk Code
* Noise cancelling:
* Only issues of monitored projects
* No deleted issues
```sql
SELECT o.DISPLAY_NAME AS organization_display_name, -- Update based on the desired aggregation
COUNT_IF(ISSUE_SEVERITY='High') AS high_issues,
COUNT_IF(ISSUE_SEVERITY='Medium') AS medium_issues,
COUNT_IF(ISSUE_SEVERITY='Low') AS low_issues
FROM SNYK.SNYK.ISSUES__V_1_0 i
INNER JOIN SNYK.SNYK.PROJECTS__V_1_0 p ON i.PROJECT_PUBLIC_ID = p.PUBLIC_ID
INNER JOIN SNYK.SNYK.ORGS__V_1_0 o ON i.ORG_PUBLIC_ID = o.PUBLIC_ID
WHERE p.IS_MONITORED = TRUE -- include only monitored projects
AND i.DELETED_AT IS NULL -- remove deleted issues
AND ISSUE_STATUS = 'Open' -- include only open issues
AND i.PRODUCT_NAME = 'Snyk Code' -- include only Snyk Code
GROUP BY o.DISPLAY_NAME -- Update based on the desired aggregation
ORDER BY high_issues DESC,
medium_issues DESC,
low_issues DESC; -- Update based on the desired order
```
#### **Output format:**
Output format of SQL query for open Snyk Code issues backlog counters
## Aging
### Business value
Issue aging is the time elapsed between an issue’s introduction and the current date. Organizations care about this metric because the likelihood of exploitation increases as the exposure window extends.
To mitigate this risk, AppSec teams monitor predefined SLA criteria that specify when an issue has exceeded the expected remediation timeframe.
{% hint style="info" %}
If an issue was reintroduced, its aging is counted from the last introduction date.
{% endhint %}
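The aging rule described above can be sketched in Python; the function name and dates are illustrative:

```python
from datetime import date

def issue_age(first_introduced, last_introduced, today):
    """Age in days; a reintroduced issue is counted from its last
    introduction date, mirroring the CASE expression in the example query."""
    if last_introduced is None:
        return (today - first_introduced).days
    if first_introduced <= last_introduced:
        return (today - last_introduced).days
    return None  # inconsistent dates fall through, as in the SQL CASE

today = date(2025, 1, 31)
print(issue_age(date(2025, 1, 1), None, today))               # 30
print(issue_age(date(2025, 1, 1), date(2025, 1, 21), today))  # 10
```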
### Example query
The query below returns the average aging (in days) of critical issues per Snyk organization.\
The results are based on:
* Open critical issues
* Noise cancelling:
* Only issues of monitored projects
* No deleted issues
```sql
SELECT o.DISPLAY_NAME AS organization_display_name,
ROUND(AVG(
CASE
WHEN LAST_INTRODUCED IS NULL THEN DATEDIFF('DAY', TO_DATE(FIRST_INTRODUCED), CURRENT_DATE)
WHEN TO_DATE(FIRST_INTRODUCED) <= TO_DATE(LAST_INTRODUCED) THEN DATEDIFF('DAY', TO_DATE(LAST_INTRODUCED), CURRENT_DATE)
END),0) AS open_issues_aging
FROM SNYK.SNYK.ISSUES__V_1_0 i
INNER JOIN SNYK.SNYK.PROJECTS__V_1_0 p ON i.PROJECT_PUBLIC_ID = p.PUBLIC_ID
INNER JOIN SNYK.SNYK.ORGS__V_1_0 o ON i.ORG_PUBLIC_ID = o.PUBLIC_ID
WHERE p.IS_MONITORED = TRUE -- include only monitored projects
AND i.DELETED_AT IS NULL -- remove deleted issues
AND ISSUE_STATUS = 'Open' -- include only open issues
AND ISSUE_SEVERITY IN ('Critical') -- include only critical issues
GROUP BY o.DISPLAY_NAME -- Update based on the desired aggregation
ORDER BY open_issues_aging DESC; -- Update based on the desired order
```
#### **Output format:**
Output format of SQL query for average aging of critical issues
## MTTR
### Business value
The MTTR (Mean Time to Resolve) metric tracks the average time it takes to resolve a security issue. It is calculated based on issues that have already been resolved and is measured over a predefined period (typically monthly, quarterly, or annually) according to their last resolution date.
Analyzing the MTTR results provides insight into the remediation velocity of engineering teams. However, it is important to measure both MTTR and aging, because issues that remain open for long periods do not show up in the MTTR results until they are remediated.
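The MTTR calculation can be sketched in Python; the resolved-issue records below are illustrative:

```python
from datetime import date

# Illustrative resolved issues: (last_introduced, last_resolved, severity).
resolved = [
    (date(2025, 1, 1), date(2025, 1, 11), "Critical"),
    (date(2025, 1, 5), date(2025, 1, 25), "Critical"),
    (date(2025, 1, 1), date(2025, 1, 31), "High"),
]

def mttr(issues, severity):
    """Average days from last introduction to last resolution,
    mirroring the per-severity AVG(DATEDIFF(...)) in the example query."""
    days = [(r - i).days for i, r, s in issues if s == severity]
    return round(sum(days) / len(days), 2) if days else None

print(mttr(resolved, "Critical"))  # 15.0
print(mttr(resolved, "High"))      # 30.0
```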
### Example query
The query below returns last month’s MTTR results per issue severity per Snyk organization.\
The results are based on:
* Issues that were resolved in the last month
* Noise cancelling:
* Only issues of monitored projects
* No deleted issues
```sql
SELECT
o.DISPLAY_NAME AS organization_display_name,
ROUND(AVG(CASE WHEN ISSUE_SEVERITY = 'Critical' THEN
DATEDIFF('DAY', TO_DATE(LAST_INTRODUCED),TO_DATE(LAST_RESOLVED)) ELSE NULL END),2) AS critical_mttr,
ROUND(AVG(CASE WHEN ISSUE_SEVERITY = 'High' THEN
DATEDIFF('DAY', TO_DATE(LAST_INTRODUCED),TO_DATE(LAST_RESOLVED)) ELSE NULL END),2) AS high_mttr,
ROUND(AVG(CASE WHEN ISSUE_SEVERITY = 'Medium' THEN
DATEDIFF('DAY', TO_DATE(LAST_INTRODUCED),TO_DATE(LAST_RESOLVED)) ELSE NULL END),2) AS medium_mttr,
ROUND(AVG(CASE WHEN ISSUE_SEVERITY = 'Low' THEN
DATEDIFF('DAY', TO_DATE(LAST_INTRODUCED),TO_DATE(LAST_RESOLVED)) ELSE NULL END),2) AS low_mttr
FROM SNYK.SNYK.ISSUES__V_1_0 i
INNER JOIN SNYK.SNYK.PROJECTS__V_1_0 p ON i.PROJECT_PUBLIC_ID = p.PUBLIC_ID
INNER JOIN SNYK.SNYK.ORGS__V_1_0 o ON i.ORG_PUBLIC_ID = o.PUBLIC_ID
WHERE p.IS_MONITORED = TRUE -- include only monitored projects
AND i.DELETED_AT IS NULL -- remove deleted issues
AND ISSUE_STATUS = 'Resolved' -- include only resolved issues
-- issues that were resolved in the last month
AND TO_DATE(LAST_RESOLVED) >= DATE_TRUNC('MONTH', DATEADD('MONTH', -1, CURRENT_DATE))
AND TO_DATE(LAST_RESOLVED) <= DATEADD('DAY', -1, DATE_TRUNC('MONTH', CURRENT_DATE))
GROUP BY organization_display_name
ORDER BY organization_display_name ASC; -- Update based on the desired order
```
#### **Output format:**
Output format of SQL query for MTTR per issue severity
## SLA
### Business value
Remediating vulnerabilities is a crucial practice; however, it competes with the product development that drives the business. As a result, engineering teams may neglect open vulnerabilities in favor of product development tasks.
Establishing a service-level agreement (SLA) for vulnerability remediation helps maintain that balance and ensures that, while product development moves forward, evolving security risks are addressed according to a clear and transparent policy.
SLA targets define the acceptable exposure window for a vulnerability based on factors such as severity, business criticality of the asset, code ownership, or other risk factors.
Snyk issues data enables AppSec teams to track issue aging and identify which vulnerabilities have exceeded SLA targets.
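The SLA bucketing can be sketched in Python. The targets and at-risk windows below are assumptions that match the constants used in the example query, not a Snyk default; replace them with your own policy:

```python
# Assumed SLA targets (days) and at-risk windows per severity.
SLA_TARGETS = {
    "Critical": (15, 3),
    "High": (30, 10),
    "Medium": (90, 20),
    "Low": (180, 30),
}

def sla_status(severity, age_days):
    """Bucket an open issue by SLA status, as the CASE expression does."""
    target, at_risk_window = SLA_TARGETS[severity]
    if age_days > target:
        return "Breached"
    if age_days >= target - at_risk_window:
        return "At Risk"
    return "Within SLA"

print(sla_status("Critical", 20))  # Breached
print(sla_status("Critical", 13))  # At Risk
print(sla_status("Critical", 5))   # Within SLA
```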
### Example query
The query below returns counters of open issues per SLA status (within SLA, at risk, breached), broken down by issue severity.
The results are based on:
* Open issues
* Noise cancelling:
* Only issues of monitored projects
* No deleted issues
```sql
WITH
base AS (
SELECT
CASE
WHEN LAST_INTRODUCED IS NULL THEN DATEDIFF('DAY', TO_DATE(FIRST_INTRODUCED), CURRENT_DATE)
WHEN TO_DATE(FIRST_INTRODUCED) <= TO_DATE(LAST_INTRODUCED) THEN DATEDIFF('DAY', TO_DATE(LAST_INTRODUCED), CURRENT_DATE)
END AS ISSUE_AGE,
ISSUE_SEVERITY,
CASE
WHEN ISSUE_SEVERITY = 'Critical' AND ISSUE_AGE > c.CRITICAL THEN 'Breached'
WHEN ISSUE_SEVERITY = 'Critical' AND ISSUE_AGE >= (c.CRITICAL-c.CRITICAL_AT_RISK) THEN 'At Risk'
WHEN ISSUE_SEVERITY = 'Critical' AND ISSUE_AGE < (c.CRITICAL-c.CRITICAL_AT_RISK) THEN 'Within SLA'
WHEN ISSUE_SEVERITY = 'High' AND ISSUE_AGE > h.HIGH THEN 'Breached'
WHEN ISSUE_SEVERITY = 'High' AND ISSUE_AGE >= (h.HIGH-h.HIGH_AT_RISK) THEN 'At Risk'
WHEN ISSUE_SEVERITY = 'High' AND ISSUE_AGE < (h.HIGH-h.HIGH_AT_RISK) THEN 'Within SLA'
WHEN ISSUE_SEVERITY = 'Medium' AND ISSUE_AGE > m.MEDIUM THEN 'Breached'
WHEN ISSUE_SEVERITY = 'Medium' AND ISSUE_AGE >= (m.MEDIUM-m.MEDIUM_AT_RISK) THEN 'At Risk'
WHEN ISSUE_SEVERITY = 'Medium' AND ISSUE_AGE < (m.MEDIUM-m.MEDIUM_AT_RISK) THEN 'Within SLA'
WHEN ISSUE_SEVERITY = 'Low' AND ISSUE_AGE > l.LOW THEN 'Breached'
WHEN ISSUE_SEVERITY = 'Low' AND ISSUE_AGE >= (l.LOW-l.LOW_AT_RISK) THEN 'At Risk'
WHEN ISSUE_SEVERITY = 'Low' AND ISSUE_AGE < (l.LOW-l.LOW_AT_RISK) THEN 'Within SLA'
END AS SLA_STATUS
FROM SNYK.SNYK.ISSUES__V_1_0 i
INNER JOIN SNYK.SNYK.PROJECTS__V_1_0 p ON i.project_public_id = p.public_id
-- set the SLA TARGETS and AT RISK threshold inside the select clause of each table below
CROSS JOIN (SELECT 15 AS CRITICAL, 3 AS CRITICAL_AT_RISK) AS c
CROSS JOIN (SELECT 30 AS HIGH, 10 AS HIGH_AT_RISK) AS h
CROSS JOIN (SELECT 90 AS MEDIUM, 20 AS MEDIUM_AT_RISK) AS m
CROSS JOIN (SELECT 180 AS LOW, 30 AS LOW_AT_RISK) AS l
WHERE p.IS_MONITORED = TRUE -- include only monitored projects
AND ISSUE_STATUS = 'Open' -- include only open issues
AND i.DELETED_AT IS NULL -- remove deleted issues
)
SELECT
SLA_STATUS,
SUM(CASE WHEN ISSUE_SEVERITY = 'Critical' THEN 1 ELSE 0 END) AS critical,
SUM(CASE WHEN ISSUE_SEVERITY = 'High' THEN 1 ELSE 0 END) AS high,
SUM(CASE WHEN ISSUE_SEVERITY = 'Medium' THEN 1 ELSE 0 END) AS medium,
SUM(CASE WHEN ISSUE_SEVERITY = 'Low' THEN 1 ELSE 0 END) AS low
FROM base
GROUP BY SLA_STATUS
ORDER BY SLA_STATUS;
```
{% hint style="info" %}
The example query can be extended to support various SLA use cases, such as defining different SLA targets per Snyk Organization or Group, drilling down into at-risk or breached issues to prioritize their remediation, or analyzing SLA status for different business units.
{% endhint %}
#### **Output format:**
Output format of SQL query for open issues counter per SLA status
## Developers IDE & CLI test usage and adoption
### Business value
This section demonstrates how you can measure the adoption of Snyk IDE and CLI tests by your developers. Implementing AppSec testing during the development phase is regarded as one of the most cost-effective methods for preventing new security risks from reaching production: developers are already in the right context to address issues before the code progresses further in the SDLC. Detecting issues in later stages requires developers to switch context and revisit the problem, which is less efficient and more time-consuming.
### Example query
The query below returns the unique developers and total number of scans per environment and Snyk Product.
The results are based on:
* Tests executed
* Excluding tests that were performed during CI/CD stage
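The grouping the query performs can be sketched in Python; the event records below are illustrative:

```python
from collections import defaultdict

# Illustrative usage events: (environment, product, user_email).
events = [
    ("VS Code", "Snyk Code", "dev1@example.com"),
    ("VS Code", "Snyk Code", "dev1@example.com"),
    ("VS Code", "Snyk Code", "dev2@example.com"),
    ("Eclipse", "Snyk Open Source", "dev1@example.com"),
]

unique_developers = defaultdict(set)
total_scans = defaultdict(int)
for env, product, email in events:
    unique_developers[(env, product)].add(email)  # COUNT(DISTINCT USER_EMAIL)
    total_scans[(env, product)] += 1              # COUNT(1)

print(len(unique_developers[("VS Code", "Snyk Code")]))  # 2
print(total_scans[("VS Code", "Snyk Code")])             # 3
```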
```sql
SELECT
ENVIRONMENT_DISPLAY_NAME AS IDE,
PRODUCT_DISPLAY_NAME AS PRODUCT,
COUNT(DISTINCT USER_EMAIL) AS UNIQUE_DEVELOPERS,
COUNT(1) AS TOTAL_SCANS
FROM SNYK.SNYK.USAGE_EVENTS__V_1_0
WHERE (RUNTIME_APPLICATION_DATA_SCHEMA_VERSION = 'v2'
       AND ARRAY_CONTAINS('test'::VARIANT, INTERACTION_CATEGORIES)
       AND INTERACTION_STAGE IN ('dev'))
   OR RUNTIME_APPLICATION_DATA_SCHEMA_VERSION = 'v1'
GROUP BY IDE, PRODUCT
```
#### **Output format:**
Output format of SQL query for number of scans per Snyk environment
## CI/CD pipelines test usage and adoption
### Business value
Preventing vulnerabilities from reaching production involves placing security gates throughout the software development lifecycle (SDLC). One of the most common gates is within the CI/CD pipeline, ensuring that any vulnerabilities missed in earlier stages are caught and blocked during the build process.
Leveraging Snyk Data Share enables you to assess the current adoption of tests and security gates within your CI/CD pipelines.
### Example query
The query below returns the number of tested repositories, the total number of tests, and the test success rate (%) per Snyk Product.
The results are based on tests executed in the CI/CD stage in the last 3 months.
```sql
SELECT
PRODUCT_DISPLAY_NAME AS PRODUCT,
COUNT(DISTINCT INTERACTION_TARGET_ID) AS "TESTED REPOS",
COUNT(1) AS "TOTAL SCANS",
ROUND(((SUM(CASE WHEN INTERACTION_EXIT_CODE=0 THEN 1 ELSE 0 END))/
(NULLIF(SUM(CASE WHEN INTERACTION_EXIT_CODE IN (0,1) THEN 1 ELSE 0 END),0))
*100),0) AS "SUCCESS RATE"
FROM SNYK.SNYK.USAGE_EVENTS__V_1_0
WHERE INTERACTION_STAGE = 'cicd'
AND ARRAY_CONTAINS('test'::VARIANT, INTERACTION_CATEGORIES)
AND INTERACTION_TIMESTAMP >= DATE_TRUNC('MONTH', DATEADD('MONTH', -3, CURRENT_DATE))
GROUP BY PRODUCT
```
#### **Output format:**
Output format of SQL query for number of tested repositories, total tests, and the test % success rate per Snyk Product
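The success-rate arithmetic used in the query can be sketched in Python, assuming the Snyk CLI convention reflected in the query: exit code `0` is a passed test, `1` means issues were found, and other codes (errors) are excluded from the denominator:

```python
def success_rate(exit_codes):
    """Percent of completed tests (exit code 0 or 1) that passed (0),
    mirroring the NULLIF-guarded ratio in the query."""
    completed = [c for c in exit_codes if c in (0, 1)]
    if not completed:
        return None  # NULLIF(..., 0) yields NULL when nothing completed
    return round(100 * completed.count(0) / len(completed))

print(success_rate([0, 0, 1, 2]))  # 67
print(success_rate([2, 2]))        # None
```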
---
# Source: https://docs.snyk.io/scan-with-snyk/snyk-iac/current-iac-custom-rules/writing-rules-using-the-sdk/bundling-rules.md
# Bundling rules
When you are ready, you can **build a custom rules bundle** by running the following command:
```
snyk-iac-rules build
```
If the current folder contains more than your generated rules, consider using the `--ignore` option to exclude folders and files irrelevant to a production-ready bundle. This both speeds up the build and keeps the generated bundle small.
You can override the default entry point. If you named the rule that evaluates something other than **`deny`**, for example, `allow`, `violation`, and so on, override it by running:
```
snyk-iac-rules build --entrypoint "/"
```
Finally, you can check the contents of the bundle without extracting it by running:
```
tar -tf bundle.tar.gz
```
The output lists all the files included in the bundle:
```
/data.json
/lib/main.rego
/rules/MY_RULE/main.rego
/policy.wasm
/.manifest
```
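If you prefer to inspect the bundle programmatically rather than with `tar -tf`, Python's standard `tarfile` module can list the members without extracting them. This sketch builds a tiny stand-in archive first, so the file names here are illustrative:

```python
import io
import os
import tarfile
import tempfile

with tempfile.TemporaryDirectory() as tmp:
    bundle = os.path.join(tmp, "bundle.tar.gz")

    # Build a tiny gzipped tar to stand in for a real rules bundle.
    with tarfile.open(bundle, "w:gz") as tar:
        for name in ["/policy.wasm", "/.manifest"]:
            data = b"stub"
            info = tarfile.TarInfo(name=name)
            info.size = len(data)
            tar.addfile(info, io.BytesIO(data))

    # Equivalent of `tar -tf bundle.tar.gz`: list members without extracting.
    with tarfile.open(bundle, "r:gz") as tar:
        names = tar.getnames()

print(names)
```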
You can now [run snyk iac test with your newly built custom bundle.](https://docs.snyk.io/scan-with-snyk/snyk-iac/current-iac-custom-rules/use-iac-custom-rules-with-cli)
---
# Source: https://docs.snyk.io/supported-languages/supported-languages-list/c-c++/c++-for-code-analysis.md
# C/C++ for code analysis
Refer to the [C/C++ details](https://docs.snyk.io/supported-languages/supported-languages-list/c-c++) for supported frameworks, libraries, and features.
**Import your app through SCM**: Available\
**Test or monitor your app through CLI and IDE**: Available\
**Features**: Reports\
**Operating systems:** Linux, Windows (limited)\
**Embedded systems:** Linux\
**IDE:** No additional options are required. The Snyk plugin has views within the IDE for displaying results.\
**Feature:** Interfile analysis
{% hint style="warning" %}
If you use macros, your results may include false positives and false negatives.
{% endhint %}
---
# Source: https://docs.snyk.io/scan-with-snyk/snyk-code/snyk-code-security-rules/c-and-asp.net-rules.md
# C# and ASP.NET rules
Each rule includes the following information.
* **Rule Name**: The Snyk name of the rule.
* **CWE(s):** The [CWE numbers](https://cwe.mitre.org/) that are covered by this rule.
* **Security Categories**: The [OWASP Top 10](https://owasp.org/Top10/) (2021 edition) category to which the rule belongs, if any, and whether it is included in [SANS 25](https://www.sans.org/top25-software-errors/).
* **Autofixable**: Security rules that are autofixable by Snyk Agent Fix. This information is included only for the supported programming languages.
| Rule Name | CWE(s) | Security Categories | Autofixable |
| ----------------------------------------------------------------- | ---------------- | ---------------------- | ----------- |
| Anti-forgery token validation disabled | CWE-352 | Sans Top 25, OWASP:A01 | Yes |
| Debug Features Enabled | CWE-215 | None | Yes |
| Usage of BinaryFormatter | CWE-502 | Sans Top 25, OWASP:A08 | No |
| Cleartext Storage of Sensitive Information in a Cookie | CWE-315 | OWASP:A05 | No |
| Code Injection | CWE-94 | Sans Top 25, OWASP:A03 | No |
| Command Injection | CWE-78 | Sans Top 25, OWASP:A03 | No |
| Deserialization of Untrusted Data | CWE-502 | Sans Top 25, OWASP:A08 | Yes |
| Hardcoded Secret | CWE-547 | OWASP:A05 | No |
| Improper Neutralization of CRLF Sequences in HTTP Headers | CWE-113 | OWASP:A03 | No |
| Use of a Broken or Risky Cryptographic Algorithm | CWE-327 | OWASP:A02 | No |
| Use of Password Hash With Insufficient Computational Effort | CWE-916 | OWASP:A02 | Yes |
| Use of Insufficiently Random Values | CWE-330 | OWASP:A02 | No |
| Insecure Data Transmission | CWE-319 | OWASP:A02 | Yes |
| LDAP Injection | CWE-90 | OWASP:A03 | No |
| Log Forging | CWE-117 | OWASP:A09 | Yes |
| Use of Hardcoded Credentials | CWE-798 | Sans Top 25, OWASP:A07 | Yes |
| Open Redirect | CWE-601 | OWASP:A01 | No |
| Path Traversal | CWE-23 | OWASP:A01 | No |
| Exposure of Private Personal Information to an Unauthorized Actor | CWE-359 | OWASP:A01 | Yes |
| Regular expression injection | CWE-400, CWE-730 | None | No |
| Request Validation Disabled | CWE-554 | None | Yes |
| Information Exposure | CWE-200 | OWASP:A01 | No |
| SQL Injection | CWE-89 | Sans Top 25, OWASP:A03 | No |
| Server-Side Request Forgery (SSRF) | CWE-918 | Sans Top 25, OWASP:A10 | No |
| Inadequate Encryption Strength | CWE-326 | OWASP:A02 | No |
| Sensitive Cookie Without 'HttpOnly' Flag | CWE-1004 | OWASP:A05 | Yes |
| Sensitive Cookie in HTTPS Session Without 'Secure' Attribute | CWE-614 | OWASP:A05 | Yes |
| Cross-site Scripting (XSS) | CWE-79 | Sans Top 25, OWASP:A03 | No |
| XML External Entity (XXE) Injection | CWE-611 | OWASP:A05 | No |
| XAML Injection | CWE-611 | OWASP:A05 | No |
| XML Injection | CWE-91 | OWASP:A03 | No |
| XPath Injection | CWE-643 | OWASP:A03 | No |
| Arbitrary File Write via Archive Extraction (Zip Slip) | CWE-22 | Sans Top 25, OWASP:A01 | No |
---
# Source: https://docs.snyk.io/supported-languages/supported-languages-list/c-c++/c-c++-for-open-source.md
# C/C++ for open source
Refer to the [C/C++ details](https://docs.snyk.io/supported-languages/supported-languages-list/c-c++) for supported package managers and features.
## Open source dependency management
Snyk features that support the management of open-source dependencies include the following:
| Package managers / Features | CLI support | SCM support | License scanning | Fix PRs |
| --------------------------- | ----------- | ----------- | ---------------- | ------- |
| C/C++                       | ✔︎          |             | ✔︎               |         |
For information about managing dependencies and licenses from your developer workflows through policy, see the following:
* [Defining a secure open source policy](https://snyk.io/series/open-source-security/open-source-policy/)
* [Use Snyk security policies to prioritize fixes more efficiently](https://snyk.io/blog/snyk-security-policies/)
## Open source license compliance
To check compliance for open source licenses, see [Snyk License Compliance Management.](https://docs.snyk.io/scan-with-snyk/snyk-open-source/scan-open-source-libraries-and-licenses/snyk-license-compliance-management)
## IDE for C++ for open-source dependencies
Under **Additional Parameters** in the IDE settings, enter the `--unmanaged` option to scan for C/C++ open source dependencies.
---
# Source: https://docs.snyk.io/scan-with-snyk/snyk-code/snyk-code-security-rules/c-c++-rules.md
# C/C++ rules
Each rule includes the following information.
* **Rule Name**: The Snyk name of the rule.
* **CWE(s):** The [CWE numbers](https://cwe.mitre.org/) that are covered by this rule.
* **Security Categories**: The [OWASP Top 10](https://owasp.org/Top10/) (2021 edition) category to which the rule belongs, if any, and whether it is included in [SANS 25](https://www.sans.org/top25-software-errors/).
* **Autofixable**: Security rules that are autofixable by Snyk Agent Fix. This information is included only for the supported programming languages.
| Rule Name | CWE(s) | Security Categories | Autofixable |
| ------------------------------------------------------------------------ | ---------------- | ---------------------- | ----------- |
| Memory Allocation Of String Length | CWE-170 | None | Yes |
| Insecure Anonymous LDAP Binding | CWE-287 | Sans Top 25, OWASP:A07 | No |
| Buffer Overflow | CWE-122 | None | Yes |
| Division By Zero | CWE-369 | None | No |
| Missing Release of File Descriptor or Handle after Effective Lifetime | CWE-775 | None | Yes |
| Command Injection | CWE-78 | Sans Top 25, OWASP:A03 | No |
| Dereference of a NULL Pointer | CWE-476 | Sans Top 25 | No |
| Double Free | CWE-415 | None | Yes |
| Use of Externally-Controlled Format String | CWE-134 | None | Yes |
| Use of Hardcoded Cryptographic Key | CWE-321 | OWASP:A02 | No |
| Improper Null Termination | CWE-170 | None | No |
| Use of Password Hash With Insufficient Computational Effort | CWE-916 | OWASP:A02 | Yes |
| Integer Overflow | CWE-190 | Sans Top 25 | No |
| LDAP Injection | CWE-90 | OWASP:A03 | No |
| Missing Release of Memory after Effective Lifetime | CWE-401 | None | Yes |
| An optimizing compiler may remove memset non-zero leaving data in memory | CWE-1330 | None | No |
| Potential Negative Number Used as Index | CWE-125, CWE-787 | Sans Top 25 | No |
| Path Traversal | CWE-23 | OWASP:A01 | No |
| Exposure of Private Personal Information to an Unauthorized Actor | CWE-359 | OWASP:A01 | No |
| Size Used as Index | CWE-125, CWE-787 | Sans Top 25 | Yes |
| SQL Injection | CWE-89 | Sans Top 25, OWASP:A03 | No |
| Server-Side Request Forgery (SSRF) | CWE-918 | Sans Top 25, OWASP:A10 | No |
| Inadequate Encryption Strength | CWE-326 | OWASP:A02 | Yes |
| Potential buffer overflow from usage of unsafe function | CWE-122 | None | Yes |
| Use of Expired File Descriptor | CWE-910 | None | No |
| Use After Free | CWE-416 | Sans Top 25 | No |
| User Controlled Pointer | CWE-1285 | None | No |
| Authentication Bypass by Spoofing | CWE-290 | OWASP:A07 | No |
| Cross-site Scripting (XSS) | CWE-79 | Sans Top 25, OWASP:A03 | No |
| XML External Entity (XXE) Injection | CWE-611 | OWASP:A05 | No |
| XPath Injection | CWE-643 | OWASP:A03 | No |
---
# Source: https://docs.snyk.io/supported-languages/supported-languages-list/c-c++.md
# C/C++
## Applicability
C/C++ is supported for Snyk Open Source and Snyk Code.
Specific considerations apply for the [Snyk CLI for open-source C++ scans](https://docs.snyk.io/supported-languages/supported-languages-list/c-c++/snyk-cli-for-open-source-c++-scans). [Guidance for Snyk for C/C++](https://docs.snyk.io/supported-languages/supported-languages-list/c-c++/guidance-for-snyk-for-c-c++) is provided.
The following functions are available for C/C++:
* SCM import - available only for Snyk Code.
* Test or monitor your app through CLI and IDE
* Test your app's SBOM using `pkg:generic` or `pkg:conan`. For more information, see [Test an SBOM document for vulnerabilities](https://docs.snyk.io/snyk-api/using-specific-snyk-apis/sbom-apis/rest-api-endpoint-test-an-sbom-document-for-vulnerabilities).
* Test your app's packages using `pkg:generic` or `pkg:conan`. For more information, see [List issues for a package](https://docs.snyk.io/snyk-api/using-specific-snyk-apis/issues-list-issues-for-a-package).
## Package managers and supported file extensions
For Conan, Snyk supports [conan.io](https://conan.io/center) as a package registry.
No additional options are required for the Snyk IDE. You can display results within the IDE using the Snyk plugin views.
For C/C++ for Snyk Code, Snyk supports the following file formats: `.c`, `.cc`, `.cpp`, `.cxx`, `.h`, `.hpp`, `.hxx`.
## Frameworks and libraries
For C/C++, Snyk supports the following frameworks and libraries:
* argparse parser
* Asio Library
* Boost Library
* Botan Library
* C Standard Library
* C++ Standard Library
* Curl library
* fstream framework
* grpc-cpp library
* HTTPlib framework
* JsonCpp library
* liboai framework
* libpq library
* libpqxx framework
* libsodium library
* LibTomCrypt framework
* libxml2 framework
* MySQL framework
* OpenSSL framework
* POSIX Library
* pugixml library
* SQLite library
* WinHTTP framework
* Xerces libraries
## Features
For C/C++, Snyk supports the following features:
| Snyk Open Source | Snyk Code |
| -------------------------------------------------- | ------------------ |
| | Interfile analysis |
{% hint style="info" %}
The **Snyk Fix PR** feature is not available for C/C++. This means that you will not be notified if the PR checks fail when the following conditions are met:
* The **PR checks** feature is enabled and configured to **Only fail when the issues found have a fix available**.
* **"Fixed in" available** is set to **Yes**.
{% endhint %}
---
# Source: https://docs.snyk.io/snyk-api/api-end-of-life-eol-process-and-migration-guides/candidates-for-upcoming-api-end-of-life-cadences.md
# Candidates for upcoming API end-of-life cadences
This page contains the list of V1, non-GA REST, and older versions of GA REST endpoints that are potential candidates for future end-of-life cadences. To see which endpoints have already reached end of life, see [API EOL endpoints and key dates](https://docs.snyk.io/snyk-api/api-end-of-life-eol-process-and-migration-guides/api-eol-endpoints-and-key-dates).
If an endpoint is listed in the table below, it does not guarantee that the endpoint will be included in the next end-of-life cadence. In addition, an endpoint can go straight into the next end-of-life cadence without having to be listed in the table below.
The endpoints in this list, and in any end-of-life cadence, must adhere to our end-of-life criteria, which require that the endpoints have:
* A GA REST equivalent or equivalents (except in the rare case where a V1 API does not have or need a GA REST equivalent)
* Functionality parity between V1 and GA REST (unless explicitly stated otherwise in the migration guide)
* A migration guide by our field specialists for ease of migration
Endpoints that have already reached end of life do not appear in the list that follows, because the list contains only future candidates; they can be found on the [API EOL endpoints and key dates](https://docs.snyk.io/snyk-api/api-end-of-life-eol-process-and-migration-guides/api-eol-endpoints-and-key-dates) page.
| Endpoint | Endpoint Type | Migration Guide | Date Added |
| -------- | ------------- | --------------- | ---------- |
| | | | |
---
# Source: https://docs.snyk.io/snyk-api/changelog.md
# Changelog
### 2025-11-05 - Updated 2026-01-09
#### GET - `/orgs/{org_id}/projects/{project_id}/sbom` - Updated
* added the new optional `query` request parameter `go-module-level`
### 2025-11-05 - Updated 2025-12-04
#### DELETE - `/tenants/{tenant_id}/brokers/connections/{connection_id}/orgs/{org_id}/integrations/{integration_id}` - Updated
* added the non-success response with the status `403`
#### POST - `/tenants/{tenant_id}/brokers/connections/{connection_id}/orgs/{org_id}/integration` - Updated
* added the non-success response with the status `403`
### 2025-11-05
#### GET - `/orgs/{org_id}/packages/{purl}/issues` - Updated
* for the `query` request parameter `limit`, the type was changed from `number` to `integer`
* for the `query` request parameter `offset`, the type was changed from `number` to `integer`
* for the `query` request parameter `limit`, the max was set to `1000.00` 
* for the `query` request parameter `limit`, the min was set to `1.00` 
* for the `query` request parameter `offset`, the min was set to `0.00` 
* api operation id `fetchIssuesPerPurl` removed and replaced with `getIssuesPerPurl`
* added the optional property `meta/match` to the response with the `200` status
#### POST - `/orgs/{org_id}/packages/issues` - Updated
* added the optional property `meta/packages` to the response with the `200` status
### 2025-09-28 - Updated 2025-11-13
#### POST - `/orgs/{org_id}/export` - Updated
* added the new optional request property `data/attributes/filters/empty_project_tags`
#### POST - `/groups/{group_id}/export` - Updated
* added the new optional request property `data/attributes/filters/empty_project_tags`
### 2025-09-28 - Updated 2025-11-13
#### GET - `/orgs/{org_id}/issues` - Updated
* added the optional property `data/items/attributes/coordinates/items/created_at` to the response with the `200` status
* added the optional property `data/items/attributes/coordinates/items/last_introduced_at` to the response with the `200` status
* added the optional property `data/items/attributes/coordinates/items/last_resolved_at` to the response with the `200` status
* added the optional property `data/items/attributes/coordinates/items/last_resolved_details` to the response with the `200` status
* added the optional property `data/items/attributes/coordinates/items/state` to the response with the `200` status
* added the optional property `data/items/attributes/coordinates/items/updated_at` to the response with the `200` status
#### GET - `/orgs/{org_id}/issues/{issue_id}` - Updated
* added the optional property `data/attributes/coordinates/items/created_at` to the response with the `200` status
* added the optional property `data/attributes/coordinates/items/last_introduced_at` to the response with the `200` status
* added the optional property `data/attributes/coordinates/items/last_resolved_at` to the response with the `200` status
* added the optional property `data/attributes/coordinates/items/last_resolved_details` to the response with the `200` status
* added the optional property `data/attributes/coordinates/items/state` to the response with the `200` status
* added the optional property `data/attributes/coordinates/items/updated_at` to the response with the `200` status
#### GET - `/groups/{group_id}/issues` - Updated
* added the optional property `data/items/attributes/coordinates/items/created_at` to the response with the `200` status
* added the optional property `data/items/attributes/coordinates/items/last_introduced_at` to the response with the `200` status
* added the optional property `data/items/attributes/coordinates/items/last_resolved_at` to the response with the `200` status
* added the optional property `data/items/attributes/coordinates/items/last_resolved_details` to the response with the `200` status
* added the optional property `data/items/attributes/coordinates/items/state` to the response with the `200` status
* added the optional property `data/items/attributes/coordinates/items/updated_at` to the response with the `200` status
#### GET - `/groups/{group_id}/issues/{issue_id}` - Updated
* added the optional property `data/attributes/coordinates/items/created_at` to the response with the `200` status
* added the optional property `data/attributes/coordinates/items/last_introduced_at` to the response with the `200` status
* added the optional property `data/attributes/coordinates/items/last_resolved_at` to the response with the `200` status
* added the optional property `data/attributes/coordinates/items/last_resolved_details` to the response with the `200` status
* added the optional property `data/attributes/coordinates/items/state` to the response with the `200` status
* added the optional property `data/attributes/coordinates/items/updated_at` to the response with the `200` status
### 2025-09-28 - Updated 2025-10-22
#### POST - `/orgs/{org_id}/export` - Updated
* added the new optional request property `data/attributes/url_expiration_seconds`
#### POST - `/groups/{group_id}/export` - Updated
* added the new optional request property `data/attributes/url_expiration_seconds`
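The new optional `url_expiration_seconds` attribute would sit under `data/attributes` in the export request body. A minimal sketch, assuming the JSON:API request shape used by these endpoints; the `type` string and the value shown are illustrative, not documented values:

```python
import json

# Hypothetical export request body carrying the new optional
# data/attributes/url_expiration_seconds property (value is illustrative).
export_request = {
    "data": {
        "type": "resource",
        "attributes": {
            "url_expiration_seconds": 3600,  # lifetime of the export download URL
        },
    }
}

payload = json.dumps(export_request)
```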
## Changelog
#### 2025-09-28 - Updated 2025-10-14
**GET - `/orgs/{org_id}/issues` - Updated**
* the `data/items/attributes/coordinates/items/representations/items/oneOf[subschema #3]/cloud_resource/resource/name` response property's `maxLength` was unset from `256` for the response status `200`
**GET - `/orgs/{org_id}/issues/{issue_id}` - Updated**
* the `data/attributes/coordinates/items/representations/items/oneOf[subschema #3]/cloud_resource/resource/name` response property's `maxLength` was unset from `256` for the response status `200`
**GET - `/groups/{group_id}/issues` - Updated**
* the `data/items/attributes/coordinates/items/representations/items/oneOf[subschema #3]/cloud_resource/resource/name` response property's `maxLength` was unset from `256` for the response status `200`
**GET - `/groups/{group_id}/issues/{issue_id}` - Updated**
* the `data/attributes/coordinates/items/representations/items/oneOf[subschema #3]/cloud_resource/resource/name` response property's `maxLength` was unset from `256` for the response status `200`
#### 2025-09-28
**POST - `/orgs/{org_id}/policies` - Updated**
* added the new optional request property `data/attributes/source`
#### 2024-10-15 - Updated 2025-09-29
**POST - `/orgs/{org_id}/export` - Updated**
* added the new optional request property `data/attributes/filters/product_name`
* added the new optional request property `data/attributes/filters/project_tags`
* added the new optional request property `data/attributes/filters/project_type`
**POST - `/groups/{group_id}/export` - Updated**
* added the new optional request property `data/attributes/filters/product_name`
* added the new optional request property `data/attributes/filters/project_tags`
* added the new optional request property `data/attributes/filters/project_type`
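The three new filter properties could be combined in a single export request. A sketch only: the property names come from the entries above, but the value shapes (arrays, tag objects) and the `type` string are assumptions:

```python
import json

# Hypothetical export request using the new optional filter properties.
# Filter value shapes are assumed for illustration, not documented examples.
export_request = {
    "data": {
        "type": "resource",
        "attributes": {
            "filters": {
                "product_name": ["snyk_open_source"],
                "project_tags": [{"key": "team", "value": "platform"}],
                "project_type": ["npm"],
            }
        },
    }
}

payload = json.dumps(export_request)
```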
**2024-10-15 - Updated 2025-09-08**
**POST - `/orgs/{org_id}/policies` - Updated**
* removed the `cancelled` enum value from the `data/attributes/review` response property for the response status `201`
**GET - `/orgs/{org_id}/policies` - Updated**
* removed the `cancelled` enum value from the `data/items/attributes/review` response property for the response status `200`
**PATCH - `/orgs/{org_id}/policies/{policy_id}` - Updated**
* removed the enum value `cancelled` of the request property `data/attributes/review` 
* removed the `cancelled` enum value from the `data/attributes/review` response property for the response status `200`
**GET - `/orgs/{org_id}/policies/{policy_id}` - Updated**
* removed the `cancelled` enum value from the `data/attributes/review` response property for the response status `200`
**2024-10-15 - Updated 2025-08-15**
**GET - `/orgs/{org_id}/policies` - Updated**
* added the new enum value `ignore-type` to the `query` request parameter `order_by`
* added the new enum value `requested-by` to the `query` request parameter `order_by`
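A request using one of the new `order_by` enum values might be built like this. The parameter name and enum value come from the entry above; the base URL is Snyk's REST endpoint, and the required `version` query parameter is omitted for brevity:

```python
from urllib.parse import urlencode

# Build the policies listing query with one of the new order_by enum values.
params = {"order_by": "ignore-type"}
query = urlencode(params)
url = f"https://api.snyk.io/rest/orgs/{{org_id}}/policies?{query}"
```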
**2024-10-15 - Updated 2025-08-14**
**PATCH - `/orgs/{org_id}/policies/{policy_id}` - Updated**
* added the new `cancelled` enum value to the request property `data/attributes/review`
**2024-10-15 - Updated 2025-08-06**
**POST - `/orgs/{org_id}/export` - Updated**
* removed the request property `data/attributes/destination` 
**POST - `/groups/{group_id}/export` - Updated**
* removed the request property `data/attributes/destination` 
**2024-10-15 - Updated 2025-07-07**
**POST - `/orgs/{org_id}/service_accounts` - Updated**
* added the new `access_token` enum value to the `data/attributes/auth_type` response property for the response status `201` 
* added the new optional request property `data/attributes/access_token_expires_at`
* added the new `access_token` enum value to the request property `data/attributes/auth_type`
* added the optional property `data/attributes/access_token` to the response with the `201` status
* added the optional property `data/attributes/access_token_expires_at` to the response with the `201` status
* added the optional property `data/attributes/created_at` to the response with the `201` status
**GET - `/orgs/{org_id}/service_accounts` - Updated**
* added the new `access_token` enum value to the `data/items/attributes/auth_type` response property for the response status `200` 
* added the optional property `data/items/attributes/access_token` to the response with the `200` status
* added the optional property `data/items/attributes/access_token_expires_at` to the response with the `200` status
* added the optional property `data/items/attributes/created_at` to the response with the `200` status
**PATCH - `/orgs/{org_id}/service_accounts/{serviceaccount_id}` - Updated**
* added the new `access_token` enum value to the `data/attributes/auth_type` response property for the response status `200` 
* added the optional property `data/attributes/access_token` to the response with the `200` status
* added the optional property `data/attributes/access_token_expires_at` to the response with the `200` status
* added the optional property `data/attributes/created_at` to the response with the `200` status
**GET - `/orgs/{org_id}/service_accounts/{serviceaccount_id}` - Updated**
* added the new `access_token` enum value to the `data/attributes/auth_type` response property for the response status `200` 
* added the optional property `data/attributes/access_token` to the response with the `200` status
* added the optional property `data/attributes/access_token_expires_at` to the response with the `200` status
* added the optional property `data/attributes/created_at` to the response with the `200` status
**POST - `/orgs/{org_id}/service_accounts/{serviceaccount_id}/secrets` - Updated**
* added the new `access_token` enum value to the `data/attributes/auth_type` response property for the response status `200` 
* added the optional property `data/attributes/access_token` to the response with the `200` status
* added the optional property `data/attributes/access_token_expires_at` to the response with the `200` status
* added the optional property `data/attributes/created_at` to the response with the `200` status
**GET - `/orgs/{org_id}/packages/{purl}/issues` - Updated**
* added `#/components/schemas/ResourcePathRepresentation, #/components/schemas/PackageRepresentation` to the `data/items/attributes/coordinates/items/representations/items/` response property `anyOf` list for the response status `200`
**POST - `/orgs/{org_id}/packages/issues` - Updated**
* added `#/components/schemas/ResourcePathRepresentation, #/components/schemas/PackageRepresentation` to the `data/items/attributes/coordinates/items/representations/items/` response property `anyOf` list for the response status `200`
**POST - `/groups/{group_id}/service_accounts` - Updated**
* added the new `access_token` enum value to the `data/attributes/auth_type` response property for the response status `201` 
* added the new optional request property `data/attributes/access_token_expires_at`
* added the new `access_token` enum value to the request property `data/attributes/auth_type`
* added the optional property `data/attributes/access_token` to the response with the `201` status
* added the optional property `data/attributes/access_token_expires_at` to the response with the `201` status
* added the optional property `data/attributes/created_at` to the response with the `201` status
**GET - `/groups/{group_id}/service_accounts` - Updated**
* added the new `access_token` enum value to the `data/items/attributes/auth_type` response property for the response status `200` 
* added the optional property `data/items/attributes/access_token` to the response with the `200` status
* added the optional property `data/items/attributes/access_token_expires_at` to the response with the `200` status
* added the optional property `data/items/attributes/created_at` to the response with the `200` status
* added the optional property `meta` to the response with the `200` status
**PATCH - `/groups/{group_id}/service_accounts/{serviceaccount_id}` - Updated**
* added the new `access_token` enum value to the `data/attributes/auth_type` response property for the response status `200` 
* added the optional property `data/attributes/access_token` to the response with the `200` status
* added the optional property `data/attributes/access_token_expires_at` to the response with the `200` status
* added the optional property `data/attributes/created_at` to the response with the `200` status
**GET - `/groups/{group_id}/service_accounts/{serviceaccount_id}` - Updated**
* added the new `access_token` enum value to the `data/attributes/auth_type` response property for the response status `200` 
* added the optional property `data/attributes/access_token` to the response with the `200` status
* added the optional property `data/attributes/access_token_expires_at` to the response with the `200` status
* added the optional property `data/attributes/created_at` to the response with the `200` status
**POST - `/groups/{group_id}/service_accounts/{serviceaccount_id}/secrets` - Updated**
* added the new `access_token` enum value to the `data/attributes/auth_type` response property for the response status `200` 
* added the optional property `data/attributes/access_token` to the response with the `200` status
* added the optional property `data/attributes/access_token_expires_at` to the response with the `200` status
* added the optional property `data/attributes/created_at` to the response with the `200` status
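A service account creation request using the new `access_token` auth type could look like the following. The `auth_type` and `access_token_expires_at` property names come from the entries above; the `name` attribute, `type` string, and timestamp format are assumptions for illustration:

```python
import json

# Hypothetical service account creation body using the new access_token
# auth_type and the optional access_token_expires_at request property.
service_account_request = {
    "data": {
        "type": "service_account",
        "attributes": {
            "auth_type": "access_token",
            "access_token_expires_at": "2026-01-01T00:00:00Z",  # assumed format
            "name": "ci-pipeline",  # illustrative name
        },
    }
}

payload = json.dumps(service_account_request)
```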
**2024-10-15 - Updated 2025-06-12**
**GET - `/tenants` - Added**
* Get a list of all Tenants of which the calling user is a member
**PATCH - `/tenants/{tenant_id}` - Added**
* Update the details of a tenant
**Required permissions**
* `Edit Tenant Details (tenant.edit)`
**GET - `/tenants/{tenant_id}` - Added**
* Get the full details of a Tenant.
**Required permissions**
* `View Tenant Details (tenant.read)`
**POST - `/tenants/{tenant_id}/brokers/installs/{install_id}/deployments/{deployment_id}/connections/{connection_id}/bulk_migration` - Added**
* Performs bulk migration for integrations from legacy to universal broker
**Required permissions**
* `View Tenant Details (tenant.read)`
**GET - `/tenants/{tenant_id}/brokers/installs/{install_id}/deployments/{deployment_id}/connections/{connection_id}/bulk_migration` - Added**
* Lists organization IDs associated with a connection type to be bulk migrated to universal broker
**Required permissions**
* `View Tenant Details (tenant.read)`
**GET - `/orgs/{org_id}` - Updated**
* added the new optional `query` request parameter `expand`
* added the optional property `data/relationships` to the response with the `200` status
**GET - `/orgs/{org_id}/projects` - Updated**
* added the optional property `data/items/attributes/settings/auto_dependency_upgrade/is_inherited` to the response with the `200` status
**PATCH - `/orgs/{org_id}/projects/{project_id}` - Updated**
* added the optional property `data/attributes/settings/auto_dependency_upgrade/is_inherited` to the response with the `200` status
**GET - `/orgs/{org_id}/projects/{project_id}` - Updated**
* added the optional property `data/attributes/settings/auto_dependency_upgrade/is_inherited` to the response with the `200` status
**GET - `/orgs/{org_id}/packages/{purl}/issues` - Updated**
* removed `#/components/schemas/ResourcePathRepresentation, #/components/schemas/PackageRepresentation` from the `data/items/attributes/coordinates/items/representations/items/` response property `anyOf` list for the response status `200`
**POST - `/orgs/{org_id}/packages/issues` - Updated**
* removed `#/components/schemas/ResourcePathRepresentation, #/components/schemas/PackageRepresentation` from the `data/items/attributes/coordinates/items/representations/items/` response property `anyOf` list for the response status `200`
**2024-10-15 - Updated 2025-06-04**
**GET - `/groups/{group_id}/assets/{asset_id}` - Added**
* Get an Asset by its ID
**Required permissions**
* `View Groups (group.read)`
**GET - `/groups/{group_id}/assets/{asset_id}/relationships/projects` - Added**
* List asset projects with pagination
**Required permissions**
* `View Groups (group.read)`
**GET - `/groups/{group_id}/assets/{asset_id}/relationships/assets` - Added**
* List related assets with pagination
**Required permissions**
* `View Groups (group.read)`
**POST - `/groups/{group_id}/assets/search` - Added**
* List Assets with filters
**Required permissions**
* `View Groups (group.read)`
**2024-10-15 - Updated 2025-05-27**
**GET - `/orgs/{org_id}` - Updated**
* the response property `data` became optional for the status `200`
* the response property `jsonapi` became optional for the status `200`
* the response property `links` became optional for the status `200`
* removed the optional property `links/first` from the response with the `200` status
* removed the optional property `links/last` from the response with the `200` status
* removed the optional property `links/next` from the response with the `200` status
* removed the optional property `links/prev` from the response with the `200` status
* removed the optional property `links/related` from the response with the `200` status
* added the non-success response with the status `409`
* added the optional property `data/attributes/created_at` to the response with the `200` status
* added the optional property `data/attributes/updated_at` to the response with the `200` status
* the response property `data/attributes` became required for the status `200`
* the `data/type` response property's pattern `^[a-z][a-z0-9]*(_[a-z][a-z0-9]*)*$` was added for the status `200`
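The new `data/type` pattern can be checked locally. This reproduces the regex from the entry above: lowercase alphanumeric segments, starting with a letter, joined by underscores:

```python
import re

# Pattern added to the data/type response property in this update.
TYPE_PATTERN = re.compile(r"^[a-z][a-z0-9]*(_[a-z][a-z0-9]*)*$")

assert TYPE_PATTERN.match("org")
assert TYPE_PATTERN.match("service_account")
assert not TYPE_PATTERN.match("Org")   # uppercase rejected
assert not TYPE_PATTERN.match("_org")  # segments must start with a letter
```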
**2024-10-15 - Updated 2025-05-20**
**GET - `/orgs/{org_id}/projects` - Updated**
* added the optional property `data/items/attributes/settings/pull_requests/is_enabled` to the response with the `200` status
**PATCH - `/orgs/{org_id}/projects/{project_id}` - Updated**
* added the optional property `data/attributes/settings/pull_requests/is_enabled` to the response with the `200` status
**GET - `/orgs/{org_id}/projects/{project_id}` - Updated**
* added the optional property `data/attributes/settings/pull_requests/is_enabled` to the response with the `200` status
**2024-10-15 - Updated 2025-05-15**
**GET - `/orgs/{org_id}/policies` - Updated**
* added the new optional `query` request parameter `order_by`
* added the new optional `query` request parameter `order_direction`
**2024-10-15 - Updated 2025-05-08**
**GET - `/orgs/{org_id}/packages/{purl}/issues` - Updated**
* added `#/components/schemas/ResourcePathRepresentation, #/components/schemas/PackageRepresentation` to the `data/items/attributes/coordinates/items/representations/items/` response property `anyOf` list for the response status `200`
**POST - `/orgs/{org_id}/packages/issues` - Updated**
* added `#/components/schemas/ResourcePathRepresentation, #/components/schemas/PackageRepresentation` to the `data/items/attributes/coordinates/items/representations/items/` response property `anyOf` list for the response status `200`
**2024-10-15 - Updated 2025-04-28**
**GET - `/orgs/{org_id}/packages/{purl}/issues` - Updated**
* removed `#/components/schemas/ResourcePathRepresentation, #/components/schemas/PackageRepresentation` from the `data/items/attributes/coordinates/items/representations/items/` response property `anyOf` list for the response status `200`
**POST - `/orgs/{org_id}/packages/issues` - Updated**
* removed `#/components/schemas/ResourcePathRepresentation, #/components/schemas/PackageRepresentation` from the `data/items/attributes/coordinates/items/representations/items/` response property `anyOf` list for the response status `200`
**GET - `/orgs/{org_id}/issues` - Updated**
* added the optional property `data/items/attributes/coordinates/items/representations/items/oneOf[subschema #4]/sourceLocation/commit_id` to the response with the `200` status
* added the optional property `data/items/attributes/key_asset` to the response with the `200` status
**GET - `/orgs/{org_id}/issues/{issue_id}` - Updated**
* added the optional property `data/attributes/coordinates/items/representations/items/oneOf[subschema #4]/sourceLocation/commit_id` to the response with the `200` status
* added the optional property `data/attributes/key_asset` to the response with the `200` status
**GET - `/groups/{group_id}/issues` - Updated**
* added the optional property `data/items/attributes/coordinates/items/representations/items/oneOf[subschema #4]/sourceLocation/commit_id` to the response with the `200` status
* added the optional property `data/items/attributes/key_asset` to the response with the `200` status
**GET - `/groups/{group_id}/issues/{issue_id}` - Updated**
* added the optional property `data/attributes/coordinates/items/representations/items/oneOf[subschema #4]/sourceLocation/commit_id` to the response with the `200` status
* added the optional property `data/attributes/key_asset` to the response with the `200` status
**2024-10-15 - Updated 2025-04-25**
**POST - `/tenants/{tenant_id}/brokers/installs/{install_id}/deployments` - Added**
* Creates a new Broker Deployment for an installation
**GET - `/tenants/{tenant_id}/brokers/installs/{install_id}/deployments` - Added**
* List Broker deployments for a given install ID
**PATCH - `/tenants/{tenant_id}/brokers/installs/{install_id}/deployments/{deployment_id}` - Added**
* Updates a Broker deployment for a given install ID
**DELETE - `/tenants/{tenant_id}/brokers/installs/{install_id}/deployments/{deployment_id}` - Added**
* Delete a Broker deployment for a given install ID
**POST - `/tenants/{tenant_id}/brokers/installs/{install_id}/deployments/{deployment_id}/credentials` - Added**
* Creates a new Deployment credential
**GET - `/tenants/{tenant_id}/brokers/installs/{install_id}/deployments/{deployment_id}/credentials` - Added**
* List Deployment credentials for a given deployment ID
**PATCH - `/tenants/{tenant_id}/brokers/installs/{install_id}/deployments/{deployment_id}/credentials/{credential_id}` - Added**
* Updates a Deployment credential for a deployment
**GET - `/tenants/{tenant_id}/brokers/installs/{install_id}/deployments/{deployment_id}/credentials/{credential_id}` - Added**
* Get all Deployment credential data for a deployment
**DELETE - `/tenants/{tenant_id}/brokers/installs/{install_id}/deployments/{deployment_id}/credentials/{credential_id}` - Added**
* Deletes an existing Deployment credential for a deployment
**POST - `/tenants/{tenant_id}/brokers/installs/{install_id}/deployments/{deployment_id}/contexts` - Added**
* Creates a new Broker Context
**GET - `/tenants/{tenant_id}/brokers/installs/{install_id}/deployments/{deployment_id}/contexts` - Added**
* List Deployment contexts for a given deployment ID
**POST - `/tenants/{tenant_id}/brokers/installs/{install_id}/deployments/{deployment_id}/connections` - Added**
* Creates a new Broker connection for a deployment
**GET - `/tenants/{tenant_id}/brokers/installs/{install_id}/deployments/{deployment_id}/connections` - Added**
* List all Broker connections for a given deployment
**DELETE - `/tenants/{tenant_id}/brokers/installs/{install_id}/deployments/{deployment_id}/connections` - Added**
* Deletes all existing Broker connections for a deployment
**PATCH - `/tenants/{tenant_id}/brokers/installs/{install_id}/deployments/{deployment_id}/connections/{connection_id}` - Added**
* Updates a Broker connection for a deployment
**GET - `/tenants/{tenant_id}/brokers/installs/{install_id}/deployments/{deployment_id}/connections/{connection_id}` - Added**
* Get all Broker connection data for a deployment
**DELETE - `/tenants/{tenant_id}/brokers/installs/{install_id}/deployments/{deployment_id}/connections/{connection_id}` - Added**
* Deletes an existing Broker connection for a deployment
**PATCH - `/tenants/{tenant_id}/brokers/installs/{install_id}/contexts/{context_id}` - Added**
* Updates a Broker Context for a deployment
**GET - `/tenants/{tenant_id}/brokers/installs/{install_id}/contexts/{context_id}` - Added**
* List Broker context for a given broker context ID
**DELETE - `/tenants/{tenant_id}/brokers/installs/{install_id}/contexts/{context_id}` - Added**
* Deletes an existing broker context
**DELETE - `/tenants/{tenant_id}/brokers/installs/{install_id}/contexts/{context_id}/integrations/{integration_id}` - Added**
* Deletes an existing Broker context association for an integration
**PATCH - `/tenants/{tenant_id}/brokers/installs/{install_id}/contexts/{context_id}/integration` - Added**
* Updates an integration to be associated with a Broker context
**GET - `/tenants/{tenant_id}/brokers/installs/{install_id}/connections/{connection_id}/contexts` - Added**
* List Broker contexts for a given broker connection ID
**GET - `/tenants/{tenant_id}/brokers/deployments` - Added**
* List Broker deployments for the tenant
**DELETE - `/tenants/{tenant_id}/brokers/connections/{connection_id}/orgs/{org_id}/integrations/{integration_id}` - Added**
* Deletes an existing Broker connection for a deployment
**POST - `/tenants/{tenant_id}/brokers/connections/{connection_id}/orgs/{org_id}/integration` - Added**
* Configures integrations to use the Broker connection for a deployment
**GET - `/tenants/{tenant_id}/brokers/connections/{connection_id}/integrations` - Added**
* Get all integrations using the Broker connection
**POST - `/orgs/{org_id}/policies` - Updated**
* the `data/attributes/action/data/reason` request property's `maxLength` was set to `10000`
* the `data/attributes/name` request property's `maxLength` was set to `255`
**PATCH - `/orgs/{org_id}/policies/{policy_id}` - Updated**
* the `data/attributes/action/data/reason` request property's `maxLength` was set to `10000`
* the `data/attributes/name` request property's `maxLength` was set to `255`
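Clients can pre-validate these limits before sending a policy request. A sketch using a hypothetical helper; the limits come from the entries above, the helper and error strings are illustrative:

```python
# Client-side check against the maxLength limits set in this update:
# 10000 for data/attributes/action/data/reason, 255 for data/attributes/name.
MAX_REASON_LEN = 10_000
MAX_NAME_LEN = 255

def validate_policy_request(name: str, reason: str) -> list[str]:
    """Return a list of validation errors (empty when within limits)."""
    errors = []
    if len(name) > MAX_NAME_LEN:
        errors.append(f"name exceeds {MAX_NAME_LEN} characters")
    if len(reason) > MAX_REASON_LEN:
        errors.append(f"reason exceeds {MAX_REASON_LEN} characters")
    return errors
```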
**GET - `/orgs/{org_id}/brokers/connections` - Added**
* List all Broker connections integrated with a given org
**2024-10-15 - Updated 2025-04-22**
**GET - `/orgs/{org_id}/policies` - Updated**
* added the new optional `query` request parameter `search`
**2024-10-15 - Updated 2025-04-02**
**POST - `/orgs/{org_id}/policies` - Updated**
* added the new optional request property `data/meta`
**2024-10-15 - Updated 2025-04-01**
**POST - `/orgs/{org_id}/memberships` - Updated**
* the request property `data` became required
* the request property `data/relationships` became required
* the request property `data/relationships/org/data` became required
* the request property `data/relationships/org/data/id` became required
* the request property `data/relationships/org/data/type` became required
* the request property `data/relationships/role/data` became required
* the request property `data/relationships/role/data/id` became required
* the request property `data/relationships/role/data/type` became required
* the request property `data/relationships/user/data` became required
* the request property `data/relationships/user/data/id` became required
* the request property `data/relationships/user/data/type` became required
* the request property `data/type` became required
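A membership creation body satisfying every newly required path might look like this. A sketch only: the relationship names are listed above, but the `type` strings are assumptions and the IDs are placeholders:

```python
import json

# Hypothetical membership creation body with every relationship the
# changelog now marks as required. IDs and type strings are placeholders.
membership_request = {
    "data": {
        "type": "org_membership",
        "relationships": {
            "org": {"data": {"id": "<org-uuid>", "type": "org"}},
            "role": {"data": {"id": "<role-uuid>", "type": "org_role"}},
            "user": {"data": {"id": "<user-uuid>", "type": "user"}},
        },
    }
}

payload = json.dumps(membership_request)
```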
**2024-10-15 - Updated 2025-03-19**
**PATCH - `/orgs/{org_id}/memberships/{membership_id}` - Updated**
* the request property `data` became required
**2024-10-15 - Updated 2025-03-04**
**GET - `/orgs/{org_id}/packages/{purl}/issues` - Updated**
* added `#/components/schemas/ResourcePathRepresentation, #/components/schemas/PackageRepresentation` to the `data/items/attributes/coordinates/items/representations/items/` response property `anyOf` list for the response status `200`
**POST - `/orgs/{org_id}/packages/issues` - Updated**
* added `#/components/schemas/ResourcePathRepresentation, #/components/schemas/PackageRepresentation` to the `data/items/attributes/coordinates/items/representations/items/` response property `anyOf` list for the response status `200`
**2024-10-15 - Updated 2025-02-11**
**GET - `/orgs/{org_id}/packages/{purl}/issues` - Updated**
* removed `#/components/schemas/ResourcePathRepresentation, #/components/schemas/PackageRepresentation` from the `data/items/attributes/coordinates/items/representations/items/` response property `anyOf` list for the response status `200`
**POST - `/orgs/{org_id}/packages/issues` - Updated**
* removed `#/components/schemas/ResourcePathRepresentation, #/components/schemas/PackageRepresentation` from the `data/items/attributes/coordinates/items/representations/items/` response property `anyOf` list for the response status `200`
**2024-10-15 - Updated 2025-02-04**
**POST - `/orgs/{org_id}/policies` - Added**
* Create a new org-level policy.
*Org level Policy APIs Access Notice:* Access to the org-level Policy APIs is currently gated by the `snykCodeConsistentIgnores` feature flag; requests without the flag enabled return a 403 Forbidden error. Contact your account representative for eligibility requirements.
**GET - `/orgs/{org_id}/policies` - Added**
* Get all policies for the requested organisation.
*Org level Policy APIs Access Notice:* Access to the org-level Policy APIs is currently gated by the `snykCodeConsistentIgnores` feature flag; requests without the flag enabled return a 403 Forbidden error. Contact your account representative for eligibility requirements.
**PATCH - `/orgs/{org_id}/policies/{policy_id}` - Added**
* Update the org-level policy.
*Org level Policy APIs Access Notice:* Access to the org-level Policy APIs is currently gated by the `snykCodeConsistentIgnores` feature flag; requests without the flag enabled return a 403 Forbidden error. Contact your account representative for eligibility requirements.
**GET - `/orgs/{org_id}/policies/{policy_id}` - Added**
* Get a specific org-level policy based on its ID.
*Org level Policy APIs Access Notice:* Access to the org-level Policy APIs is currently gated by the `snykCodeConsistentIgnores` feature flag; requests without the flag enabled return a 403 Forbidden error. Contact your account representative for eligibility requirements.
**DELETE - `/orgs/{org_id}/policies/{policy_id}` - Added**
* Delete an existing org-level policy.
*Org level Policy APIs Access Notice:* Access to the org-level Policy APIs is currently gated by the `snykCodeConsistentIgnores` feature flag; requests without the flag enabled return a 403 Forbidden error. Contact your account representative for eligibility requirements.
**2024-10-15 - Updated 2025-01-22**
**GET - `/orgs/{org_id}/packages/{purl}/issues` - Updated**
* added `#/components/schemas/ResourcePathRepresentation, #/components/schemas/PackageRepresentation` to the `data/items/attributes/coordinates/items/representations/items/` response property `anyOf` list for the response status `200`
**POST - `/orgs/{org_id}/packages/issues` - Updated**
* added `#/components/schemas/ResourcePathRepresentation, #/components/schemas/PackageRepresentation` to the `data/items/attributes/coordinates/items/representations/items/` response property `anyOf` list for the response status `200`
**2024-10-15 - Updated 2025-01-13**
**GET - `/orgs/{org_id}/issues` - Updated**
* added the optional property `data/items/attributes/exploit_details` to the response with the `200` status
* added the optional property `data/items/attributes/severities` to the response with the `200` status
**GET - `/orgs/{org_id}/issues/{issue_id}` - Updated**
* added the optional property `data/attributes/exploit_details` to the response with the `200` status
* added the optional property `data/attributes/severities` to the response with the `200` status
**GET - `/groups/{group_id}/issues` - Updated**
* added the optional property `data/items/attributes/exploit_details` to the response with the `200` status
* added the optional property `data/items/attributes/severities` to the response with the `200` status
**GET - `/groups/{group_id}/issues/{issue_id}` - Updated**
* added the optional property `data/attributes/exploit_details` to the response with the `200` status
* added the optional property `data/attributes/severities` to the response with the `200` status
**2024-10-15 - Updated 2025-01-07**
**GET - `/orgs/{org_id}/issues` - Updated**
* added `loaded_package` discriminator mapping keys to the `data/items/attributes/risk/factors/items/` response property for the response status `200`
* added `#/components/schemas/LoadedPackageRiskFactor` to the `data/items/attributes/risk/factors/items/` response property `oneOf` list for the response status `200`
**GET - `/orgs/{org_id}/issues/{issue_id}` - Updated**
* added `loaded_package` discriminator mapping keys to the `data/attributes/risk/factors/items/` response property for the response status `200`
* added `#/components/schemas/LoadedPackageRiskFactor` to the `data/attributes/risk/factors/items/` response property `oneOf` list for the response status `200`
**GET - `/groups/{group_id}/issues` - Updated**
* added `loaded_package` discriminator mapping keys to the `data/items/attributes/risk/factors/items/` response property for the response status `200`
* added `#/components/schemas/LoadedPackageRiskFactor` to the `data/items/attributes/risk/factors/items/` response property `oneOf` list for the response status `200`
**GET - `/groups/{group_id}/issues/{issue_id}` - Updated**
* added `loaded_package` discriminator mapping keys to the `data/attributes/risk/factors/items/` response property for the response status `200`
* added `#/components/schemas/LoadedPackageRiskFactor` to the `data/attributes/risk/factors/items/` response property `oneOf` list for the response status `200`
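Because new discriminator mapping keys like `loaded_package` can appear over time, clients iterating risk factors should fall back gracefully on unknown kinds. A sketch; the discriminator key name `factor_type` is an assumption for illustration, not taken from the schema:

```python
# Tolerate the new loaded_package risk factor (and future additions) when
# iterating data/attributes/risk/factors. The "factor_type" key is assumed.
KNOWN_FACTORS = {"loaded_package"}

def summarize_factors(factors: list[dict]) -> list[str]:
    out = []
    for factor in factors:
        kind = factor.get("factor_type", "unknown")
        if kind in KNOWN_FACTORS:
            out.append(f"handled: {kind}")
        else:
            out.append(f"skipped: {kind}")  # forward-compatible fallback
    return out
```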
**2024-10-15 - Updated 2024-12-09**
**GET - `/orgs/{org_id}` - Updated**
* the response property `data` became optional for the status `200` 
* the response property `jsonapi` became optional for the status `200` 
* the response property `links` became optional for the status `200` 
* removed the optional property `links/first` from the response with the `200` status 
* removed the optional property `links/last` from the response with the `200` status 
* removed the optional property `links/next` from the response with the `200` status 
* removed the optional property `links/prev` from the response with the `200` status 
* removed the optional property `links/related` from the response with the `200` status 
* added the non-success response with the status `409`
* added the optional property `data/attributes/created_at` to the response with the `200` status
* added the optional property `data/attributes/updated_at` to the response with the `200` status
* the response property `data/attributes` became required for the status `200`
* the `data/type` response property's pattern `^[a-z][a-z0-9]*(_[a-z][a-z0-9]*)*$` was added for the status `200`
**2024-10-15 - Updated 2024-11-28**
**GET - `/orgs/{org_id}/projects/{project_id}/sbom` - Updated**
* added the new optional `query` request parameter `exclude`
**2024-10-15 - Updated 2024-11-06**
**GET - `/orgs/{org_id}/packages/{purl}/issues` - Updated**
* removed `#/components/schemas/ResourcePathRepresentation, #/components/schemas/PackageRepresentation` from the `data/items/attributes/coordinates/items/representations/items/` response property `anyOf` list for the response status `200`
**POST - `/orgs/{org_id}/packages/issues` - Updated**
* removed `#/components/schemas/ResourcePathRepresentation, #/components/schemas/PackageRepresentation` from the `data/items/attributes/coordinates/items/representations/items/` response property `anyOf` list for the response status `200`
**2024-10-15 - Updated 2024-10-31**
**GET - `/orgs/{org_id}/packages/{purl}/issues` - Updated**
* added `#/components/schemas/ResourcePathRepresentation, #/components/schemas/PackageRepresentation` to the `data/items/attributes/coordinates/items/representations/items/` response property `anyOf` list for the response status `200`
**POST - `/orgs/{org_id}/packages/issues` - Updated**
* added `#/components/schemas/ResourcePathRepresentation, #/components/schemas/PackageRepresentation` to the `data/items/attributes/coordinates/items/representations/items/` response property `anyOf` list for the response status `200`
**2024-10-15 - Updated 2024-10-30**
**GET - `/orgs/{org_id}/issues` - Updated**
* added the new `function` enum value to the `data/items/attributes/coordinates/items/reachability` response property for the response status `200` 
* added the new `no-info` enum value to the `data/items/attributes/coordinates/items/reachability` response property for the response status `200` 
* added the new `not-applicable` enum value to the `data/items/attributes/coordinates/items/reachability` response property for the response status `200` 
* added the new `package` enum value to the `data/items/attributes/coordinates/items/reachability` response property for the response status `200` 
**GET - `/orgs/{org_id}/issues/{issue_id}` - Updated**
* added the new `function` enum value to the `data/attributes/coordinates/items/reachability` response property for the response status `200` 
* added the new `no-info` enum value to the `data/attributes/coordinates/items/reachability` response property for the response status `200` 
* added the new `not-applicable` enum value to the `data/attributes/coordinates/items/reachability` response property for the response status `200` 
* added the new `package` enum value to the `data/attributes/coordinates/items/reachability` response property for the response status `200` 
**GET - `/groups/{group_id}/issues` - Updated**
* added the new `function` enum value to the `data/items/attributes/coordinates/items/reachability` response property for the response status `200` 
* added the new `no-info` enum value to the `data/items/attributes/coordinates/items/reachability` response property for the response status `200` 
* added the new `not-applicable` enum value to the `data/items/attributes/coordinates/items/reachability` response property for the response status `200` 
* added the new `package` enum value to the `data/items/attributes/coordinates/items/reachability` response property for the response status `200` 
**GET - `/groups/{group_id}/issues/{issue_id}` - Updated**
* added the new `function` enum value to the `data/attributes/coordinates/items/reachability` response property for the response status `200` 
* added the new `no-info` enum value to the `data/attributes/coordinates/items/reachability` response property for the response status `200` 
* added the new `not-applicable` enum value to the `data/attributes/coordinates/items/reachability` response property for the response status `200` 
* added the new `package` enum value to the `data/attributes/coordinates/items/reachability` response property for the response status `200` 
**2024-10-15**
**Simplified API Versioning**
Going forward, Snyk will expose one API specification per version date, rather than one per stability level. New versions of the Snyk API will be published only when breaking changes make it necessary. For newer versions, specify only the date, for example `2024-10-15` rather than `2024-10-15~beta`. Existing versions are not affected; this new approach applies only to upcoming versions.
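In practice, this means a request pinned to a newer version carries just the date in its `version` query parameter. A minimal sketch (the org ID and path are illustrative placeholders, not real identifiers):

```python
# Illustrative sketch: composing a Snyk REST API URL pinned to a version-date.
# The org ID and path below are placeholders.
from urllib.parse import urlencode

BASE_URL = "https://api.snyk.io/rest"

def versioned_url(path: str, version: str, **params: str) -> str:
    """Build a request URL for a given version-date.

    With simplified versioning, newer versions take a bare date such as
    "2024-10-15"; the "~beta" stability suffix applies only to older versions.
    """
    query = urlencode({"version": version, **params})
    return f"{BASE_URL}{path}?{query}"

# Newer versions: date only, no "~beta" suffix.
url = versioned_url("/orgs/example-org-id/projects", "2024-10-15")
print(url)
```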
**2024-08-25 - Updated 2024-10-10**
**GET - `/self` - Updated**
* added `#/components/schemas/User20240422, #/components/schemas/ServiceAccount20240422` to the `data/attributes` response property `anyOf` list for the response status `200`
* removed `#/components/schemas/ServiceAccount` from the `data/attributes` response property `anyOf` list for the response status `200`
**GET - `/orgs/{org_id}/projects` - Updated**
* added `#/components/schemas/ProjectRelationshipsTarget20230215` to the `data/items/relationships/target` response property `oneOf` list for the response status `200`
* removed `#/components/schemas/ProjectRelationshipsTarget` from the `data/items/relationships/target` response property `oneOf` list for the response status `200`
**PATCH - `/orgs/{org_id}/projects/{project_id}` - Updated**
* added `#/components/schemas/ProjectRelationshipsTarget20230215` to the `data/relationships/target` response property `oneOf` list for the response status `200`
* removed `#/components/schemas/ProjectRelationshipsTarget` from the `data/relationships/target` response property `oneOf` list for the response status `200`
**GET - `/orgs/{org_id}/projects/{project_id}` - Updated**
* added `#/components/schemas/ProjectRelationshipsTarget20230215` to the `data/relationships/target` response property `oneOf` list for the response status `200`
* removed `#/components/schemas/ProjectRelationshipsTarget` from the `data/relationships/target` response property `oneOf` list for the response status `200`
**GET - `/orgs/{org_id}/packages/{purl}/issues` - Updated**
* removed the optional property `data/items/attributes/coordinates/items/representation` from the response with the `200` status 
* removed the optional property `data/items/attributes/key` from the response with the `200` status 
* removed the optional property `data/items/attributes/slots/exploit` from the response with the `200` status 
* added the optional property `data/items/attributes/severities/items/type` to the response with the `200` status
* added the optional property `data/items/attributes/severities/items/version` to the response with the `200` status
* added the optional property `data/items/attributes/slots/exploit_details` to the response with the `200` status
* added the required property `data/items/attributes/coordinates/items/representations` to the response with the `200` status
**POST - `/orgs/{org_id}/packages/issues` - Updated**
* removed the optional property `data/items/attributes/slots/exploit` from the response with the `200` status 
* added the optional property `data/items/attributes/severities/items/type` to the response with the `200` status
* added the optional property `data/items/attributes/severities/items/version` to the response with the `200` status
* added the optional property `data/items/attributes/slots/exploit_details` to the response with the `200` status
**GET - `/orgs/{org_id}/invites` - Updated**
* the `data/items/attributes/role` response's property type/format changed from `string`/`uuid` to `string` with no format, for the status `200`
* removed the `org_invitation` enum value from the `data/items/type` response property for the response status `200`
**2024-08-25 - Updated 2024-09-11**
**POST - `/orgs/{org_id}/apps` - Updated**
* added the new required request property `name` 
* added the new required request property `redirect_uris` 
* added the new required request property `scopes` 
* removed the request property `data` 
* added the new optional request property `access_token_ttl_seconds`
* added the new optional request property `context`
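Taken together, these changes flatten the request body: the `data` wrapper is gone and the properties listed above sit at the top level. A rough sketch of the resulting payload, assuming the property names from this changelog; all values, including the scope and context strings, are placeholders, not taken from the API reference:

```python
# Rough sketch of a POST /orgs/{org_id}/apps request body after this update.
# All values are placeholders; the scope and context strings are assumed
# examples for illustration only.
payload = {
    # New required properties (replacing the removed "data" wrapper):
    "name": "my-example-app",
    "redirect_uris": ["https://example.com/oauth/callback"],
    "scopes": ["org.read"],          # assumed scope name
    # New optional properties:
    "access_token_ttl_seconds": 3600,
    "context": "tenant",             # assumed value
}

# The three required properties must all be present.
required = {"name", "redirect_uris", "scopes"}
assert required <= payload.keys()
```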
**GET - `/orgs/{org_id}/apps` - Updated**
* the `data/items/attributes/redirect_uris` response property's minItems was decreased from `1` to `0` for the response status `200`
* the response property `data/items/attributes/client_id` became required for the status `200`
* the response property `data/items/attributes/redirect_uris` became required for the status `200`
**PATCH - `/orgs/{org_id}/apps/{client_id}` - Updated**
* the `data/attributes/redirect_uris` response property's minItems was decreased from `1` to `0` for the response status `200`
* removed the request property `data` 
* added the new optional request property `access_token_ttl_seconds`
* added the new optional request property `name`
* added the new optional request property `redirect_uris`
* the response property `data/attributes/client_id` became required for the status `200`
* the response property `data/attributes/redirect_uris` became required for the status `200`
**GET - `/orgs/{org_id}/apps/{client_id}` - Updated**
* the `data/attributes/redirect_uris` response property's minItems was decreased from `1` to `0` for the response status `200`
* the response property `data/attributes/client_id` became required for the status `200`
* the response property `data/attributes/redirect_uris` became required for the status `200`
**2024-08-25 - Updated 2024-09-03**
**POST - `/groups/{group_id}/memberships` - Updated**
* the response property `data/relationships/group` became required for the status `201`
* the response property `data/relationships/group/data/attributes` became required for the status `201`
* the response property `data/relationships/group/data/attributes/name` became required for the status `201`
* the response property `data/relationships/group/data/id` became required for the status `201`
* the response property `data/relationships/role` became required for the status `201`
* the response property `data/relationships/role/data/attributes` became required for the status `201`
* the response property `data/relationships/role/data/attributes/name` became required for the status `201`
* the response property `data/relationships/role/data/id` became required for the status `201`
* the response property `data/relationships/user` became required for the status `201`
* the response property `data/relationships/user/data/attributes` became required for the status `201`
* the response property `data/relationships/user/data/attributes/email` became required for the status `201`
* the response property `data/relationships/user/data/attributes/name` became required for the status `201`
* the response property `data/relationships/user/data/attributes/username` became required for the status `201`
* the response property `data/relationships/user/data/id` became required for the status `201`
**GET - `/groups/{group_id}/memberships` - Updated**
* the response property `data/items/relationships/group` became required for the status `200`
* the response property `data/items/relationships/group/data/attributes` became required for the status `200`
* the response property `data/items/relationships/group/data/attributes/name` became required for the status `200`
* the response property `data/items/relationships/group/data/id` became required for the status `200`
* the response property `data/items/relationships/role` became required for the status `200`
* the response property `data/items/relationships/role/data/attributes` became required for the status `200`
* the response property `data/items/relationships/role/data/attributes/name` became required for the status `200`
* the response property `data/items/relationships/role/data/id` became required for the status `200`
* the response property `data/items/relationships/user` became required for the status `200`
* the response property `data/items/relationships/user/data/attributes` became required for the status `200`
* the response property `data/items/relationships/user/data/attributes/email` became required for the status `200`
* the response property `data/items/relationships/user/data/attributes/name` became required for the status `200`
* the response property `data/items/relationships/user/data/attributes/username` became required for the status `200`
* the response property `data/items/relationships/user/data/id` became required for the status `200`
**2024-08-25 - Updated 2024-08-30**
**POST - `/orgs/{org_id}/memberships` - Updated**
* the response property `data/relationships/org` became required for the status `201`
* the response property `data/relationships/org/data/attributes` became required for the status `201`
* the response property `data/relationships/org/data/attributes/name` became required for the status `201`
* the response property `data/relationships/org/data/id` became required for the status `201`
* the response property `data/relationships/role` became required for the status `201`
* the response property `data/relationships/role/data/attributes` became required for the status `201`
* the response property `data/relationships/role/data/attributes/name` became required for the status `201`
* the response property `data/relationships/role/data/id` became required for the status `201`
* the response property `data/relationships/user` became required for the status `201`
* the response property `data/relationships/user/data/attributes` became required for the status `201`
* the response property `data/relationships/user/data/attributes/email` became required for the status `201`
* the response property `data/relationships/user/data/attributes/name` became required for the status `201`
* the response property `data/relationships/user/data/attributes/username` became required for the status `201`
* the response property `data/relationships/user/data/id` became required for the status `201`
**GET - `/orgs/{org_id}/memberships` - Updated**
* the response property `data/items/relationships/org` became required for the status `200`
* the response property `data/items/relationships/org/data/attributes` became required for the status `200`
* the response property `data/items/relationships/org/data/attributes/name` became required for the status `200`
* the response property `data/items/relationships/org/data/id` became required for the status `200`
* the response property `data/items/relationships/role` became required for the status `200`
* the response property `data/items/relationships/role/data/attributes` became required for the status `200`
* the response property `data/items/relationships/role/data/attributes/name` became required for the status `200`
* the response property `data/items/relationships/role/data/id` became required for the status `200`
* the response property `data/items/relationships/user` became required for the status `200`
* the response property `data/items/relationships/user/data/attributes` became required for the status `200`
* the response property `data/items/relationships/user/data/attributes/email` became required for the status `200`
* the response property `data/items/relationships/user/data/attributes/name` became required for the status `200`
* the response property `data/items/relationships/user/data/attributes/username` became required for the status `200`
* the response property `data/items/relationships/user/data/id` became required for the status `200`
**2024-08-25**
**POST - `/orgs/{org_id}/memberships` - Added**
* Create an org membership for a user with a role
**GET - `/orgs/{org_id}/memberships` - Added**
* Returns all memberships of the org
**PATCH - `/orgs/{org_id}/memberships/{membership_id}` - Added**
* Update an org membership for a user with a role
**DELETE - `/orgs/{org_id}/memberships/{membership_id}` - Added**
* Remove a user's membership of the org.
**GET - `/groups/{group_id}/org_memberships` - Added**
* Get a list of org memberships for a group user
**POST - `/groups/{group_id}/memberships` - Added**
* Create a group membership for a user with a role
**GET - `/groups/{group_id}/memberships` - Added**
* Returns all memberships of the group
**PATCH - `/groups/{group_id}/memberships/{membership_id}` - Added**
* Update the role of a group membership
**DELETE - `/groups/{group_id}/memberships/{membership_id}` - Added**
* Delete a membership from a group
**2024-08-22**
**GET - `/orgs/{org_id}/projects/{project_id}/sbom` - Updated**
* removed the required property `bomFormat` from the response with the `200` status 
* removed the required property `components` from the response with the `200` status 
* removed the required property `dependencies` from the response with the `200` status 
* removed the required property `metadata` from the response with the `200` status 
* removed the required property `specVersion` from the response with the `200` status 
* removed the required property `version` from the response with the `200` status 
* removed the optional property `components` from the response with the `200` status 
* added the new enum value `cyclonedx1.5+json` to the `query` request parameter `format`
* added the new enum value `cyclonedx1.5+xml` to the `query` request parameter `format`
* added the new enum value `cyclonedx1.6+json` to the `query` request parameter `format`
* added the new enum value `cyclonedx1.6+xml` to the `query` request parameter `format`
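Selecting one of the newly supported CycloneDX versions is a matter of setting the `format` query parameter. A sketch of building such a request URL (org and project IDs are placeholders; note that the `+` in the format value must be percent-encoded in a URL, which `urlencode` does automatically):

```python
# Illustrative sketch: requesting a project SBOM in a specific CycloneDX
# format. The org and project IDs are placeholders.
from urllib.parse import urlencode

def sbom_url(org_id: str, project_id: str, version: str, fmt: str) -> str:
    """Build the SBOM endpoint URL; urlencode percent-encodes "+" as %2B."""
    query = urlencode({"version": version, "format": fmt})
    return (f"https://api.snyk.io/rest/orgs/{org_id}"
            f"/projects/{project_id}/sbom?{query}")

url = sbom_url("example-org", "example-project", "2024-08-22",
               "cyclonedx1.6+json")
print(url)
```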
**2024-08-15**
**GET - `/orgs/{org_id}/audit_logs/search` - Updated**
* for the `query` request parameter `size`, default value `100.00` was added 
**GET - `/groups/{group_id}/audit_logs/search` - Updated**
* for the `query` request parameter `size`, default value `100.00` was added 
**2024-06-21 - Updated 2024-06-27**
**POST - `/orgs/{org_id}/collections` - Updated**
* the `data/attributes/name` response property's maxLength was unset from `255` for the response status `201` 
* the `data/attributes/name` response property's minLength was decreased from `1` to `0` for the response status `201` 
* the `data/attributes/name` response's property pattern `^([a-zA-Z0-9 _\-\/:.])+$` was removed for the status `201`
**GET - `/orgs/{org_id}/collections` - Updated**
* the `data/items/attributes/name` response property's maxLength was unset from `255` for the response status `200` 
* the `data/items/attributes/name` response property's minLength was decreased from `1` to `0` for the response status `200` 
* the `data/items/attributes/name` response's property pattern `^([a-zA-Z0-9 _\-\/:.])+$` was removed for the status `200`
**PATCH - `/orgs/{org_id}/collections/{collection_id}` - Updated**
* the `data/attributes/name` response property's maxLength was unset from `255` for the response status `200` 
* the `data/attributes/name` response property's minLength was decreased from `1` to `0` for the response status `200` 
* the `data/attributes/name` response's property pattern `^([a-zA-Z0-9 _\-\/:.])+$` was removed for the status `200`
**GET - `/orgs/{org_id}/collections/{collection_id}` - Updated**
* the `data/attributes/name` response property's maxLength was unset from `255` for the response status `200` 
* the `data/attributes/name` response property's minLength was decreased from `1` to `0` for the response status `200` 
* the `data/attributes/name` response's property pattern `^([a-zA-Z0-9 _\-\/:.])+$` was removed for the status `200`
**2024-06-21 - Updated 2024-06-25**
**PATCH - `/orgs/{org_id}` - Updated**
* request property `data/type` was restricted to a list of enum values 
* the request property `data/attributes` became required 
* the request property `data/id` became required 
* the request property `data/type` became required 
* added the new `org` enum value to the `data/type` response property for the response status `200` 
* added the new `org` enum value to the request property `data/type`
* removed the pattern `^[a-z][a-z0-9]*(_[a-z][a-z0-9]*)*$` from the request property `data/type`
* the `data/type` response's property pattern `^[a-z][a-z0-9]*(_[a-z][a-z0-9]*)*$` was removed for the status `200`
**2024-06-21**
**POST - `/orgs/{org_id}/invites` - Updated**
* removed the request property `data/relationships` 
**2024-06-18**
**POST - `/groups/{group_id}/settings/pull_request_template` - Updated**
* removed the request property `data/attributes/branch_name` 
* removed the optional property `data/attributes/branch_name` from the response with the `201` status 
**GET - `/groups/{group_id}/settings/pull_request_template` - Updated**
* removed the optional property `data/attributes/branch_name` from the response with the `200` status 
**2024-06-06**
**GET - `/orgs/{org_id}/projects` - Updated**
* removed the optional property `data/items/attributes/settings/auto_dependency_upgrade/is_inherited` from the response with the `200` status 
**PATCH - `/orgs/{org_id}/projects/{project_id}` - Updated**
* removed the optional property `data/attributes/settings/auto_dependency_upgrade/is_inherited` from the response with the `200` status 
**GET - `/orgs/{org_id}/projects/{project_id}` - Updated**
* removed the optional property `data/attributes/settings/auto_dependency_upgrade/is_inherited` from the response with the `200` status 
**2024-05-23**
**DELETE - `/self/apps/installs/{install_id}` - Updated**
* API operation ID `deleteUserAppInstallByID` was removed and replaced with `deleteUserAppInstallById`
**DELETE - `/orgs/{org_id}/apps/installs/{install_id}` - Updated**
* API operation ID `deleteAppOrgInstallByID` was removed and replaced with `deleteAppOrgInstallById`
**DELETE - `/groups/{group_id}/apps/installs/{install_id}` - Updated**
* API operation ID `deleteGroupAppInstallByID` was removed and replaced with `deleteGroupAppInstallById`
**2024-05-08**
**POST - `/groups/{group_id}/settings/pull_request_template` - Added**
* Configures a group-level pull request template that will be used for any org or project within that group
**GET - `/groups/{group_id}/settings/pull_request_template` - Added**
* Get your group's pull request template
**DELETE - `/groups/{group_id}/settings/pull_request_template` - Added**
* Delete your group's pull request template. Snyk pull requests will then use the default template for this group.
**2024-04-29**
**GET - `/orgs/{org_id}/audit_logs/search` - Updated**
* deleted the `query` request parameter `event` 
* deleted the `query` request parameter `exclude_event` 
* added the new optional `query` request parameter `events`
* added the new optional `query` request parameter `exclude_events`
**GET - `/groups/{group_id}/audit_logs/search` - Updated**
* deleted the `query` request parameter `event` 
* deleted the `query` request parameter `exclude_event` 
* added the new optional `query` request parameter `events`
* added the new optional `query` request parameter `exclude_events`
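With the singular `event` and `exclude_event` parameters replaced by plural forms, a filter can carry multiple values. A sketch of building such a query, assuming multiple values are passed as repeated query parameters (that encoding is an assumption here, and the IDs and event names are placeholders drawn from the supported-event list below):

```python
# Illustrative sketch: filtering an audit log search with the new plural
# "events" parameter. Passing multiple values as repeated query parameters
# (doseq=True) is an assumption for illustration.
from urllib.parse import urlencode

def audit_search_url(org_id: str, version: str, events: list[str]) -> str:
    """Build an org audit-log search URL with one or more event filters."""
    query = urlencode({"version": version, "events": events}, doseq=True)
    return f"https://api.snyk.io/rest/orgs/{org_id}/audit_logs/search?{query}"

url = audit_search_url("example-org", "2024-04-29",
                       ["org.project.add", "org.project.delete"])
print(url)
```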
**2024-04-22**
**GET - `/self` - Added**
* Retrieves information about the user making the request.
**GET - `/orgs/{org_id}/projects` - Updated**
* added `#/components/schemas/Relationship, #/components/schemas/ProjectRelationshipsTarget` to the `data/items/relationships/target` response property `oneOf` list for the response status `200`
**PATCH - `/orgs/{org_id}/projects/{project_id}` - Updated**
* added `#/components/schemas/Relationship, #/components/schemas/ProjectRelationshipsTarget` to the `data/relationships/target` response property `oneOf` list for the response status `200`
**GET - `/orgs/{org_id}/projects/{project_id}` - Updated**
* added `#/components/schemas/Relationship, #/components/schemas/ProjectRelationshipsTarget` to the `data/relationships/target` response property `oneOf` list for the response status `200`
**2024-02-28**
**GET - `/orgs` - Updated**
* for the `query` request parameter `name`, the maxLength was set to `100` 
* for the `query` request parameter `slug`, the maxLength was set to `100` 
* added the pattern `^[\w.-]+$` to the `query` request parameter `slug` 
* added the optional property `data/items/attributes/access_requests_enabled` to the response with the `200` status
**PATCH - `/orgs/{org_id}` - Updated**
* added the optional property `data/attributes/access_requests_enabled` to the response with the `200` status
**GET - `/orgs/{org_id}` - Updated**
* added the optional property `data/attributes/access_requests_enabled` to the response with the `200` status
**GET - `/groups/{group_id}/orgs` - Added**
* Get a paginated list of all the organizations belonging to the group. By default, this endpoint returns the organizations in alphabetical order of their name.
**2024-02-21**
**GET - `/orgs/{org_id}/targets` - Added**
* Get a list of an organization's targets.
**GET - `/orgs/{org_id}/targets/{target_id}` - Added**
* Get a specified target for an organization.
**DELETE - `/orgs/{org_id}/targets/{target_id}` - Added**
* Delete the specified target.
**GET - `/orgs/{org_id}/projects` - Updated**
* removed `#/components/schemas/Relationship, #/components/schemas/ProjectRelationshipsTarget` from the `data/items/relationships/target` response property `oneOf` list for the response status `200`
**PATCH - `/orgs/{org_id}/projects/{project_id}` - Updated**
* removed `#/components/schemas/Relationship, #/components/schemas/ProjectRelationshipsTarget` from the `data/relationships/target` response property `oneOf` list for the response status `200`
**GET - `/orgs/{org_id}/projects/{project_id}` - Updated**
* removed `#/components/schemas/Relationship, #/components/schemas/ProjectRelationshipsTarget` from the `data/relationships/target` response property `oneOf` list for the response status `200`
**2024-01-23**
**GET - `/orgs/{org_id}/issues` - Added**
* Get a list of an organization's issues.
**GET - `/orgs/{org_id}/issues/{issue_id}` - Added**
* Get an issue
**GET - `/groups/{group_id}/issues` - Added**
* Get a list of a group's issues.
**GET - `/groups/{group_id}/issues/{issue_id}` - Added**
* Get an issue
**2024-01-04**
**POST - `/custom_base_images` - Updated**
* removed `#/components/schemas/VersioningSchemaDateType` from the `data/attributes/versioning_schema` request property `oneOf` list 
* removed `#/components/schemas/VersioningSchemaDateType` from the `data/attributes/versioning_schema` response property `oneOf` list for the response status `201`
**PATCH - `/custom_base_images/{custombaseimage_id}` - Updated**
* removed `#/components/schemas/VersioningSchemaDateType` from the `data/attributes/versioning_schema` request property `oneOf` list 
* removed `#/components/schemas/VersioningSchemaDateType` from the `data/attributes/versioning_schema` response property `oneOf` list for the response status `200`
**GET - `/custom_base_images/{custombaseimage_id}` - Updated**
* removed `#/components/schemas/VersioningSchemaDateType` from the `data/attributes/versioning_schema` response property `oneOf` list for the response status `200`
**2023-11-06**
**DELETE - `/orgs/{org_id}/projects/{project_id}` - Added**
* Delete one project in the organization by project ID.
**2023-11-03**
**GET - `/self/apps/{app_id}/sessions` - Added**
* Get a list of active OAuth sessions for the app.
**DELETE - `/self/apps/{app_id}/sessions/{session_id}` - Added**
* Revoke an active user app session.
**GET - `/self/apps/installs` - Added**
* Get a list of apps installed for a user.
**DELETE - `/self/apps/installs/{install_id}` - Added**
* Revoke access for an app by install ID.
**POST - `/orgs/{org_id}/apps` - Updated**
* added the new required request property `data` 
* removed the request property `access_token_ttl_seconds` 
* removed the request property `context` 
* removed the request property `name` 
* removed the request property `redirect_uris` 
* removed the request property `scopes` 
**GET - `/orgs/{org_id}/apps` - Updated**
* the response property `data/items/attributes/client_id` became optional for the status `200` 
* the response property `data/items/attributes/redirect_uris` became optional for the status `200` 
**PATCH - `/orgs/{org_id}/apps/{client_id}` - Updated**
* added the new required request property `data` 
* the response property `data/attributes/client_id` became optional for the status `200` 
* the response property `data/attributes/redirect_uris` became optional for the status `200` 
* removed the request property `access_token_ttl_seconds` 
* removed the request property `name` 
* removed the request property `redirect_uris` 
**GET - `/orgs/{org_id}/apps/{client_id}` - Updated**
* the response property `data/attributes/client_id` became optional for the status `200` 
* the response property `data/attributes/redirect_uris` became optional for the status `200` 
**POST - `/orgs/{org_id}/apps/installs` - Added**
* Install a Snyk App in this organization. The Snyk App must use unattended authentication, for example client credentials.
**GET - `/orgs/{org_id}/apps/installs` - Added**
* Get a list of apps installed for an organization.
**DELETE - `/orgs/{org_id}/apps/installs/{install_id}` - Added**
* Revoke app authorization for a Snyk Organization by install ID.
**POST - `/orgs/{org_id}/apps/installs/{install_id}/secrets` - Added**
* Manage client secret for non-interactive Snyk App installations.
**POST - `/orgs/{org_id}/apps/creations` - Added**
* Create a new Snyk App for an organization.
**GET - `/orgs/{org_id}/apps/creations` - Added**
* Get a list of apps created by an organization.
**PATCH - `/orgs/{org_id}/apps/creations/{app_id}` - Added**
* Update app creation attributes with App ID.
**GET - `/orgs/{org_id}/apps/creations/{app_id}` - Added**
* Get a Snyk App by its App ID.
**DELETE - `/orgs/{org_id}/apps/creations/{app_id}` - Added**
* Delete an app by its App ID.
**POST - `/orgs/{org_id}/apps/creations/{app_id}/secrets` - Added**
* Manage client secret for the Snyk App.
**POST - `/groups/{group_id}/apps/installs` - Added**
* Install a Snyk App in this group. The Snyk App must use unattended authentication, for example client credentials.
**GET - `/groups/{group_id}/apps/installs` - Added**
* Get a list of apps installed for a group.
**DELETE - `/groups/{group_id}/apps/installs/{install_id}` - Added**
* Revoke app authorization for a Snyk Group by install ID.
**POST - `/groups/{group_id}/apps/installs/{install_id}/secrets` - Added**
* Manage client secret for non-interactive Snyk App installations.
**2023-11-02**
**GET - `/orgs/{org_id}/container_images` - Added**
* List instances of container image
**GET - `/orgs/{org_id}/container_images/{image_id}` - Added**
* Get instance of container image
**GET - `/orgs/{org_id}/container_images/{image_id}/relationships/image_target_refs` - Added**
* List instances of image target references for a container image
**2023-09-13**
**GET - `/orgs/{org_id}/projects` - Updated**
* added `#/components/schemas/Relationship, #/components/schemas/ProjectRelationshipsTarget` to the `data/items/relationships/target` response property `oneOf` list for the response status `200`
**PATCH - `/orgs/{org_id}/projects/{project_id}` - Updated**
* added `#/components/schemas/Relationship, #/components/schemas/ProjectRelationshipsTarget` to the `data/relationships/target` response property `oneOf` list for the response status `200`
**GET - `/orgs/{org_id}/projects/{project_id}` - Updated**
* added `#/components/schemas/Relationship, #/components/schemas/ProjectRelationshipsTarget` to the `data/relationships/target` response property `oneOf` list for the response status `200`
**2023-09-12**
**GET - `/orgs/{org_id}/projects` - Updated**
* removed `#/components/schemas/Relationship, #/components/schemas/ProjectRelationshipsTarget` from the `data/items/relationships/target` response property `oneOf` list for the response status `200`
**PATCH - `/orgs/{org_id}/projects/{project_id}` - Updated**
* removed `#/components/schemas/Relationship, #/components/schemas/ProjectRelationshipsTarget` from the `data/relationships/target` response property `oneOf` list for the response status `200`
**GET - `/orgs/{org_id}/projects/{project_id}` - Updated**
* removed `#/components/schemas/Relationship, #/components/schemas/ProjectRelationshipsTarget` from the `data/relationships/target` response property `oneOf` list for the response status `200`
**POST - `/orgs/{org_id}/collections` - Added**
* Create a collection
**GET - `/orgs/{org_id}/collections` - Added**
* Return a list of an organization's collections with issue counts and project counts.
**PATCH - `/orgs/{org_id}/collections/{collection_id}` - Added**
* Edit a collection
**GET - `/orgs/{org_id}/collections/{collection_id}` - Added**
* Get a collection
**DELETE - `/orgs/{org_id}/collections/{collection_id}` - Added**
* Delete a collection
**POST - `/orgs/{org_id}/collections/{collection_id}/relationships/projects` - Added**
* Add projects to a collection by specifying an array of project ids
**GET - `/orgs/{org_id}/collections/{collection_id}/relationships/projects` - Added**
* Return a list of an organization's projects that are in the specified collection.
**DELETE - `/orgs/{org_id}/collections/{collection_id}/relationships/projects` - Added**
* Remove projects from a collection by specifying an array of project ids
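As a sketch of how the collections endpoints can be called: the request below creates a collection. The JSON:API body shape (`"type": "collection"` with a `name` attribute) is an assumption to verify against the API reference, and `ORG_ID` and the version date are placeholders.

```shell
#!/bin/sh
# Sketch: create a collection via the REST API. The body shape is an
# assumption; ORG_ID and the version date are placeholders.
ORG_ID="example-org-id"
API_VERSION="2023-09-12"
COLLECTIONS_URL="https://api.snyk.io/rest/orgs/${ORG_ID}/collections?version=${API_VERSION}"

# Only call the API when a token is available.
if [ -n "${SNYK_TOKEN:-}" ]; then
  curl -s -X POST "$COLLECTIONS_URL" \
    -H "Authorization: token ${SNYK_TOKEN}" \
    -H "Content-Type: application/vnd.api+json" \
    -d '{"data": {"type": "collection", "attributes": {"name": "My collection"}}}'
fi
echo "$COLLECTIONS_URL"
```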
**2023-09-11**
**PATCH - `/orgs/{org_id}/settings/sast` - Added**
* Enable/Disable the Snyk Code settings for an org
**GET - `/orgs/{org_id}/audit_logs/search` - Added**
* Search audit logs for an Organization. Supported event types:
* api.access
* org.app\_bot.create
* org.app.create
* org.app.delete
* org.app.edit
* org.cloud\_config.settings.edit
* org.collection.create
* org.collection.delete
* org.collection.edit
* org.create
* org.delete
* org.edit
* org.ignore\_policy.edit
* org.integration.create
* org.integration.delete
* org.integration.edit
* org.integration.settings.edit
* org.language\_settings.edit
* org.notification\_settings.edit
* org.org\_source.create
* org.org\_source.delete
* org.org\_source.edit
* org.policy.edit
* org.project\_filter.create
* org.project\_filter.delete
* org.project.add
* org.project.attributes.edit
* org.project.delete
* org.project.edit
* org.project.fix\_pr.auto\_open
* org.project.fix\_pr.manual\_open
* org.project.ignore.create
* org.project.ignore.delete
* org.project.ignore.edit
* org.project.monitor
* org.project.pr\_check.edit
* org.project.remove
* org.project.settings.delete
* org.project.settings.edit
* org.project.stop\_monitor
* org.project.tag.add
* org.project.tag.remove
* org.project.test
* org.request\_access\_settings.edit
* org.sast\_settings.edit
* org.service\_account.create
* org.service\_account.delete
* org.service\_account.edit
* org.settings.feature\_flag.edit
* org.target.create
* org.target.delete
* org.user.add
* org.user.invite
* org.user.invite.accept
* org.user.invite.revoke
* org.user.invite\_link.accept
* org.user.invite\_link.create
* org.user.invite\_link.revoke
* org.user.leave
* org.user.provision.accept
* org.user.provision.create
* org.user.provision.delete
* org.user.remove
* org.user.role.create
* org.user.role.delete
* org.user.role.details.edit
* org.user.role.edit
* org.user.role.permissions.edit
* org.webhook.add
* org.webhook.delete
* user.org.notification\_settings.edit
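A minimal sketch of searching these audit logs, assuming token authentication. `ORG_ID` and the version date are placeholders, and the `events` filter parameter name is an assumption to verify against the API reference.

```shell
#!/bin/sh
# Sketch: search an Organization's audit logs for project-deletion events.
# ORG_ID and the version date are placeholders; the "events" parameter
# name is an assumption -- check the API reference.
ORG_ID="example-org-id"
API_VERSION="2023-09-11"
AUDIT_URL="https://api.snyk.io/rest/orgs/${ORG_ID}/audit_logs/search?version=${API_VERSION}&events=org.project.delete"

# Only call the API when a token is available.
if [ -n "${SNYK_TOKEN:-}" ]; then
  curl -s -H "Authorization: token ${SNYK_TOKEN}" "$AUDIT_URL"
fi
echo "$AUDIT_URL"
```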
**GET - `/groups/{group_id}/audit_logs/search` - Added**
* Search audit logs for a Group. Some Organization level events are supported as well as the following Group level events:
* api.access
* group.cloud\_config.settings.edit
* group.create
* group.delete
* group.edit
* group.notification\_settings.edit
* group.org.add
* group.org.remove
* group.policy.create
* group.policy.delete
* group.policy.edit
* group.request\_access\_settings.edit
* group.role.create
* group.role.delete
* group.role.edit
* group.service\_account.create
* group.service\_account.delete
* group.service\_account.edit
* group.settings.edit
* group.settings.feature\_flag.edit
* group.sso.add
* group.sso.auth0\_connection.create
* group.sso.auth0\_connection.edit
* group.sso.create
* group.sso.delete
* group.sso.edit
* group.sso.membership.sync
* group.sso.remove
* group.tag.create
* group.tag.delete
* group.user.add
* group.user.remove
* group.user.role.edit
**2023-09-07**
**POST - `/orgs/{org_id}/service_accounts` - Added**
* Create a service account for an organization. The service account can be used to access the Snyk API.
**GET - `/orgs/{org_id}/service_accounts` - Added**
* Get all service accounts for an organization.
**PATCH - `/orgs/{org_id}/service_accounts/{serviceaccount_id}` - Added**
* Update the name of an organization-level service account by its ID.
**GET - `/orgs/{org_id}/service_accounts/{serviceaccount_id}` - Added**
* Get an organization-level service account by its ID.
**DELETE - `/orgs/{org_id}/service_accounts/{serviceaccount_id}` - Added**
* Delete a service account in an organization.
**POST - `/orgs/{org_id}/service_accounts/{serviceaccount_id}/secrets` - Added**
* Manage the client secret of an organization service account by the service account ID.
**POST - `/groups/{group_id}/service_accounts` - Added**
* Create a service account for a group. The service account can be used to access the Snyk API.
**GET - `/groups/{group_id}/service_accounts` - Added**
* Get all service accounts for a group.
**PATCH - `/groups/{group_id}/service_accounts/{serviceaccount_id}` - Added**
* Update the name of a group's service account by its ID.
**GET - `/groups/{group_id}/service_accounts/{serviceaccount_id}` - Added**
* Get a group-level service account by its ID.
**DELETE - `/groups/{group_id}/service_accounts/{serviceaccount_id}` - Added**
* Permanently delete a group-level service account by its ID.
**POST - `/groups/{group_id}/service_accounts/{serviceaccount_id}/secrets` - Added**
* Manage the client secret of a group service account by the service account ID.
**2023-08-28**
**GET - `/orgs/{org_id}/projects` - Updated**
* added the new optional `query` request parameter `names_start_with`
* added the new optional `query` request parameter `target_file`
* added the new optional `query` request parameter `target_reference`
* added the new optional `query` request parameter `target_runtime`
**PATCH - `/orgs/{org_id}/projects/{project_id}` - Updated**
* deleted the `query` request parameter `user_id` 
**2023-08-21**
**POST - `/orgs/{org_id}/packages/issues` - Updated**
* added the optional property `meta` to the response with the `200` status
**POST - `/custom_base_images` - Added**
* To create a custom base image, you must first import your base images into Snyk. You can do this through the CLI, UI, or API.
This endpoint marks an image as a custom base image, which adds it to the pool of images from which Snyk can recommend base image upgrades.
Note that after the first image in a repository is added, a versioning schema cannot be passed to this endpoint. To update the versioning schema, use the PATCH endpoint.
**GET - `/custom_base_images` - Added**
* Get a list of custom base images with support for ordering and filtering. Either the `org_id` or the `group_id` parameter must be set to authorize successfully.
**PATCH - `/custom_base_images/{custombaseimage_id}` - Added**
* Updates a custom base image's attributes
**GET - `/custom_base_images/{custombaseimage_id}` - Added**
* Get a custom base image
**DELETE - `/custom_base_images/{custombaseimage_id}` - Added**
* Delete a custom base image resource (the related container project is unaffected).
**2023-06-22**
**GET - `/orgs/{org_id}/settings/sast` - Added**
* Retrieves the SAST settings for an org
**2023-05-29**
**GET - `/orgs` - Added**
* Get a paginated list of organizations you have access to.
**PATCH - `/orgs/{org_id}` - Added**
* Update the details of an organization
**GET - `/orgs/{org_id}` - Added**
* Get the full details of an organization.
**2023-04-28**
**POST - `/orgs/{org_id}/invites` - Updated**
* added the new required request property `data` 
* the `data/attributes/role` response property type/format changed from `string` (no format) to `string`/`uuid` for the response status `201`
* removed the request property `email` 
* removed the request property `role` 
* added the new `org_invitation` enum value to the `data/type` response property for the response status `201` 
**GET - `/orgs/{org_id}/invites` - Updated**
* the `data/items/attributes/role` response property type/format changed from `string` (no format) to `string`/`uuid` for the response status `200`
* added the new `org_invitation` enum value to the `data/items/type` response property for the response status `200` 
**2023-04-17**
**POST - `/orgs/{org_id}/packages/issues` - Added**
* This endpoint is not available to all customers. If you are interested, contact support. Query issues for a batch of packages identified by Package URL (purl). Only direct vulnerabilities are returned; transitive vulnerabilities (from dependencies) are not returned because they can vary depending on context.
**2023-03-20**
**GET - `/orgs/{org_id}/projects/{project_id}/sbom` - Added**
* This endpoint lets you retrieve the SBOM document of a software project. It supports the following formats:
* CycloneDX version 1.4 in JSON (set `format` to `cyclonedx1.4+json`).
* CycloneDX version 1.4 in XML (set `format` to `cyclonedx1.4+xml`).
* SPDX version 2.3 in JSON (set `format` to `spdx2.3+json`).
By default it will respond with an empty JSON:API response.
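As a sketch of how the `format` parameter is used: the request below exports a Project's SBOM as CycloneDX 1.4 JSON. `ORG_ID`, `PROJECT_ID`, and the version date are placeholders, and the `+` in the format value must be percent-encoded as `%2B` in a query string.

```shell
#!/bin/sh
# Sketch: export a Project's SBOM as CycloneDX 1.4 JSON. The IDs and the
# version date are placeholders; note the percent-encoded "+" in "format".
ORG_ID="example-org-id"
PROJECT_ID="example-project-id"
API_VERSION="2023-03-20"
SBOM_URL="https://api.snyk.io/rest/orgs/${ORG_ID}/projects/${PROJECT_ID}/sbom?version=${API_VERSION}&format=cyclonedx1.4%2Bjson"

# Only call the API when a token is available.
if [ -n "${SNYK_TOKEN:-}" ]; then
  curl -s -H "Authorization: token ${SNYK_TOKEN}" "$SBOM_URL" > sbom.json
fi
echo "$SBOM_URL"
```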
**2023-02-15**
**GET - `/orgs/{org_id}/projects` - Added**
* List all Projects for an Org.
**PATCH - `/orgs/{org_id}/projects/{project_id}` - Added**
* Updates one project of the organization by project ID.
**GET - `/orgs/{org_id}/projects/{project_id}` - Added**
* Get one project of the organization by project ID.
**2022-12-14**
**POST - `/orgs/{org_id}/slack_app/{bot_id}` - Added**
* Create new Slack notification default settings for a given tenant.
**GET - `/orgs/{org_id}/slack_app/{bot_id}` - Added**
* Get Slack integration default notification settings for the provided tenant ID and bot ID.
**DELETE - `/orgs/{org_id}/slack_app/{bot_id}` - Added**
* Remove the given Slack App integration
**GET - `/orgs/{org_id}/slack_app/{bot_id}/projects` - Added**
* Slack notification settings overrides for projects. These settings override the default settings configured for the tenant.
**POST - `/orgs/{org_id}/slack_app/{bot_id}/projects/{project_id}` - Added**
* Create Slack settings override for a project.
**PATCH - `/orgs/{org_id}/slack_app/{bot_id}/projects/{project_id}` - Added**
* Update Slack notification settings for a project.
**DELETE - `/orgs/{org_id}/slack_app/{bot_id}/projects/{project_id}` - Added**
* Remove Slack settings override for a project.
**2022-11-14**
**GET - `/orgs/{org_id}/invites` - Added**
* List pending user invitations to an organization.
**DELETE - `/orgs/{org_id}/invites/{invite_id}` - Added**
* Cancel a pending user invitation to an organization.
**2022-11-07**
**GET - `/orgs/{org_id}/slack_app/{tenant_id}/channels` - Added**
* Requires the Snyk Slack App to be set up for this org; retrieves a list of channels the Snyk Slack App can access. Note that it is currently only possible to page forward through this collection: no prev links are generated, and the `ending_before` parameter does not function.
**GET - `/orgs/{org_id}/slack_app/{tenant_id}/channels/{channel_id}` - Added**
* Requires the Snyk Slack App to be set up for this org. It will return the Slack channel name for the provided Slack channel ID.
**2022-09-15**
**GET - `/orgs/{org_id}/packages/{purl}/issues` - Added**
* Query issues for a specific package version identified by Package URL (purl). Snyk returns only direct vulnerabilities. Transitive vulnerabilities (from dependencies) are not returned because they can vary depending on context.
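A sketch of calling this endpoint: because the purl sits in the URL path, it must be percent-encoded. `ORG_ID` and the version date are placeholders, and the `lodash` purl is just an example package.

```shell
#!/bin/sh
# Sketch: query direct vulnerabilities for one package version by its purl.
# ORG_ID and the version date are placeholders.
ORG_ID="example-org-id"
API_VERSION="2022-09-15"
PURL="pkg:npm/lodash@4.17.15"

# Percent-encode the path-unsafe characters in the purl.
ENCODED_PURL=$(printf '%s' "$PURL" | sed -e 's/:/%3A/g' -e 's,/,%2F,g' -e 's/@/%40/g')
ISSUES_URL="https://api.snyk.io/rest/orgs/${ORG_ID}/packages/${ENCODED_PURL}/issues?version=${API_VERSION}"

# Only call the API when a token is available.
if [ -n "${SNYK_TOKEN:-}" ]; then
  curl -s -H "Authorization: token ${SNYK_TOKEN}" "$ISSUES_URL"
fi
echo "$ENCODED_PURL"
```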
**2022-06-01**
**POST - `/orgs/{org_id}/invites` - Added**
* Invite a user to an organization with a role.
**2022-03-11**
**GET - `/self/apps` - Added**
* Get a list of apps that can act on your behalf.
**DELETE - `/self/apps/{app_id}` - Added**
* Revoke access for an app by app id
**POST - `/orgs/{org_id}/apps` - Added**
* Create a new app for an organization. Deprecated, use /orgs/{org\_id}/apps/creations instead.
**GET - `/orgs/{org_id}/apps` - Added**
* Get a list of apps created by an organization. Deprecated, use /orgs/{org\_id}/apps/creations instead.
**PATCH - `/orgs/{org_id}/apps/{client_id}` - Added**
* Update app attributes. Deprecated, use /orgs/{org\_id}/apps/creations/{app\_id} instead.
**GET - `/orgs/{org_id}/apps/{client_id}` - Added**
* Get an App by client id. Deprecated, use /orgs/{org\_id}/apps/creations/{app\_id} instead.
**DELETE - `/orgs/{org_id}/apps/{client_id}` - Added**
* Delete an app by app id. Deprecated, use /orgs/{org\_id}/apps/creations/{app\_id} instead.
**POST - `/orgs/{org_id}/apps/{client_id}/secrets` - Added**
* Manage client secrets for an app. Deprecated, use /orgs/{org\_id}/apps/creations/{app\_id}/secrets instead.
**GET - `/orgs/{org_id}/app_bots` - Added**
* Get a list of app bots authorized to an organization. Deprecated, use /orgs/{org\_id}/apps/installs instead.
**DELETE - `/orgs/{org_id}/app_bots/{bot_id}` - Added**
* Revoke app bot authorization. Deprecated, use /orgs/{org\_id}/apps/installs/{install\_id} instead.
**2021-12-09**
**PATCH - `/orgs/{org_id}/settings/iac` - Added**
* Update the Infrastructure as Code Settings for an org.
**GET - `/orgs/{org_id}/settings/iac` - Added**
* Get the Infrastructure as Code Settings for an org.
**PATCH - `/groups/{group_id}/settings/iac` - Added**
* Update the Infrastructure as Code Settings for a group.
**GET - `/groups/{group_id}/settings/iac` - Added**
* Get the Infrastructure as Code Settings for a group.
---
# Source: https://docs.snyk.io/implementation-and-setup/enterprise-setup/snyk-broker/classic-broker/install-and-configure-snyk-broker/advanced-configuration-for-snyk-broker-docker-installation/changing-the-auth-method-with-docker.md
# Changing the auth method with Docker
Each integration has an auth method set by default, with the exact method varying by service.
BitBucket Server and Data Center, for example, uses Basic Auth with a username and password in the `accept.json` file:
```json
{
"private": [
{
...,
"auth": {
"scheme": "basic",
"username": "${BITBUCKET_USERNAME}",
"password": "${BITBUCKET_PASSWORD}"
}
},
...
]
}
```
For Artifactory, the auth method is configured in the `.env` file by default:
```shell
# The URL to your Artifactory instance
# If not using basic auth, this will only be "<yourArtifactoryHost>/artifactory"
ARTIFACTORY_URL=<username>:<password>@<yourArtifactoryHost>/artifactory
```
For GitHub, the auth method is part of the `origin` field in the `accept.json` file:
```json
{
"private": [
{
...,
"origin": "https://${GITHUB_TOKEN}@${GITHUB_API}"
},
...
]
}
```
You can override the authentication method. Valid values for `scheme` are `bearer`, `token`, and `basic`, which set the Authorization header to `Bearer`, `Token`, and `Basic` respectively. If a bearer token is preferred, the `accept.json` file can be configured as such:
```json
{
"private": [
{
...,
"auth": {
"scheme": "bearer",
"token": "${BEARER_TOKEN}"
}
},
...
]
}
```
Note that you must set this for every individual object in the `private` array.
If `scheme` is `bearer` or `token`, you must provide a `token`, and if `scheme` is `basic`, you must provide a `username` and `password`.
This overrides any other configured authentication method, for example, setting the token in the `origin` field, or in the `.env` file.
{% hint style="info" %}
Snyk Broker does not support authentication with mTLS method.
{% endhint %}
---
# Source: https://docs.snyk.io/implementation-and-setup/enterprise-setup/single-sign-on-sso-for-authentication-to-snyk/choose-a-provisioning-option.md
# Choose a provisioning option
{% hint style="warning" %}
Contact your Snyk account team or Snyk Support to turn on custom mapping once you have completed the setup steps.
{% endhint %}
Determine how new users in your company will get access to Snyk:
* [Open to all](#open-to-all)
* [Invitation required](#invitation-required)
* [Custom mapping](#custom-mapping)
## Open to all
With the Open to all option, all users have automatic access to Snyk by using SSO to log in. They have access to all Organizations in your selected Group, and their accounts are all provisioned with the same role. There are two options:
* The **Org** **administrator** role allows all new users to manage other Org admins and collaborators, view Group reports, and work with Organizations within the Group as well as perform non-administrative tasks in the Organization.
* The **Org** **collaborator** role can perform non-administrative tasks in the Organization.
Let Snyk Support know whether new users will have the administrator role or the collaborator role for the Organization. The selected role applies for all users. For details, see [Pre-defined roles](https://docs.snyk.io/snyk-platform-administration/user-roles/pre-defined-roles).
## Invitation required
With the invitation required or Group Member option, admins can invite users or users can request access to an Organization.
There are two ways to invite users to Organizations: invite members manually (see [Manage users in Organizations](https://docs.snyk.io/snyk-platform-administration/groups-and-organizations/organizations/manage-users-in-organizations)), or automate the process using the Snyk [API Invite users endpoint](https://docs.snyk.io/snyk-api/reference/organizations-v1#org-orgid-invite).
If users who have not been invited use SSO to log in, they will gain access to Snyk, but they will not see any Projects until an admin invites them or manually adds them to the applicable Organizations. You can provide a list of Organizations with the appropriate contact person so that new users can [request access](https://docs.snyk.io/snyk-platform-administration/groups-and-organizations/organizations/requests-for-access-to-an-organization).
## Custom mapping
{% hint style="info" %}
**Feature availability**\
Custom Mapping is available only for Enterprise plans. For more information, see [plans and pricing](https://snyk.io/plans/).
{% endhint %}
You can provision user accounts with customized rules using the [Custom Mapping Option](https://docs.snyk.io/implementation-and-setup/enterprise-setup/single-sign-on-sso-for-authentication-to-snyk/custom-mapping).
You can configure SSO differently for each of your different Groups. You can also map users to a specific Organization and role assignment based on information from the identity provider.
---
# Source: https://docs.snyk.io/implementation-and-setup/enterprise-setup/service-accounts/choose-a-service-account-type-to-use-with-snyk-apis.md
# Choose a service account type to use with Snyk APIs
For certain Snyk API calls you must authenticate using an Organization or Group service account.
For example, use a Group service account token to call Group API endpoints and Organization API endpoints for all Organizations in the Group.
The Snyk API uses the same roles as the Snyk Web UI applications. Each Snyk action requires a set of user permissions to be present. For details, see [User roles](https://docs.snyk.io/snyk-platform-administration/user-roles).
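As an illustration, a single Group service account token can authorize both a Group-level call and an Organization-level call within that Group. The IDs and version dates below are placeholders; the `Authorization: token` header is the Snyk API scheme.

```shell
#!/bin/sh
# Sketch: use one Group service account token for a Group-level endpoint
# and an Organization-level endpoint in the same Group. IDs and version
# dates are placeholders.
GROUP_ID="example-group-id"
ORG_ID="example-org-id"
API_VERSION="2023-09-07"
GROUP_URL="https://api.snyk.io/rest/groups/${GROUP_ID}/service_accounts?version=${API_VERSION}"
ORG_URL="https://api.snyk.io/rest/orgs/${ORG_ID}/projects?version=${API_VERSION}"

# Only call the API when a token is available.
if [ -n "${SNYK_GROUP_SA_TOKEN:-}" ]; then
  curl -s -H "Authorization: token ${SNYK_GROUP_SA_TOKEN}" "$GROUP_URL"
  curl -s -H "Authorization: token ${SNYK_GROUP_SA_TOKEN}" "$ORG_URL"
fi
echo "$GROUP_URL"
```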
---
# Source: https://docs.snyk.io/implementation-and-setup/enterprise-implementation-guide/phase-1-discovery-and-planning/choose-rollout-integrations.md
# Source: https://docs.snyk.io/implementation-and-setup/team-implementation-guide/phase-1-discovery-and-planning/choose-rollout-integrations.md
# Choose rollout integrations
## **SDLC integration points**
Snyk offers many integrations that fit into every stage of the SDLC.
Many businesses typically roll out automated solutions first, then slowly introduce tools to enable the developers. In addition, gating features are gradually turned on over a period of time to minimize disruption.
{% hint style="info" %}
As using multiple integrations can result in duplicate reporting of issues, you do not initially need to implement more than one integration type. For example, you can start by importing everything with Git repositories, then later use the CI/CD view for fine-grained detail (potentially removing the source control integration if both views are not desired).
{% endhint %}
## Integration types
Below are typical early integrations.
### Source Code Management (SCM) integrations
Integrations with popular version control platforms like GitHub, GitLab, Azure Repos, and Bitbucket bring Snyk security checks into the code review process. This ensures that potential vulnerabilities are identified and addressed before the code is merged into the main branch. Important features include:
* Daily testing/monitoring of a specified branch (typically a "development" branch).
* (Optional) Pull request/merge request checks against any branch of the repository.
* (Optional) Automated dependency upgrades and automated security fix upgrades using pull requests.
Advantages:
* Visibility into repository security posture
* Automatic Scan on code change
* Immediate feedback on issues for the developer
* Onboarding of repositories can be configured using the UI
* Supports Cloud Repositories on the Team plan
For more details, see [Git repositories (SCMs)](https://docs.snyk.io/developer-tools/scm-integrations/organization-level-integrations).
{% hint style="info" %}
If you have a non-cloud-facing Git SCM or run your own instance of a Git SCM:
* Consider deploying a [Snyk Broker](https://docs.snyk.io/implementation-and-setup/enterprise-setup/snyk-broker) for Snyk to communicate with your repositories, which would also require a Snyk Enterprise Plan.
* Enterprise customers can enable and manage Snyk Broker using the API.
[Paid services](https://docs.snyk.io/snyk-data-and-governance/snyk-terms-of-support-and-services-glossary) can be engaged to assist in Broker deployments.
{% endhint %}
### Continuous Integration/Continuous Deployment (CI/CD) pipeline integrations
Integrating Snyk into CI/CD pipelines, such as Jenkins, Travis CI, or CircleCI, automates security checks during the build and deployment process. This ensures that vulnerabilities are detected early in the software development lifecycle and prevents their propagation into production. Typical features include:
* (Optional) Ability to passively monitor results during build and view results in Snyk
* (Optional) Ability to test and potentially break the build based on criteria you specified
* Integration can be achieved with specific Marketplace plugins or more generally, with the CLI as part of your pipeline script.
Advantages:
* Assess local code vulnerabilities
* Full control over testing (which tests to run, where in the build script)
* Can automate using CI/CD
For more details, see [Snyk CI/CD integrations](https://docs.snyk.io/developer-tools/snyk-ci-cd-integrations).
### Integrated Development Environment (IDE) integrations
IDE integrations like Visual Studio Code, IntelliJ IDEA, and Eclipse allow developers to access Snyk's security features directly within their coding environment. This enables real-time scanning and issue remediation as developers write code at the earliest possible stages.
For more details, see [Use Snyk in your IDE](https://docs.snyk.io/developer-tools/snyk-ide-plugins-and-extensions).
## Considerations for import strategies
| Project import strategy | Considerations | Advantages | Disadvantages |
| --- | --- | --- | --- |
| CLI (automated in CI/CD) | Must be configured for each application within CI/CD. | Can select what to test and when (for example, which package managers, where in the process, which language to analyze). | May need development effort for integration. Requires configuration per application. |
| CLI (run locally by user) | User can use the CLI to perform testing locally while working on an application; very configurable per scan type. | Local use case. | Not meant for visibility or automation. Can require buildable code or dependencies to be installed (for example, Gradle without a lockfile, Scala). |
| Git code repository integration | Onboarding and daily monitoring: rapid vulnerability assessment across the application portfolio. | Continuous monitoring of repositories (even when you are not working on them). Centralized visibility for teams. Monitors the specified branch. Code does not need to be built. Can be initiated via the UI. | Some languages/package managers have better resolution utilizing the CLI (Gradle without a lockfile, Scala). |
| Pull request (PR)/merge request (MR) scanning | Immediate feedback on introduced issues on the PR/MR against any branch of the repository. | Configurable rules for pass/fail. | |
## Additional considerations
### Infrastructure as Code
For Snyk Infrastructure as Code, it is common that your Terraform or YAML configuration files are held in your SCM, but they may be in a separate area or repository. As a result, consider if there are other areas you need to import. You may also want to integrate with Terraform Cloud (if applicable) to enable Snyk tests as part of your "Terraform run" processes.
For complex environments, modules, and highly templated implementations, utilizing the CLI on your Terraform Plan file may provide the best results.
### CR (Container Registries)
Snyk also integrates with various [Container Registries](https://docs.snyk.io/scan-with-snyk/snyk-container/container-registry-integrations) to enable you to import and monitor your containers for vulnerabilities. Snyk tests the containers you have imported for any known security vulnerabilities found at a frequency you control.
---
# Source: https://docs.snyk.io/developer-tools/snyk-ci-cd-integrations/snyk-ci-cd-integration-deployment-and-strategies/ci-cd-adoption-and-deployment.md
# CI/CD adoption and deployment
When deciding to use a Snyk integration, compare the advantages of source control management (SCM) integrations and CI/CD integrations.
## Typical stages in adopting CI/CD Integration
Developer teams typically adopt Snyk in the following stages (example commands are Snyk Open Source):
1. [Expose vulnerabilities](#stage-1-expose-vulnerabilities-snyk-monitor) (`snyk monitor`)
2. [Use Snyk as a gatekeeper](#stage-2-use-snyk-as-a-gatekeeper-snyk-test) (`snyk test`)
3. [Continuous monitoring](#stage-3-continuous-monitoring-snyk-test-and-snyk-monitor) (`snyk test` and `snyk monitor`)
### Stage 1: Expose vulnerabilities (`snyk monitor`)
A typical approach is to use Snyk results to expose vulnerabilities during the development process. This increases visibility of vulnerabilities among the members of your team.
When you first implement Snyk in your pipeline, Snyk recommends using only the `snyk monitor` command. If you use one of the Snyk CI plugins, Snyk recommends that you configure the plugin to NOT fail the build.
This is because all Projects have vulnerabilities, and after you set Snyk to fail the build, every build fails because of Snyk. Your team can quickly become overwhelmed by failure messages.
Using `snyk monitor` to expose results provides information without disrupting processes.
For information about `snyk monitor`, see the [`monitor` command help](https://docs.snyk.io/developer-tools/snyk-cli/commands/monitor).
### Stage 2: Use Snyk as a gatekeeper (`snyk test`)
Using Snyk as a gatekeeper prevents the introduction of new vulnerabilities (sometimes referred to as "stopping the bleeding").
After your teams understand the vulnerabilities in their applications and develop a process for fixing them early in the development cycle, you can configure Snyk to fail your builds to prevent introducing vulnerabilities into your applications.
Add `snyk test` to your build or enable the fail functionality to make Snyk fail your builds, providing the results output to the console. Your developers or DevOps teams can use the results to decide whether to stop or continue the build.
For information about `snyk test`, see the [`test` command help](https://docs.snyk.io/developer-tools/snyk-cli/commands/test).
### Stage 3: Continuous monitoring (`snyk test` and `snyk monitor`)
After you configure Snyk to fail the build when vulnerabilities are detected, you can configure Snyk to send a snapshot of your Project's successful builds to Snyk for ongoing monitoring.
To do this, configure your pipeline to run `snyk monitor` if your `snyk test` returns a successful exit code.
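In a shell-scripted pipeline, this stage can be sketched as one step that relies on the exit code of `snyk test`: `&&` runs `snyk monitor` only when the test passes. The `SNYK_TOKEN` guard is just to keep the sketch runnable without credentials.

```shell
#!/bin/sh
# Sketch: run the gatekeeper test, and send a monitoring snapshot only
# when it passes ("&&" short-circuits on a non-zero exit code).
if [ -n "${SNYK_TOKEN:-}" ]; then
  snyk test && snyk monitor
fi
```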
## CI/CD deployment methods
{% hint style="info" %}
All of these methods provide the same results, as they all rely on the same Snyk engine. Thus, the same arguments or options apply regardless of the deployment method you select.
{% endhint %}
There are various ways to configure Snyk within your pipeline. Choose a method depending on your environment and preferences. You can expect all methods to lead to a successful run.
### **Use Snyk native plugins**
Snyk native plugins are available for most common CI/CD tools. You can use these plugins to set up and get started. The plugins include the most common parameters and options in the user interface.
### **Deploy the Snyk CLI using the npm method**
Follow steps similar to those for [installing the CLI](https://docs.snyk.io/developer-tools/snyk-cli/install-or-update-the-snyk-cli) locally. You must be able to run an npm command in the pipeline script. This method has the advantage of completely aligning with the CLI experience so that you can easily troubleshoot and configure.
### **Deploy the Snyk CLI binary version**
The advantage of the binary setup is that it has no dependency on the local environment. For example, the binary setup is useful if you cannot run an npm command in your pipeline.
CLI binaries are available on the [CLI GitHub repository](https://github.com/snyk/cli/tags).
Snyk has Linux, Windows, and other versions.
### **Deploy a Snyk container**
You may also deploy Snyk in your pipeline using one of the Snyk images in [Docker Hub](https://hub.docker.com/r/snyk/snyk).
## Examples of Snyk CI/CD Integrations
This repository shows some examples of binary and npm integrations for various CI/CD tools: [CI/CD examples](https://github.com/snyk-labs/snyk-cicd-integration-examples).
---
# Source: https://docs.snyk.io/developer-tools/snyk-ci-cd-integrations/snyk-ci-cd-integration-deployment-and-strategies/ci-cd-setup.md
# CI/CD setup
## Prerequisites for CI/CD
To configure Snyk to run in a pipeline, retrieve key configuration inputs from your Snyk account.
### Target Organization
When you run Snyk in your CI/CD platform, you typically want to post the test results to Snyk for review and ongoing monitoring.
If you do not define a target Organization, Snyk uses the default Organization for your authentication token:
* For user accounts, this is the user's preferred Organization, configurable in the user's settings.
* For Organization service accounts, this is the Organization in which the account was created.
You can define the target Organization in the Snyk CLI by using the `--org` CLI option and either the `orgslugname` or the Organization ID:
* You can define the target Organization using its `orgslugname` as displayed in the address bar of the browser in the Snyk UI.
* Alternatively, you can define the target Organization using its Organization ID, found on the Organization settings page.
For more information, see [How to select the Organization to use in the CLI.](https://docs.snyk.io/developer-tools/snyk-cli/scan-and-maintain-projects-using-the-cli/how-to-select-the-organization-to-use-in-the-cli)
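A minimal sketch of pinning the target Organization in a CI run with `--org`; `my-org-slug` is a placeholder, and an Organization ID (UUID) works in its place. The `SNYK_TOKEN` guard keeps the sketch runnable without credentials.

```shell
#!/bin/sh
# Sketch: select the target Organization for a CI run.
# "my-org-slug" is a placeholder; an Organization ID also works.
ORG="my-org-slug"
if [ -n "${SNYK_TOKEN:-}" ]; then
  snyk test --org="$ORG"
fi
```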
### Snyk authentication
For instructions on authenticating with Snyk, see [Authenticate the CLI with your account](https://docs.snyk.io/developer-tools/snyk-cli/authenticate-to-use-the-cli).
## Setting up Snyk to run in a pipeline
Snyk supports the following approaches to add tests to a build pipeline:
* **Snyk integration plugins**: Snyk provides pre-built plugins for several CI servers, including [Jenkins](https://docs.snyk.io/developer-tools/snyk-ci-cd-integrations/jenkins-plugin-integration-with-snyk), [Team City](https://docs.snyk.io/developer-tools/snyk-ci-cd-integrations/teamcity-jetbrains-integration-using-the-snyk-security-plugin), [Bitbucket Pipelines](https://docs.snyk.io/developer-tools/snyk-ci-cd-integrations/bitbucket-pipelines-integration-using-a-snyk-pipe), and [Azure Pipelines](https://docs.snyk.io/developer-tools/snyk-ci-cd-integrations/azure-pipelines-integration).
* **Snyk CLI:** Teams with more complex workflows, or using a build system without a Snyk pre-built plugin, can use the Snyk CLI in CI/CD setups. For more information, see [Snyk test and snyk monitor in CI/CD integration](https://docs.snyk.io/developer-tools/snyk-ci-cd-integrations/snyk-ci-cd-integration-deployment-and-strategies/snyk-test-and-snyk-monitor-in-ci-cd-integration).
* **Snyk API**: For teams with complex requirements, Snyk provides an API, which you can use to automate functions including initiating scans, onboarding new Projects, and testing arbitrary libraries. See the [Snyk API documentation](https://docs.snyk.io/snyk-api/snyk-api) for details.
## Setting up CI/CD using Snyk CLI
The Snyk CLI is a Node.js application that can be scripted directly by developers for easy integration into most CI/CD environments. The Snyk CLI is available as an npm application, pre-packaged binary, or container image. For more information, see [Install or update the Snyk CLI](https://docs.snyk.io/developer-tools/snyk-cli/install-or-update-the-snyk-cli).
The Snyk CLI can be configured to:
* Return non-zero error codes only when certain criteria are met, for example, exit with an error code only if vulnerabilities of high severity are present.
* Output all of its data into JSON for more flexibility.
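As a sketch of these two behaviors, the following CI step (CircleCI syntax; the surrounding job context is assumed) gates the build on high-severity issues and keeps the full results as JSON. `--severity-threshold` and `--json-file-output` are documented Snyk CLI options:

```yaml
# Sketch: exit non-zero only when high- or critical-severity issues are found,
# and write the complete results as JSON for later processing.
- run:
    name: Snyk test (high-severity gate)
    command: snyk test --severity-threshold=high --json-file-output=snyk-results.json
```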
## Configure your continuous integration
To continuously avoid known vulnerabilities in your dependencies, integrate Snyk into your continuous integration (build) system. In addition to this documentation, see the [integration configuration examples](https://github.com/snyk-labs/snyk-cicd-integration-examples) in the Snyk Labs GitHub repository.
### Set up automatic monitoring
If you monitor a Project with Snyk, you are notified if the dependencies in your Project are affected by newly disclosed vulnerabilities. To ensure the list of dependencies Snyk has for your Open Source Project is up-to-date, refresh the list continuously by running `snyk monitor` in your deployment process. Configure your environment to include the `SNYK_TOKEN` environment variable. You can [find your API token](https://docs.snyk.io/snyk-api/authentication-for-api) in your Snyk account settings.
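For example, a deploy-stage step might look like the following sketch (CircleCI syntax; `SNYK_TOKEN` is assumed to be configured as a CI secret in the job environment):

```yaml
# Sketch: refresh the dependency snapshot that Snyk monitors after each deploy.
# SNYK_TOKEN must be present in the environment; never commit it to source control.
- run:
    name: Refresh Snyk monitoring snapshot
    command: snyk monitor
```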
### API token configuration
Ensure you do not check your personal Snyk API token into source control, to avoid exposing it to others. Instead, use CI environment variables to configure your token.
See the guidance for how to do this in the following documentation:
* [Travis](https://docs.travis-ci.com/user/environment-variables/)
* [Circle CI](https://circleci.com/docs/set-environment-variable/)
* [Codeship Basic](https://docs.cloudbees.com/docs/cloudbees-codeship/latest/basic-builds-and-configuration/set-environment-variables), [Codeship Pro](https://docs.cloudbees.com/docs/cloudbees-codeship/latest/pro-builds-and-configuration/environment-variables)
You can find additional documentation through a web search for `setting up env variables in CI`.
---
# Source: https://docs.snyk.io/developer-tools/snyk-ci-cd-integrations/snyk-ci-cd-integration-deployment-and-strategies/ci-cd-troubleshooting-and-resources.md
# CI/CD troubleshooting and resources
This page provides a few tips to help troubleshoot or scale CI/CD integrations.
### Step 1: Try to replicate with the Snyk CLI
If the pipeline runs the same Snyk engine as the CLI, clone the Project and scan it locally with the Snyk CLI.
Try various CLI options, and use the CLI to find and fix known vulnerabilities the same way you run it in the pipeline. For more information, see the [CLI documentation](https://docs.snyk.io/developer-tools/snyk-cli).
### Step 2: Get logs
If you can replicate the issue using the Command Line Interface (CLI) and the problem persists, consult the [Debugging the Snyk CLI](https://docs.snyk.io/developer-tools/snyk-cli/debugging-the-snyk-cli) troubleshooting guidelines for capturing logs in debug mode.
### Step 3: Use the CLI instead of the plugin
Try replacing the native plugin with the CLI by installing the CLI. See [Install the Snyk CLI](https://docs.snyk.io/developer-tools/snyk-cli/install-or-update-the-snyk-cli) for instructions.
{% hint style="info" %}
The following repository provides some examples of binary and npm integrations for various CI/CD tools: [GitHub CI/CD examples](https://github.com/snyk-labs/snyk-cicd-integration-examples).
To learn more about CI/CD, see the article [What is CI/CD? CI/CD Pipeline and Tools Explained](https://snyk.io/learn/what-is-ci-cd-pipeline-and-tools-explained/).
{% endhint %}
---
# Source: https://docs.snyk.io/developer-tools/snyk-ci-cd-integrations/circleci-integration-using-a-snyk-orb.md
# CircleCI integration using a Snyk Orb
Snyk integrates with [CircleCI](https://circleci.com) using a Snyk Orb, seamlessly scanning your application dependencies and Docker images for open-source security vulnerabilities as part of the CI/CD workflow.
CircleCI enables users to easily create CI/CD workflows using a group of ready-to-use commands ([Orbs](https://circleci.com/orbs/)) that can be added to your configuration file.
With the Snyk Orb, you can quickly add Snyk scanning to your CI/CD to test and monitor for open-source vulnerabilities based on your configurations. Results are then displayed on the CircleCI output view and can also be monitored on [snyk.io](http://app.snyk.io).
## Implement the Snyk Orb
Refer to the following information about the Snyk Orb to get started with CircleCI, from implementation to a green build with Snyk.
* [Snyk Circle CI README](https://circleci.com/orbs/registry/orb/snyk/snyk) - includes all the information that you need to set up your CI/CD with Snyk, including a list of parameters and samples
* [Adding application and image scans to your CI/CD pipeline (CircleCI tutorial)](https://circleci.com/blog/adding-application-and-image-scanning-to-your-cicd-pipeline/) - discusses how to set up a secure pipeline with the Snyk Orb
## How CircleCI integration works
After you add a Project to CircleCI and add the Snyk Orb to the configuration file, every time a build runs, the Snyk Orb is also used and performs the following actions:
### Scan
1. Scans app dependencies or container images for vulnerabilities or open-source license issues, and lists the vulnerabilities and issues.
2. If Snyk finds vulnerabilities, it does one of the following, based on your configuration:
* Fails the build.
* Lets the build complete.
### **Monitor**
Optionally, if the build completes successfully and **MONITOR** is set to **True** in the Snyk step, then Snyk saves a snapshot of the Project dependencies for monitoring. In the Snyk Web UI you can view the dependency tree displaying all of the issues, and you can receive alerts for new issues found in the existing app version.
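As a sketch, the configuration below enables monitoring with the Snyk orb. The `monitor-on-build` and `fail-on-issues` parameter names are taken from the Snyk orb README and should be checked against the orb version you use:

```yaml
version: 2.1
orbs:
  snyk: snyk/snyk@2.1.0
jobs:
  build:
    docker:
      - image: cimg/node:lts
    steps:
      - checkout
      - run: npm ci
      # Scan dependencies; on a successful build, save a snapshot
      # so Snyk can alert on newly disclosed vulnerabilities.
      - snyk/scan:
          monitor-on-build: true
          fail-on-issues: true
workflows:
  build_and_scan:
    jobs:
      - build
```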
## **Prerequisites for implementing CircleCI integration**
1. Create a Snyk account and retrieve the **Snyk API token** from your **Account settings**.
2. Import the relevant repository into CircleCI.
3. Go to `Settings -> Security -> Orb security settings` and ensure you allow `opt-in to third party Orbs`.
4. Ensure your configuration (`config.yml`) file follows version 2.1.
5. Add the required environment variables to CircleCI, including the Snyk PAT or API token as `SNYK_TOKEN`.
## Getting Snyk Orb details from the CircleCI registry
On the [Orbs registry](https://circleci.com/orbs/registry/), CircleCI displays a list of available Orbs customized for you directly, similar to the list in the image that follows.
Snyk Orb for CircleCI
In this list, find and click the relevant **Snyk** line to view the Snyk Orb information with examples, parameters, and values:
{% hint style="info" %}
Be sure to use the latest version of the Snyk orb from the list.
{% endhint %}
Snyk Orb information
---
# Source: https://docs.snyk.io/developer-tools/snyk-ci-cd-integrations/snyk-images-guides-to-migration/circleci-migration.md
# CircleCI migration
This page explains how to transition away from affected jobs.
## Updating the Snyk orb and using `iac test`
Customers using the `scan-iac` job will need to switch to using `snyk/scan` with the `iac test` command. For an example, see the [IaC scanning examples](https://github.com/snyk/snyk-orb/blob/v2.0.0/src/examples/quickstart-iac-scanning.yml) in the snyk-orb repository.
It is important to update the Snyk orb to the latest version in your CircleCI config files. At the time of writing, the latest Snyk orb is `snyk/snyk@2.1.0`.
## Use the Snyk orb to only install the Snyk CLI, and then run the Snyk CLI commands in your own steps
Instead of relying on predefined jobs, customers can use the Snyk orb to install the Snyk CLI and then run commands as their own steps. See this [install example](https://github.com/snyk/snyk-orb/blob/v2.0.0/src/examples/only-install.yml) in the snyk-orb repository.
To replace the `scan-iac` job, an example config could look like the following:
```yaml
version: 2.1
orbs:
  node: circleci/node@5
  snyk: snyk/snyk@2.1.0
jobs:
  snyk_scan:
    docker:
      - image: cimg/node:lts
    steps:
      - checkout
      - run: npm ci
      - snyk/install
      - run:
          command: snyk iac test
          name: Run iac test
```
## Direct CLI installation without using the Snyk orb
Alternatively, customers who prefer not to rely on the Snyk CircleCI orb or wish to have complete control over their pipelines can opt to install the Snyk CLI directly. An example follows:
```yaml
version: 2.1
jobs:
  snyk_scan:
    docker:
      - image: cimg/node:lts
    steps:
      - checkout
      - run: npm ci
      - run:
          name: Download Snyk CLI
          command: |
            curl -Lo snyk-linux https://downloads.snyk.io/cli/stable/snyk-linux
      - run:
          name: Download Snyk CLI SHA256 Checksum
          command: |
            curl -Lo snyk-linux.sha256 https://downloads.snyk.io/cli/stable/snyk-linux.sha256
      - run:
          name: Verify SHA256 Checksum
          command: |
            sha256sum -c snyk-linux.sha256
      - run:
          name: Install Snyk CLI
          command: |
            chmod +x snyk-linux
            ./snyk-linux --version
      - run:
          name: Run Snyk iac test
          command: |
            ./snyk-linux iac test
workflows:
  version: 2
  build_and_scan:
    jobs:
      - snyk_scan
```
---
# Source: https://docs.snyk.io/discover-snyk/snyk-learn/your-learning/claiming-cpe-credits-with-snyk-learn.md
# Claiming CPE Credits with Snyk Learn
Snyk Learn offers a range of security education lessons and learning paths designed to enhance your application security skills. While these lessons provide valuable knowledge, they do not automatically grant Continuing Professional Education (CPE) credits. To claim CPE credits for your participation in Snyk Learn lessons and learning paths, follow these steps.
{% hint style="info" %}
The specific process for claiming CPE credits may vary depending on your certifying organization.
{% endhint %}
1. Complete a Snyk Learn module.
1. Log in to your Snyk Learn account and finish a lesson or learning path.
2. Ensure you meet all completion criteria, such as passing the quiz and engaging with interactive content.
2. Document your learning activity. Record essential details of the completed module, including:
* Course title
* Provider name: Snyk
* Completion date
* Duration: Time spent on the module
3. Submit to your certification body.
1. Access the CPE submission portal of your certifying organization, such as ISC2 or ISACA.
2. Enter the recorded details and provide any required documentation, such as completion certificates (available for Learning Paths) or screenshots of the completed lesson in your Learning Progress dashboard or the lesson page, see examples below.
4. Provide additional information if requested. You may be asked to supply further details or respond to inquiries from your certifying body to verify your CPE claim.
### Examples
#### Learning path certificate
Navigate to the page for the Learning Path that you completed and click the **Download certificate** button.
A completed Learning Path with certificate
#### Lesson completion example
---
# Source: https://docs.snyk.io/implementation-and-setup/enterprise-setup/snyk-broker/classic-broker.md
# Classic Broker
This section provides information about the classic Snyk Broker:
* [Connections with Snyk Broker](https://docs.snyk.io/implementation-and-setup/enterprise-setup/snyk-broker/broker-inbound-and-outbound-connections-and-allowed-requests)
* [Prepare Snyk Broker for deployment](https://docs.snyk.io/implementation-and-setup/enterprise-setup/snyk-broker/classic-broker/prepare-snyk-broker-for-deployment)
* [Install and configure Snyk Broker](https://docs.snyk.io/implementation-and-setup/enterprise-setup/snyk-broker/classic-broker/install-and-configure-snyk-broker)
* [Clone an integration across your Snyk Organizations](https://docs.snyk.io/implementation-and-setup/enterprise-setup/snyk-broker/classic-broker/clone-an-integration-across-your-snyk-organizations)
* [Snyk Broker - Container Registry Agent](https://docs.snyk.io/implementation-and-setup/enterprise-setup/snyk-broker/snyk-broker-container-registry-agent)
* [Snyk Broker - Infrastructure as Code detection](https://docs.snyk.io/implementation-and-setup/enterprise-setup/snyk-broker/classic-broker/snyk-broker-infrastructure-as-code-detection)
* [Snyk Broker - Essentials](https://docs.snyk.io/implementation-and-setup/enterprise-setup/snyk-broker/using-snyk-essentials-with-snyk-broker)
---
# Source: https://docs.snyk.io/manage-risk/policies/assets-policies/use-cases-for-policies/classification-policy.md
# Classification policy
You can use the **Set Asset Class** action from the Policies view to classify the assets based on importance, where class A is the most important and class D is the least important.
You can set the asset class based on:
* the repository name
* the asset labels
{% hint style="info" %}
Snyk Essentials identifies GitHub and GitLab topics as asset labels.
{% endhint %}
Use the classification policy to give business context to your application. When you set up a classification policy, all your assets are automatically classified.
If you just started using the classification policy, the recommendation is to focus first on the Class D assets, since they are the least important.
The following example filters the assets that contain `sandbox`, `test`, and `to-delete` in their names. In Snyk Essentials, GitHub and GitLab topics are pulled in from the SCM integration and applied to repository assets, so if topics like `PCI-Compliance` have been added to repos in the SCM, Snyk can take those tags in Snyk Essentials and classify those assets as Class A.
Assets Policy - Setting up filters for a classification policy
After you set up the filters, you need to apply a Class D asset classification to those assets.
Assets Policy - Setting up actions for a classification policy
You can apply a similar pattern and create actions for Class A, B, and C assets, within the same policy.
Assets Policy - Setting up multiple actions for a classification policy
---
# Source: https://docs.snyk.io/integrations/snyk-studio-agentic-integrations/quickstart-guides-for-snyk-studio/claude-code-guide.md
# Claude Code guide
You can access Snyk Studio in Claude Code to secure code generated with agentic workflows through an LLM. This can be achieved in several ways. When you use it for the first time, the MCP server will ask for trust and trigger authentication if necessary.
## Prerequisite
Install Claude Code. For more details, visit the official [Claude Code - Quickstart](https://docs.anthropic.com/en/docs/claude-code/quickstart).
## Install Snyk Studio
Install Snyk Studio using the method that best suits your operating system and local development environment. Snyk recommends leveraging the ['single command install' using `npx`](#install-with-npx) . For different ways to install MCP servers into Claude Code, see Anthropic's [official documentation](https://docs.anthropic.com/en/docs/claude-code/mcp#installing-mcp-servers).
### Install with `npx`
Open a terminal window and paste the following command:
`npx -y snyk@latest mcp configure --tool=claude-cli`
This command:
* Downloads the latest version of Snyk's CLI.
* Sets up Snyk Studio within Claude Code.
* Configures Snyk Studio's Secure at inception directives within Claude Code's global rules file.
To verify installation, use the `/mcp` command within Claude:
Select **View Tools** to see all of the commands and tooling Snyk uses as part of its execution. The descriptions also include instructions specific to the LLM; these are capitalized to help you differentiate them. These tools include:
| Tool | Description |
| --------------------- | -------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- |
| `snyk_aibom` | Generates an AI Bill of Materials (AI-BOM) for Python software projects in CycloneDX v1.6 JSON format. This experimental feature analyzes local Python projects to identify AI models, datasets, tools, and other AI-related components. Requires an active internet connection and access to the experimental feature (available on request). The command must be run from within a Python project directory and requires the CLI from the preview release channel. |
| `snyk_auth` | Authenticate the user with Snyk. |
| `snyk_code_scan` | Performs Static Application Security Testing (SAST) directly from the Snyk MCP. It analyzes an application's source code to identify security vulnerabilities and weaknesses without executing the code. Supported languages: Apex, C/C++, Dart and Flutter, Elixir, Go, Groovy, Java and Kotlin, JavaScript, .NET, PHP, Python, Ruby, Rust, Scala, Swift and Objective-C, TypeScript, VB.NET. |
| `snyk_container_scan` | Scans container images for known vulnerabilities in OS packages and application dependencies. |
| `snyk_iac_scan` | Analyzes Infrastructure as Code (IaC) files for security misconfigurations. Supports Terraform (.tf, .tf.json, plan files), Kubernetes (YAML, JSON), AWS CloudFormation (YAML, JSON), Azure Resource Manager (ARM JSON), and Serverless Framework. |
| `snyk_logout` | Logs the Snyk MCP out of the current Snyk account by clearing the locally stored authentication token. |
| `snyk_sbom_scan` | Experimental command. Analyzes an existing SBOM file for known vulnerabilities in its open-source components. Requires components in SBOM to be identified using PackageURLs (purls). |
| `snyk_sca_scan` | Analyzes Projects for open-source vulnerabilities and license compliance issues by inspecting manifest files (for example package.json, pom.xml, requirements.txt) to understand dependencies and then queries the Snyk vulnerability database. |
| `snyk_send_feedback` | Can be used to send feedback to Snyk. |
| `snyk_trust` | Trusts a given folder to allow Snyk to scan it. |
| `snyk_version` | Displays the installed Snyk MCP version. |
### Optional: Adjust scan frequency
Snyk recommends you use Snyk Studio with the Secure at inception directives, but also provides a smart scan option that allows the LLM to determine when to call Snyk Studio. This option results in lower overall token usage and faster iterating, but it increases the risk of insecure code being added to your codebase. Expand the options below for instructions on adjusting directives at installation or after installation.
Adjust at install
To utilize smart-scan from install, add the following argument to the npx install command:
`npx -y snyk@latest mcp configure --tool=claude-cli --rule-type=smart-apply`
Adjust after install
The default ruleset frequency can be adjusted by editing the global `CLAUDE.md` file.
For reference, the following are the smart apply rules Snyk places in Claude's global rules file when prompted:
{% code overflow="wrap" %}
```
BEFORE declaring task complete: Run snyk_code_scan tool when a significant change has been made in first party code.
- This should only apply for Snyk-supported coding language
- If any security issues are found based on newly introduced or modified code or dependencies, attempt to fix the issues using the results context from Snyk.
- Rescan the code after fixing the issues to ensure that the issues were fixed and that there are no newly introduced issues.
- Repeat this process until no new issues are found.
```
{% endcode %}
### Alternate installation methods
Expand the relevant method below for installation instructions.
Install with Node.js and npx
Create or edit the MCP configuration file `~/.claude.json`.
If you have the Node.js `npx` executable installed in your environment, add the following JSON snippet to the file:
```
{
  "mcpServers": {
    "Snyk": {
      "type": "stdio",
      "command": "npx",
      "args": ["-y", "snyk@latest", "mcp", "-t", "stdio"],
      "env": {}
    }
  }
}
```
Install with Snyk CLI
Create or edit the MCP configuration file `~/.claude.json`.
If you have the Snyk CLI installed and accessible on your system path, include the following JSON snippet in the file. You might need to specify the full path to the Snyk CLI executable:
```
{
  "mcpServers": {
    "Snyk": {
      "type": "stdio",
      "command": "/absolute/path/to/snyk",
      "args": ["mcp", "-t", "stdio"],
      "env": {}
    }
  }
}
```
If the `snyk` command is not available, add it by following the instructions on the [Installing or updating the Snyk CLI](https://docs.snyk.io/developer-tools/snyk-cli/install-or-update-the-snyk-cli) page.
Install with Claude Code CLI commands
Run the Snyk MCP Server in `sse` transport mode using the Snyk CLI:
```
snyk mcp -t sse
```
Then run the Claude Code CLI for adding a new MCP server:
```
claude mcp add --transport sse snyk http://127.0.0.1:7695/sse
```
Watch this step-by-step [YouTube video](https://www.youtube.com/watch?v=qAcTPMXbfyU) showing how to configure the Snyk MCP Server with the Claude Code CLI.
If the `snyk` command is not available, add it by following the instructions on the [Installing or updating the Snyk CLI](https://docs.snyk.io/developer-tools/snyk-cli/install-or-update-the-snyk-cli) page.
The following example shows a Snyk MCP Server that was successfully configured and started.
For additional MCP configuration options on Claude Code and troubleshooting, visit the official [Claude Code MCP documentation](https://docs.anthropic.com/en/docs/claude-code/mcp).
### Authenticate the Snyk MCP Server
As a one-time setup, you need to authenticate and trust the current Project directory.
## Uninstall Snyk Studio
To remove Snyk Studio and its associated ruleset from your environment, use:
```
snyk mcp configure --rm --tool=claude-cli
```
## Secure at inception directives
To practice [Secure at Inception](https://docs.snyk.io/discover-snyk/getting-started/glossary#secure-at-inception), Snyk recommends adopting rules that align the LLM for secure code generation and workflows.
The following is a suggested instruction for creating project rules for use with [Claude Code md files](https://www.anthropic.com/engineering/claude-code-best-practices). Append the following contents to the rules file at `CLAUDE.md`:
{% code title="CLAUDE.md" overflow="wrap" %}
```
- Always run **snyk_code_scan** tool for new first party code that is generated in a Snyk-supported language.
- If any security issues are found based on newly introduced or modified code or dependencies, attempt to fix the issues using the results context from **Snyk**.
- Rescan the code after fixing the issues to ensure that the issues were fixed and that there are no newly introduced issues.
- Repeat this process until no new issues are found.
```
{% endcode %}
---
# Source: https://docs.snyk.io/developer-tools/snyk-cli/cli-commands-and-options-summary.md
# CLI commands and options summary
{% hint style="info" %}
This page only summarizes the CLI commands and the options for each command. For details, use the links in this summary to open the help docs page for the command you are using. The help docs pages are the same as the help in the CLI.
{% endhint %}
## Usage
`snyk [COMMAND] [SUBCOMMAND] [OPTIONS] [PACKAGE] [CONTEXT-SPECIFIC-OPTIONS]`
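For instance, in the hypothetical invocation below, `container` is the COMMAND, `test` is the SUBCOMMAND, `alpine:3.19` is the PACKAGE (here, a container image), and `--severity-threshold=high` is a context-specific option:

```
snyk container test alpine:3.19 --severity-threshold=high
```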
## Description
The Snyk CLI is a build-time tool to find and fix known vulnerabilities in your projects. For a more detailed description of Snyk CLI and Snyk, see [Snyk CLI](https://docs.snyk.io/developer-tools/snyk-cli). For an introduction on how to use the Snyk CLI, see [Getting started with the CLI](https://docs.snyk.io/developer-tools/snyk-cli/getting-started-with-the-snyk-cli).
## Available CLI commands
To learn more about each Snyk CLI command, use the `--help` option, for example, `snyk auth --help` or `snyk container --help`. Each command in this list is linked to the corresponding help page in these docs.
**Note:** Lists of all the options for Snyk CLI commands are on this page. The options are explained in detail in the help for each command.
### [`snyk auth`](https://docs.snyk.io/developer-tools/snyk-cli/commands/auth)
Authenticate Snyk CLI with a Snyk account.
### [`snyk config`](https://docs.snyk.io/developer-tools/snyk-cli/commands/config)
Manage Snyk CLI configuration.
### [`snyk config environment`](https://docs.snyk.io/developer-tools/snyk-cli/commands/config-environment)
Use to set your environment for the region before you run the `snyk auth` command.
### [`snyk test`](https://docs.snyk.io/developer-tools/snyk-cli/commands/test)
Test a Project for open-source vulnerabilities and license issues.
### [`snyk monitor`](https://docs.snyk.io/developer-tools/snyk-cli/commands/monitor)
Snapshot and continuously monitor a project for open-source vulnerabilities and license issues.
### [`snyk code`](https://docs.snyk.io/developer-tools/snyk-cli/commands/code)
Print the name of the `snyk code` subcommand, `snyk code test`.
### [`snyk code test`](https://docs.snyk.io/developer-tools/snyk-cli/commands/code-test)
Test source code for any known security issues (Static Application Security Testing).
### [`snyk container`](https://docs.snyk.io/developer-tools/snyk-cli/commands/container)
Print a list of the `snyk container` commands, `snyk container monitor` and `snyk container test`.
### [`snyk container monitor`](https://docs.snyk.io/developer-tools/snyk-cli/commands/container-monitor)
Capture the container image layers and dependencies and monitor for vulnerabilities on [snyk.io](https://snyk.io).
### [`snyk container SBOM`](https://docs.snyk.io/developer-tools/snyk-cli/commands/container-sbom)
Generate an SBOM for a container image.
### [`snyk container test`](https://docs.snyk.io/developer-tools/snyk-cli/commands/container-test)
Test container images for any known vulnerabilities.
### [`snyk iac`](https://docs.snyk.io/developer-tools/snyk-cli/commands/iac)
Print a list of the `snyk iac` commands: `snyk iac describe`, `snyk iac update-exclude-policy`, and `snyk iac test`.
### [`snyk iac test`](https://docs.snyk.io/developer-tools/snyk-cli/commands/iac-test)
Test IaC files for any known security issues.
### [`snyk iac describe`](https://docs.snyk.io/developer-tools/snyk-cli/commands/iac-describe)
Detect, track, and alert on unmanaged resources.
### [`snyk iac update-exclude-policy`](https://docs.snyk.io/developer-tools/snyk-cli/commands/iac-update-exclude-policy)
Generate exclude policy rules to be used by `snyk iac describe`.
### [`snyk ignore`](https://docs.snyk.io/developer-tools/snyk-cli/commands/ignore)
Modify the `.snyk` policy to ignore stated issues.
### [`snyk log4shell`](https://docs.snyk.io/developer-tools/snyk-cli/commands/log4shell)
Find Log4Shell vulnerability.
### [`snyk policy`](https://docs.snyk.io/developer-tools/snyk-cli/commands/policy)
Display the `.snyk` policy for a package.
### [`snyk sbom`](https://docs.snyk.io/developer-tools/snyk-cli/commands/sbom)
Generate an SBOM for a local software project in an ecosystem supported by Snyk.
### [`snyk sbom test`](https://docs.snyk.io/developer-tools/snyk-cli/commands/sbom-test)
Check an SBOM for vulnerabilities in open-source packages.
### [`snyk aibom`](https://docs.snyk.io/developer-tools/snyk-cli/commands/aibom)
Generates an AI-BOM for a local software project that is written in Python, to help you understand what AI models, datasets, tools, and so on are used in that project.
### [`snyk apps`](https://docs.snyk.io/snyk-api/using-specific-snyk-apis/snyk-apps-apis/create-a-snyk-app-using-the-snyk-cli)
Create a Snyk App using the Snyk CLI. For more information, see [Snyk Apps](https://docs.snyk.io/snyk-api/using-specific-snyk-apis/snyk-apps-apis).
## Subcommands of CLI commands
The following is a list of the sub-commands for Snyk CLI commands. Each sub-command is followed by the command(s) to which the sub-command applies. The commands are linked to their help docs. For details concerning each sub-command, see the help docs.
`get <KEY>`: subcommand of [`config`](https://docs.snyk.io/developer-tools/snyk-cli/commands/config)
`set <KEY>=<VALUE>`: subcommand of [`config`](https://docs.snyk.io/developer-tools/snyk-cli/commands/config)
`unset <KEY>`: subcommand of [`config`](https://docs.snyk.io/developer-tools/snyk-cli/commands/config)
`clear`: subcommand of [`config`](https://docs.snyk.io/developer-tools/snyk-cli/commands/config)
`environment`: subcommand of [`config`](https://docs.snyk.io/developer-tools/snyk-cli/commands/config)
## Configure the Snyk CLI
You can use environment variables to configure the Snyk CLI and also set variables to configure the Snyk CLI to connect with the Snyk API. See [Configure the Snyk CLI](https://docs.snyk.io/developer-tools/snyk-cli/configure-the-snyk-cli).
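A minimal sketch of environment-based configuration follows; the values are placeholders, and `SNYK_TOKEN` and `SNYK_CFG_ORG` are documented Snyk CLI environment variables:

```
# Placeholder values; in CI, set these through your secret store.
export SNYK_TOKEN=<your-api-token>    # authenticates the CLI
export SNYK_CFG_ORG=<your-org-id>     # default Organization for scans
```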
## Debug
See [Debugging the Snyk CLI](https://docs.snyk.io/developer-tools/snyk-cli/debugging-the-snyk-cli) for detailed information about the `-d` option.
## Exit codes for CLI commands
Exit codes for the `test` commands are all the same. See the exit codes in the following help docs:
* [`snyk test` exit codes](https://docs.snyk.io/developer-tools/commands/test#exit-codes)
* [`snyk container test` exit codes](https://docs.snyk.io/developer-tools/commands/container-test#exit-codes)
* [`snyk iac test` exit codes](https://docs.snyk.io/developer-tools/commands/iac-test#exit-codes)
* [`snyk code test` exit codes](https://docs.snyk.io/developer-tools/commands/code-test#exit-codes)
Additional CLI commands have exit codes as listed in the following help docs:
* [`snyk monitor` exit codes](https://docs.snyk.io/developer-tools/commands/monitor#exit-codes)
* [`snyk container monitor` exit codes](https://docs.snyk.io/developer-tools/commands/container-monitor#exit-codes)
* [`snyk iac describe` exit codes](https://docs.snyk.io/developer-tools/commands/iac-describe#exit-codes)
* [`snyk iac update-exclude-policy` exit codes](https://docs.snyk.io/developer-tools/commands/iac-update-exclude-policy#exit-codes)
* [`snyk log4shell` exit codes](https://docs.snyk.io/developer-tools/commands/log4shell#exit-codes)
* [`snyk sbom` exit codes](https://docs.snyk.io/developer-tools/commands/sbom#exit-codes)
* [`snyk sbom test` exit codes](https://docs.snyk.io/developer-tools/commands/sbom-test#exit-codes)
* [`snyk container sbom` exit codes](https://docs.snyk.io/developer-tools/commands/container-sbom#exit-codes)
## Options for multiple commands
Lists of the options for Snyk CLI commands follow. Each option is followed by the command(s) to which the option applies. The commands are linked to their help docs. For details concerning each option, see the [help docs](https://docs.snyk.io/developer-tools/snyk-cli/commands).
`--all-projects`: [`test`](https://docs.snyk.io/developer-tools/snyk-cli/commands/test), [`monitor`](https://docs.snyk.io/developer-tools/snyk-cli/commands/monitor), [`sbom`](https://docs.snyk.io/developer-tools/snyk-cli/commands/sbom)
`--fail-fast`: [`test`](https://docs.snyk.io/developer-tools/snyk-cli/commands/test), [`monitor`](https://docs.snyk.io/developer-tools/snyk-cli/commands/monitor)
`--detection-depth=<DEPTH>`: [`test`](https://docs.snyk.io/developer-tools/snyk-cli/commands/test), [`monitor`](https://docs.snyk.io/developer-tools/snyk-cli/commands/monitor), [`iac test`](https://docs.snyk.io/developer-tools/snyk-cli/commands/iac-test), [`sbom`](https://docs.snyk.io/developer-tools/snyk-cli/commands/sbom)
`--exclude=<NAME>[,<NAME>]...`: [`test`](https://docs.snyk.io/developer-tools/snyk-cli/commands/test), [`monitor`](https://docs.snyk.io/developer-tools/snyk-cli/commands/monitor), [`sbom`](https://docs.snyk.io/developer-tools/snyk-cli/commands/sbom)
`--prune-repeated-subdependencies`, `-p`: [`test`](https://docs.snyk.io/developer-tools/snyk-cli/commands/test), [`monitor`](https://docs.snyk.io/developer-tools/snyk-cli/commands/monitor), [`sbom`](https://docs.snyk.io/developer-tools/snyk-cli/commands/sbom)
`--print-deps`: [`test`](https://docs.snyk.io/developer-tools/snyk-cli/commands/test), [`monitor`](https://docs.snyk.io/developer-tools/snyk-cli/commands/monitor), [`container test`](https://docs.snyk.io/developer-tools/snyk-cli/commands/container-test)
`--remote-repo-url=`: [`test`](https://docs.snyk.io/developer-tools/snyk-cli/commands/test), [`monitor`](https://docs.snyk.io/developer-tools/snyk-cli/commands/monitor), [`code test`](https://docs.snyk.io/developer-tools/snyk-cli/commands/code-test), [`iac test`](https://docs.snyk.io/developer-tools/snyk-cli/commands/iac-test)
`--dev`: [`test`](https://docs.snyk.io/developer-tools/snyk-cli/commands/test), [`monitor`](https://docs.snyk.io/developer-tools/snyk-cli/commands/monitor), [`sbom`](https://docs.snyk.io/developer-tools/snyk-cli/commands/sbom)
`--org=`: [`test`](https://docs.snyk.io/developer-tools/snyk-cli/commands/test), [`monitor`](https://docs.snyk.io/developer-tools/snyk-cli/commands/monitor), [`code test`](https://docs.snyk.io/developer-tools/snyk-cli/commands/code-test), [`container test`](https://docs.snyk.io/developer-tools/snyk-cli/commands/container-test), [`container monitor`](https://docs.snyk.io/developer-tools/snyk-cli/commands/container-monitor), [`iac test`](https://docs.snyk.io/developer-tools/snyk-cli/commands/iac-test), [`iac describe`](https://docs.snyk.io/developer-tools/snyk-cli/commands/iac-describe), [`sbom`](https://docs.snyk.io/developer-tools/snyk-cli/commands/sbom), [`container sbom`](https://docs.snyk.io/developer-tools/snyk-cli/commands/container-sbom), [`aibom`](https://docs.snyk.io/developer-tools/snyk-cli/commands/aibom)
`--file=`: [`test`](https://docs.snyk.io/developer-tools/snyk-cli/commands/test), [`monitor`](https://docs.snyk.io/developer-tools/snyk-cli/commands/monitor), [`sbom test`](https://docs.snyk.io/developer-tools/snyk-cli/commands/sbom-test)
`--file=`: [`container test`](https://docs.snyk.io/developer-tools/snyk-cli/commands/container-test), [`container monitor`](https://docs.snyk.io/developer-tools/snyk-cli/commands/container-monitor), [`sbom test`](https://docs.snyk.io/developer-tools/snyk-cli/commands/sbom-test)
`--package-manager=`: [`test`](https://docs.snyk.io/developer-tools/snyk-cli/commands/test), [`monitor`](https://docs.snyk.io/developer-tools/snyk-cli/commands/monitor)
`--unmanaged`: [`test`](https://docs.snyk.io/developer-tools/snyk-cli/commands/test), [`monitor`](https://docs.snyk.io/developer-tools/snyk-cli/commands/monitor). See also [Options for scanning using `--unmanaged`](#options-for-scanning-using-unmanaged) and the [`sbom`](https://docs.snyk.io/developer-tools/snyk-cli/commands/sbom) command help for another use of this option.
`--ignore-policy`: [`test`](https://docs.snyk.io/developer-tools/snyk-cli/commands/test), [`monitor`](https://docs.snyk.io/developer-tools/snyk-cli/commands/monitor), [`iac test`](https://docs.snyk.io/developer-tools/snyk-cli/commands/iac-test), [`iac describe`](https://docs.snyk.io/developer-tools/snyk-cli/commands/iac-describe)
`--trust-policies`: [`test`](https://docs.snyk.io/developer-tools/snyk-cli/commands/test), [`monitor`](https://docs.snyk.io/developer-tools/snyk-cli/commands/monitor)
`--show-vulnerable-paths=`: [`test`](https://docs.snyk.io/developer-tools/snyk-cli/commands/test)
`--project-name=`: [`test`](https://docs.snyk.io/developer-tools/snyk-cli/commands/test), [`monitor`](https://docs.snyk.io/developer-tools/snyk-cli/commands/monitor), [`container test`](https://docs.snyk.io/developer-tools/snyk-cli/commands/container-test), [`container monitor`](https://docs.snyk.io/developer-tools/snyk-cli/commands/container-monitor)
`--target-reference=`: [`test`](https://docs.snyk.io/developer-tools/snyk-cli/commands/test), [`monitor`](https://docs.snyk.io/developer-tools/snyk-cli/commands/monitor), [`iac test`](https://docs.snyk.io/developer-tools/snyk-cli/commands/iac-test), [`container monitor`](https://docs.snyk.io/developer-tools/snyk-cli/commands/container-monitor)
`--policy-path=`: [`test`](https://docs.snyk.io/developer-tools/snyk-cli/commands/test), [`monitor`](https://docs.snyk.io/developer-tools/snyk-cli/commands/monitor), [`container test`](https://docs.snyk.io/developer-tools/snyk-cli/commands/container-test), [`container monitor`](https://docs.snyk.io/developer-tools/snyk-cli/commands/container-monitor), [`iac test`](https://docs.snyk.io/developer-tools/snyk-cli/commands/iac-test), [`iac describe`](https://docs.snyk.io/developer-tools/snyk-cli/commands/iac-describe), [`ignore`](https://docs.snyk.io/developer-tools/snyk-cli/commands/ignore)
`--json`: [`test`](https://docs.snyk.io/developer-tools/snyk-cli/commands/test), [`monitor`](https://docs.snyk.io/developer-tools/snyk-cli/commands/monitor), [`code test`](https://docs.snyk.io/developer-tools/snyk-cli/commands/code-test), [`container test`](https://docs.snyk.io/developer-tools/snyk-cli/commands/container-test), [`container monitor`](https://docs.snyk.io/developer-tools/snyk-cli/commands/container-monitor), [`iac test`](https://docs.snyk.io/developer-tools/snyk-cli/commands/iac-test), [`iac describe`](https://docs.snyk.io/developer-tools/snyk-cli/commands/iac-describe), [`sbom test`](https://docs.snyk.io/developer-tools/snyk-cli/commands/sbom-test)
`--json-file-output=`: [`test`](https://docs.snyk.io/developer-tools/snyk-cli/commands/test), [`code test`](https://docs.snyk.io/developer-tools/snyk-cli/commands/code-test), [`container test`](https://docs.snyk.io/developer-tools/snyk-cli/commands/container-test), [`iac test`](https://docs.snyk.io/developer-tools/snyk-cli/commands/iac-test), [`sbom`](https://docs.snyk.io/developer-tools/snyk-cli/commands/sbom)
`--sarif`: [`test`](https://docs.snyk.io/developer-tools/snyk-cli/commands/test), [`code test`](https://docs.snyk.io/developer-tools/snyk-cli/commands/code-test), [`container test`](https://docs.snyk.io/developer-tools/snyk-cli/commands/container-test), [`iac test`](https://docs.snyk.io/developer-tools/snyk-cli/commands/iac-test)
`--sarif-file-output=`: [`test`](https://docs.snyk.io/developer-tools/snyk-cli/commands/test), [`code test`](https://docs.snyk.io/developer-tools/snyk-cli/commands/code-test), [`container test`](https://docs.snyk.io/developer-tools/snyk-cli/commands/container-test), [`iac test`](https://docs.snyk.io/developer-tools/snyk-cli/commands/iac-test)
`--severity-threshold=`: [`test`](https://docs.snyk.io/developer-tools/snyk-cli/commands/test), [`code test`](https://docs.snyk.io/developer-tools/snyk-cli/commands/code-test), [`container test`](https://docs.snyk.io/developer-tools/snyk-cli/commands/container-test), [`iac test`](https://docs.snyk.io/developer-tools/snyk-cli/commands/iac-test), [`sbom test`](https://docs.snyk.io/developer-tools/snyk-cli/commands/sbom-test)
`--fail-on=`: [`container test`](https://docs.snyk.io/developer-tools/snyk-cli/commands/container-test), [`test`](https://docs.snyk.io/developer-tools/snyk-cli/commands/test)
`--project-environment=`: [`monitor`](https://docs.snyk.io/developer-tools/snyk-cli/commands/monitor), [`container monitor`](https://docs.snyk.io/developer-tools/snyk-cli/commands/container-monitor), [`iac test`](https://docs.snyk.io/developer-tools/snyk-cli/commands/iac-test)
`--project-lifecycle=`: [`monitor`](https://docs.snyk.io/developer-tools/snyk-cli/commands/monitor), [`container monitor`](https://docs.snyk.io/developer-tools/snyk-cli/commands/container-monitor), [`iac test`](https://docs.snyk.io/developer-tools/snyk-cli/commands/iac-test)
`--project-business-criticality=`: [`monitor`](https://docs.snyk.io/developer-tools/snyk-cli/commands/monitor), [`container monitor`](https://docs.snyk.io/developer-tools/snyk-cli/commands/container-monitor), [`iac test`](https://docs.snyk.io/developer-tools/snyk-cli/commands/iac-test)
`--project-tags=`: [`monitor`](https://docs.snyk.io/developer-tools/snyk-cli/commands/monitor), [`container monitor`](https://docs.snyk.io/developer-tools/snyk-cli/commands/container-monitor), [`iac test`](https://docs.snyk.io/developer-tools/snyk-cli/commands/iac-test)
`--tags=`: [`monitor`](https://docs.snyk.io/developer-tools/snyk-cli/commands/monitor), [`container monitor`](https://docs.snyk.io/developer-tools/snyk-cli/commands/container-monitor)
`--reachability=`: [`test`](https://docs.snyk.io/developer-tools/snyk-cli/commands/test), [`monitor`](https://docs.snyk.io/developer-tools/snyk-cli/commands/monitor), [`sbom test`](https://docs.snyk.io/developer-tools/snyk-cli/commands/sbom-test)
`--source-dir`: [`test`](https://docs.snyk.io/developer-tools/snyk-cli/commands/test), [`monitor`](https://docs.snyk.io/developer-tools/snyk-cli/commands/monitor), [`sbom test`](https://docs.snyk.io/developer-tools/snyk-cli/commands/sbom-test)
`--reachability-filter=`: [`test`](https://docs.snyk.io/developer-tools/snyk-cli/commands/test), [`monitor`](https://docs.snyk.io/developer-tools/snyk-cli/commands/monitor), [`sbom test`](https://docs.snyk.io/developer-tools/snyk-cli/commands/sbom-test)
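Several of these multi-command options are commonly combined in a single invocation. The sketch below only assembles and prints the command line (a dry run, so nothing is scanned); the flag names are the documented options above, while the excluded directories and output path are illustrative values:

```shell
# Build a `snyk test` command line from several multi-command options.
cmd='snyk test --all-projects'
cmd="$cmd --exclude=fixtures,tests"             # illustrative directories to skip
cmd="$cmd --severity-threshold=high"            # report only high and critical issues
cmd="$cmd --json-file-output=snyk-results.json" # write JSON results to a file
echo "$cmd"   # dry run: print the command instead of executing it
```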
## `snyk aibom` command options
`--html`\
`--json-file-output`: [`snyk aibom`](https://docs.snyk.io/developer-tools/snyk-cli/commands/aibom)
## `snyk auth` command options
`--auth-type=`\
`--client-secret=`\
`--client-id=`: [`snyk auth`](https://docs.snyk.io/developer-tools/snyk-cli/commands/auth)
## `snyk code test` command options
`--include-ignores`: [`code test`](https://docs.snyk.io/developer-tools/snyk-cli/commands/code-test)
`--report`: [`code test`](https://docs.snyk.io/developer-tools/snyk-cli/commands/code-test)
`--target-name=`: [`code test`](https://docs.snyk.io/developer-tools/snyk-cli/commands/code-test)
## `snyk config environment` command option
`--no-check`: [`snyk config environment`](https://docs.snyk.io/developer-tools/snyk-cli/commands/config-environment)
## `snyk container` command options
`--app-vulns`: [`container test`](https://docs.snyk.io/developer-tools/snyk-cli/commands/container-test), [`container monitor`](https://docs.snyk.io/developer-tools/snyk-cli/commands/container-monitor)
`--exclude-app-vulns`: [`container test`](https://docs.snyk.io/developer-tools/snyk-cli/commands/container-test), [`container monitor`](https://docs.snyk.io/developer-tools/snyk-cli/commands/container-monitor), [`container sbom`](https://docs.snyk.io/developer-tools/snyk-cli/commands/container-sbom)
`--nested-jars-depth`: [`container test`](https://docs.snyk.io/developer-tools/snyk-cli/commands/container-test), [`container monitor`](https://docs.snyk.io/developer-tools/snyk-cli/commands/container-monitor)
`--exclude-base-image-vulns`: [`container test`](https://docs.snyk.io/developer-tools/snyk-cli/commands/container-test), [`container monitor`](https://docs.snyk.io/developer-tools/snyk-cli/commands/container-monitor)
`--platform=`: [`container test`](https://docs.snyk.io/developer-tools/snyk-cli/commands/container-test), [`container monitor`](https://docs.snyk.io/developer-tools/snyk-cli/commands/container-monitor), [`container sbom`](https://docs.snyk.io/developer-tools/snyk-cli/commands/container-sbom)
`--username=`: [`container test`](https://docs.snyk.io/developer-tools/snyk-cli/commands/container-test), [`container monitor`](https://docs.snyk.io/developer-tools/snyk-cli/commands/container-monitor)
`--password=`: [`container test`](https://docs.snyk.io/developer-tools/snyk-cli/commands/container-test), [`container monitor`](https://docs.snyk.io/developer-tools/snyk-cli/commands/container-monitor)
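As a sketch of how these container options combine, the snippet below assembles (but does not run) a scan of a hypothetical private image; the registry host, image tag, and credential environment variable names are all assumptions:

```shell
# Dry-run assembly of a `snyk container test` invocation for an arm64 image
# in a private registry; credentials come from (assumed) environment variables.
img='registry.example.com/app:1.2.3'
ctest="snyk container test $img --platform=linux/arm64"
ctest="$ctest --username=\"\$REGISTRY_USER\" --password=\"\$REGISTRY_PASS\""
ctest="$ctest --exclude-base-image-vulns"   # omit issues introduced by the base image
echo "$ctest"   # dry run: print the command instead of executing it
```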
## `snyk iac test` command options
`--scan=`: [`iac test`](https://docs.snyk.io/developer-tools/snyk-cli/commands/iac-test)
`--target-name=`: [`iac test`](https://docs.snyk.io/developer-tools/snyk-cli/commands/iac-test)
`--rules=`: [`iac test`](https://docs.snyk.io/developer-tools/snyk-cli/commands/iac-test)
`--var-file=`: [`iac test`](https://docs.snyk.io/developer-tools/snyk-cli/commands/iac-test)
`--report`: [`iac test`](https://docs.snyk.io/developer-tools/snyk-cli/commands/iac-test)
## `snyk iac describe` command options
`--from=`: [`iac describe`](https://docs.snyk.io/developer-tools/snyk-cli/commands/iac-describe)
`--to=`: [`iac describe`](https://docs.snyk.io/developer-tools/snyk-cli/commands/iac-describe)
`--service=`: [`iac describe`](https://docs.snyk.io/developer-tools/snyk-cli/commands/iac-describe)
`--quiet`: [`iac describe`](https://docs.snyk.io/developer-tools/snyk-cli/commands/iac-describe)
`--filter`: [`iac describe`](https://docs.snyk.io/developer-tools/snyk-cli/commands/iac-describe)
`--html`: [`iac describe`](https://docs.snyk.io/developer-tools/snyk-cli/commands/iac-describe)
`--html-file-output=`: [`iac describe`](https://docs.snyk.io/developer-tools/snyk-cli/commands/iac-describe)
`--fetch-tfstate-headers`: [`iac describe`](https://docs.snyk.io/developer-tools/snyk-cli/commands/iac-describe)
`--tfc-token`: [`iac describe`](https://docs.snyk.io/developer-tools/snyk-cli/commands/iac-describe)
`--tfc-endpoint`: [`iac describe`](https://docs.snyk.io/developer-tools/snyk-cli/commands/iac-describe)
`--tf-provider-version`: [`iac describe`](https://docs.snyk.io/developer-tools/snyk-cli/commands/iac-describe)
`--strict`: [`iac describe`](https://docs.snyk.io/developer-tools/snyk-cli/commands/iac-describe)
`--deep`: [`iac describe`](https://docs.snyk.io/developer-tools/snyk-cli/commands/iac-describe)
`--tf-lockfile`: [`iac describe`](https://docs.snyk.io/developer-tools/snyk-cli/commands/iac-describe)
`--config-dir`: [`iac describe`](https://docs.snyk.io/developer-tools/snyk-cli/commands/iac-describe)
## `snyk iac update-exclude-policy` command options
`--exclude-changed`: [`iac update-exclude-policy`](https://docs.snyk.io/developer-tools/snyk-cli/commands/iac-update-exclude-policy)
`--exclude-missing`: [`iac update-exclude-policy`](https://docs.snyk.io/developer-tools/snyk-cli/commands/iac-update-exclude-policy)
`--exclude-unmanaged`: [`iac update-exclude-policy`](https://docs.snyk.io/developer-tools/snyk-cli/commands/iac-update-exclude-policy)
## `snyk ignore` command options
`--id=`: [`ignore`](https://docs.snyk.io/developer-tools/snyk-cli/commands/ignore)
`--expiry=`: [`ignore`](https://docs.snyk.io/developer-tools/snyk-cli/commands/ignore)
`--reason=`: [`ignore`](https://docs.snyk.io/developer-tools/snyk-cli/commands/ignore)
`--path=`: [`ignore`](https://docs.snyk.io/developer-tools/snyk-cli/commands/ignore)
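These options are typically used together: an ignore needs an issue ID, and usually an expiry date and a recorded reason for audit purposes. The sketch below only prints the command it would run (a dry run); the issue ID, date, and reason are made-up examples:

```shell
# Dry-run assembly of a `snyk ignore` call that suppresses one issue until a
# fixed date and records why. SNYK-JS-EXAMPLE-1234567 is a made-up issue ID.
ig='snyk ignore --id=SNYK-JS-EXAMPLE-1234567'
ig="$ig --expiry=2026-01-31"
ig="$ig --reason='no upgrade path yet; risk accepted by security review'"
echo "$ig"   # dry run: print the command instead of executing it
```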
## `snyk sbom` and `snyk container sbom` command options
`--format=`: [`snyk sbom`](https://docs.snyk.io/developer-tools/snyk-cli/commands/sbom), [`snyk container sbom`](https://docs.snyk.io/developer-tools/snyk-cli/commands/container-sbom)
`--file=` or `--f=`: [`snyk sbom`](https://docs.snyk.io/developer-tools/snyk-cli/commands/sbom)
`[--name=