Black Duck Integration with Continuous Integration (CI) Tools
About Black Duck
Black Duck gives you visibility into and control over open source risks within your applications and containers. Black Duck allows you to scan applications and container images, identify all open source components, and detect any open source security vulnerabilities, compliance issues, or code-quality risks. By deploying Black Duck with any CI/CD integration, you can scan your cloud applications and images in your container registry, automate build scans in your CI pipeline, and stay notified on any security vulnerabilities or policy violations found in your open source code.
The Black Duck by Synopsys plugin for TFS and Azure DevOps allows automatic identification of open source security vulnerabilities during your application build process. The integration allows you to enforce policies configured in Black Duck, so you receive alerts and can fail builds when policies are violated.
Architecture
The following diagram shows the components in a Black Duck Binary Authorization Cloud Build setup:
The components are as follows:
- Cloud Source Repositories or another secure repository that contains the source code used to build a container image.
- Cloud Build, which runs builds and tests, and outputs the container image to a Container Registry or another software registry that stores your built images.
- The Synopsys Black Duck Cloud Build Scan, which is added to Cloud Build, runs when new image versions are built and creates an attestation if the image passes Black Duck policy.
- Container Analysis, which stores the attestations for Binary Authorization and the build records from Cloud Build.
- Binary Authorization, which enforces the policy requiring attestations by Black Duck before a container image can be deployed.
- Google Kubernetes Engine, which runs the deployed container images on Google Cloud Platform.
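To make the enforcement step above concrete, the sketch below shows the general shape of a Binary Authorization policy that requires an attestation before deployment. It is a minimal, hedged example: PROJECT_ID and the attestor name black-duck-attestor are placeholders, and your actual policy depends on how the Black Duck attestor was created. A policy like this can be applied with gcloud container binauthz policy import.

# Minimal Binary Authorization policy sketch (PROJECT_ID and attestor name are placeholders)
globalPolicyEvaluationMode: ENABLE
defaultAdmissionRule:
  evaluationMode: REQUIRE_ATTESTATION
  enforcementMode: ENFORCED_BLOCK_AND_AUDIT_LOG
  requireAttestationsBy:
  - projects/PROJECT_ID/attestors/black-duck-attestor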
How Black Duck Works
Black Duck Security Advisories help you avoid being caught off-guard by open source vulnerabilities, both in development and production. They also provide the critical data necessary to prioritize vulnerabilities for remediation, such as exploit information, remediation guidance, severity scoring, and call path analysis. Learn more about Black Duck’s vulnerability database.
TIMELY. Thousands of security feeds are monitored and enhanced to provide same-day notification of most vulnerabilities — weeks before they appear in the National Vulnerability Database.
ACCURATE. Our team of security experts reviews and verifies vulnerability data to ensure accurate reporting on vulnerability descriptions, severity, exploit risk, and affected versions.
ACTIONABLE. Mitigation and remediation guidance detailed by our teams helps you prioritize vulnerabilities, select the optimal patch or upgrade path, and identify evidence of attack or compromise.
AUTOMATED. Vulnerabilities are prioritized for remediation based on critical vulnerability data, such as severity, available solutions, exploitability, CWE, and call path analysis.
Black Duck Features
- Scan code to identify specific open source in use
- Automatically map known vulnerabilities to open source in use
- Triage — assess risk and prioritize vulnerabilities
- Schedule and track remediation
- Identify licenses and community activity
- The most comprehensive language coverage and development tool integrations
- Industry’s most complete open source KnowledgeBase
- Set policies in line with your business requirements
Black Duck Benefits
- Better manage security risks with open source within your code
- Understand what open source is in your code
- Better manage operational risks with open source within your code
- Better manage licensing risks with open source within your code
- Understand where open source code is located
- Understand what open source security, operational, and licensing risks are present
Black Duck Integration with CI Tools
Many CI tools are available on the market, but here we will discuss Azure DevOps. If you want to integrate Black Duck with any other tool, you can download the Black Duck integration artifacts from here.
Black Duck Integration with Azure DevOps
The Synopsys Detect for Azure DevOps plugin, formerly known as Black Duck Detect plugin for TFS/VSTS, is architected to seamlessly integrate Synopsys Detect with Azure DevOps build and release pipelines. Synopsys Detect makes it easier to set up and scan code bases using a variety of languages and package managers.
The Synopsys Detect plugin for Azure DevOps supports native scanning in your Azure DevOps environment to run Software Composition Analysis (SCA) on your code.
As a Synopsys and Azure DevOps user, you can use the Synopsys Detect Extension for Azure DevOps to:
- Run a component scan in an Azure DevOps job and create projects and releases in Black Duck through the Azure DevOps job.
- After a scan is complete, the results are available on the Black Duck server (for SCA).
Using the Synopsys Detect Extension for Azure DevOps together with Black Duck enables you to use Azure DevOps to automatically create Black Duck projects from your Azure DevOps projects.
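For example, a common way to do this (shown here as a hedged sketch, not as required configuration) is to pass Detect arguments that name the Black Duck project and version after Azure DevOps predefined variables. These would go in the Detect Arguments field described later in this article:

--detect.project.name=$(System.TeamProject)
--detect.project.version.name=$(Build.SourceBranchName)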
Invoking Synopsys Detect
Synopsys recommends invoking Synopsys Detect from the CI (build) pipeline. Scanning during CI enables Synopsys Detect to break your application build, which is effective for enforcing policies like preventing the use of disallowed or vulnerable components.
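As a rough illustration, the snippet below shows one way a CI build step can invoke Detect and fail on policy violations. This is a hedged sketch that calls the Detect script directly instead of using the extension; it assumes a Linux or macOS build agent and pipeline variables named BLACKDUCK_URL and BLACKDUCK_API_TOKEN, and the script URL and property names can vary by Detect version.

steps:
- bash: |
    # Download and run Synopsys Detect; fail the build on CRITICAL or BLOCKER policy violations
    bash <(curl -s -L https://detect.synopsys.com/detect.sh) \
      --blackduck.url="$(BLACKDUCK_URL)" \
      --blackduck.api.token="$(BLACKDUCK_API_TOKEN)" \
      --detect.policy.check.fail.on.severities=CRITICAL,BLOCKER
  displayName: 'Run Synopsys Detect (policy check)'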
Basic workflow
Using Synopsys Detect to analyze your code in Azure involves the following basic steps:
- Make sure you satisfy system and other requirements
- Download and configure the Synopsys Detect extension in Azure
- Configure build agent and pipeline
- Configure Black Duck connection
- Configure Synopsys Detect arguments
- Run pipeline and invoke scan
- Examine the analysis results
Requirements for Synopsys Detect in Azure DevOps
The following is a list of requirements for the Synopsys Detect integration with Azure DevOps.
- Black Duck server. For the supported versions of Black Duck, refer to Black Duck Release Compatibility.
- Black Duck API token to use with Azure.
- Azure DevOps Services, or Azure DevOps Server 17 or later.
- Java. OpenJDK versions 8 and 11 are supported. Other Java development kits may be compatible, but only OpenJDK is officially supported for Synopsys Detect.
- Internet access.
The Synopsys Detect Extension for Azure DevOps is supported on the same operating systems and browsers as Black Duck.
For scanning NuGet projects, verify that you have the NuGet tool installer set up in the build job definition. You can find it at https://docs.microsoft.com/en-us/vsts/build-release/tasks/tool/nuget?view=vsts.
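If your pipeline is defined in YAML, the NuGet tool installer is a standard Azure DevOps task. The sketch below is a minimal example; the version range shown is illustrative.

steps:
- task: NuGetToolInstaller@1
  displayName: 'Install NuGet'
  inputs:
    versionSpec: '5.x'   # illustrative version range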
You can get the Synopsys Detect for Azure DevOps plugin from the Visual Studio Marketplace.
Installing the Synopsys Detect for Azure DevOps Extension
From the Azure Pipelines page, add the Detect plug-in for Azure DevOps.
To install the Synopsys Detect extension for Azure DevOps:
- Click the plus sign (+) under Tasks for Agent job.
- Search for the Synopsys Detect plugin and click Add to add it to your pipeline.
Configuring and Running the Plugin
After you install the plugin, you configure it as a pipeline task.
Configure your Synopsys Detect for Azure DevOps plugin by adding configuration for your Black Duck server and adding Detect arguments.
Configuring the plugin
- Navigate to Your Collection > Project > Pipelines > Tasks. The plugin adds a new task of Run Synopsys Detect for your build. You must add this task to your build queue.
- Click Run Synopsys Detect for your build, and the Synopsys Detect panel displays on the right. In the Synopsys Detect configuration panel, complete the following fields and options (a YAML sketch of the equivalent task follows this section).
- Display name: Type a unique name in this field. Note that the name you type here displays in the left panel; the default name is Run Synopsys Detect for your build.
- Click + New to add a new Black Duck Service Endpoint and then configure the details.
- Click + New to add a new Black Duck Proxy Service Endpoint and then configure the details.
- Detect Version: Version of the Detect binary to use. Synopsys recommends using the latest; you can specify a version override if desired.
- Detect Run Mode: Select the run mode. If you select Use Airgap Mode, a Detect Air Gap Jar Directory Path field opens in which you must specify the Detect Air Gap Jar Path.
- Detect Arguments: Here you can include additional Detect arguments; Detect picks up your build environment variables and your project variables. Use a new line or space to separate multiple arguments. Use double quotes to escape. You can use environment and build variables. For more information on Detect arguments, refer to Synopsys Detect Properties.
- Detect Folder: The location to download the Detect jar or the location of an existing Detect jar. The default is the system temp directory. To specify a different directory, type the directory path and name in the field.
- Note that Windows agents require an absolute path when specifying the Detect download location in the Detect Folder field.
- Add Detect Task Summary: Click this checkbox to add a summary of the Detect task to the build summary task.
In the user interface, fields with a red asterisk ( * ) are required. Some default values are provided, such as version. Note that the following fields belong to Azure DevOps, and are not part of the Detect plugin:
- Task version
- Display name
- Control Options
- Output Variables
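If you use a YAML pipeline instead of the classic editor, the same configuration can be expressed as a task step. The sketch below is only an approximation: the task name and input names (BlackDuckService, DetectVersion, DetectArguments, DetectFolder) are assumptions that vary between versions of the extension, so copy the exact YAML generated by the task assistant in the pipeline editor rather than this snippet.

steps:
# Hedged sketch; the task and input names depend on the installed extension version.
- task: SynopsysDetectTask@2
  displayName: 'Run Synopsys Detect for your build'
  inputs:
    BlackDuckService: 'my-blackduck-endpoint'   # name of your Black Duck service connection (assumption)
    DetectVersion: 'latest'
    DetectArguments: '--detect.project.name=$(System.TeamProject)'
    DetectFolder: '$(Agent.TempDirectory)/detect'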
Configuring a Build Agent
To configure a build agent in your pipeline, do the following under the Tasks tab on your pipeline page.
The default option for the build agent is the Microsoft-hosted agent. To select a self-hosted agent, you must have installed the agent and ensured that it is available to your project before you can use it in your pipeline.
1. Click the ellipsis (…) next to Pipeline to add an agent job.
2. On the Agent job configuration screen, do the following:
A. Select a self-hosted agent from your agent pool, or select Azure Pipelines for a Microsoft-hosted agent.
B. If you select a hosted agent, you must select an operating system for the hosted agent VM, such as macOS, Windows, or a version of Linux.
If the agent is behind a proxy, you need to configure proxy settings in the Synopsys Detect plug-in.
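In a YAML pipeline, the choice between a Microsoft-hosted and a self-hosted agent is expressed in the pool section. Both forms below are sketches; the image name and the pool name MySelfHostedPool are examples you would replace with your own values.

# Microsoft-hosted agent: pick an image such as windows-latest, ubuntu-latest, or macOS-latest
pool:
  vmImage: 'windows-latest'

# Self-hosted agent (alternative): reference your own agent pool by name
# pool:
#   name: 'MySelfHostedPool'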
Running the task
After you have configured your task, you can run it as follows.
- In Azure DevOps, click Queue, and your task is executed on the next available build agent.
- If your task configuration is incomplete, a red status message of Some settings need your attention displays below Run Synopsys Detect for your build. Missing required settings display in red in the Synopsys Detect panel.
You can download the complete Black Duck integration synopsis from here, and you can also download the full Black Duck Synopsys Detect 6.0.0 package from here.
Adding the Black Duck task to the Azure YAML file
1. Navigate to your team project on Azure DevOps in a new browser tab. Before digging into the YAML pipelines, you will want to disable the existing build pipeline.
2. Navigate to Pipelines.
3. Select the existing project pipeline.
4. From the dropdown, select Pause pipeline.
Adding a YAML build definition
1. Navigate to the Pipelines hub.
2. Click New pipeline. We will use the wizard to automatically create the YAML definition based on our project.
3. Select Azure Repos Git as the source hosting platform. Note the other supported options.
4. Select the Existing repo.
5. Select the ASP.NET template as the starting point for your pipeline.
6. Review the contents of the YAML definition. It will be saved as a new file called “azure-pipelines.yml” in the root of the repository and contain everything needed to build and test a typical ASP.NET solution. You can also customize the build as needed. In this case, update the pool to specify that the build should use a Visual Studio 2017 build VM (a rough sketch of the resulting file appears after this list).
7. Review the trigger and point it to master if your repo does not have main (new repos will have “main” instead of “master”).
8. Click Save and run.
9. Click Save and run to confirm the commit.
10. Track the build until it completes. Click Job to see the logs.
11. Each task from the YAML file is available for review, including any warnings and errors.
12. Close the tasks view.
13. Select the Tests tab.
14. The tests should now succeed as expected.
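For reference, after updating the pool, the generated azure-pipelines.yml for the ASP.NET template looks roughly like the sketch below. Treat it as an approximation of what the wizard produces rather than an exact copy.

trigger:
- master

pool:
  vmImage: 'vs2017-win2016'

variables:
  solution: '**/*.sln'
  buildPlatform: 'Any CPU'
  buildConfiguration: 'Release'

steps:
- task: NuGetToolInstaller@1

- task: NuGetCommand@2
  inputs:
    restoreSolution: '$(solution)'

- task: VSBuild@1
  inputs:
    solution: '$(solution)'
    msbuildArgs: '/p:DeployOnBuild=true /p:WebPublishMethod=Package /p:PackageAsSingleFile=true /p:SkipInvalidConfigurations=true /p:PackageLocation="$(build.artifactStagingDirectory)"'
    platform: '$(buildPlatform)'
    configuration: '$(buildConfiguration)'

- task: VSTest@2
  inputs:
    platform: '$(buildPlatform)'
    configuration: '$(buildConfiguration)'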
Adding continuous delivery to the YAML definition
1. Now that the build and test processes are successful, we can add delivery to the YAML definition. From the options dropdown, select Edit pipeline.
2. Add the configuration lines below after the trigger section to define a Build stage in the YAML pipeline. You can define whatever stages you need to better organize and track pipeline progress. (A consolidated sketch of the finished two-stage pipeline appears after this list.)
stages:
- stage: Build
  jobs:
  - job: Build
3. Highlight (select) the remainder of the YAML file and indent it four spaces (two tabs). Everything after “pool” (inclusive) should fall under “job: Build”. This simply takes the existing build definition and relocates it as a child of the jobs node.
4. At the bottom of the file, add the configuration below to define a second stage.
- stage: Deploy
  jobs:
  - job: Deploy
    pool:
      vmImage: 'vs2017-win2016'
    steps:
5. Set the cursor on a new line at the end of the YAML definition. This will be the location where new tasks are added.
6. Select the Azure App Service Deploy task.
7. Select the Azure subscription where you created the app service earlier. Click Authorize and follow the prompts to complete authorization.
8. Enter the App Service name you used to create the app service earlier. Update the Package or folder to “$(System.ArtifactsDirectory)/drop/*.zip” (not $(System.DefaultWorkingDirectory)). Click Add.
9. The YAML that defines the task will be added to the cursor location in the file.
10. With the added task still selected in the editor, indent it four spaces (two tabs) so that it is a child of the steps node.
- Note: The packageForLinux parameter is a bit misleading in the example, but it is valid for Windows or Linux. It is an alias of Package, so it could be shortened to that.
11. It is important to note that these two stages will be run independently. As a result, the build output from the first stage will not be available to the second stage without special consideration. For this, we will use one task to publish the build output at the end of the build stage and another to download it at the beginning of the deploy stage. Place the cursor on a blank line at the end of the build stage.
12. Search the tasks for “publish build” and select the Publish Build Artifacts task. There may be more than one available, so be sure to select the one that is not deprecated.
13. Accept the defaults and click Add. This will publish the build artifacts to a location that will be downloadable under the alias drop.
14. Indent the publish task four spaces (two tabs). You may also want to add an empty line before and after to make it easier to read.
15. Place the cursor on the first line under the steps node of the deployment stage.
16. Search the tasks for “download build” and select the Download Build Artifacts task.
17. Click Add.
18. Indent the download task four spaces (two tabs). You may also want to add an empty line before and after to make it easier to read.
19. Add a property to the download task specifying the artifactName of “drop”. Be sure to match the spacing.
artifactName: 'drop'
20. Click Save to commit the changes.
21. Confirm the Save. This will begin a new build.
22. Return to the Pipelines view.
23. From the Runs tab, click the new build run to open it. Note that there are now multiple stages shown based on the YAML definition edits from earlier.
24. If you see an error message indicating that the pipeline needs permission to access a resource, click View, and then click Permit twice.
25. Click the Deploy stage to follow each task.
26. Expand the AzureRmWebAppDeployment task to review the steps performed during the Azure deployment. Once the task completes, your app will be live on Azure.
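Putting the edits from this list together, the finished pipeline ends up shaped roughly like the sketch below. The service connection name my-azure-subscription and the app service name my-app-service are placeholders, the build steps are abbreviated, and the YAML you built up in the previous steps remains the authoritative version.

trigger:
- master

stages:
- stage: Build
  jobs:
  - job: Build
    pool:
      vmImage: 'vs2017-win2016'
    steps:
    # ... existing restore, build, and test tasks from the ASP.NET template ...
    - task: PublishBuildArtifacts@1
      inputs:
        PathtoPublish: '$(Build.ArtifactStagingDirectory)'
        ArtifactName: 'drop'

- stage: Deploy
  jobs:
  - job: Deploy
    pool:
      vmImage: 'vs2017-win2016'
    steps:
    - task: DownloadBuildArtifacts@0
      inputs:
        artifactName: 'drop'
    - task: AzureRmWebAppDeployment@4
      inputs:
        azureSubscription: 'my-azure-subscription'   # placeholder service connection
        WebAppName: 'my-app-service'                 # placeholder app service name
        packageForLinux: '$(System.ArtifactsDirectory)/drop/*.zip'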
Reviewing the deployed site
1. Return to the Azure portal browser tab.
2. Navigate to the app service created earlier.
3. Go to the Overview tab.
4. Click Browse to open your site in a new tab.
5. The deployed site should load as expected.
Read the Azure DevOps documentation for full details here, or click the link below for more information.
https://docs.veracode.com/r/Use_YAML_to_Add_Veracode_Analysis_to_Azure_DevOps_Pipelines
Reference
You can watch the following video, which walks you through all the steps explained in this lab.
Thanks for reading.