Valint Cli

Scribe Security

A command-line interface (CLI) tool that generates evidence verifying the integrity of your supply chain.

Azure Pipelines Task

Scribe offers Azure Pipelines users a set of DevOps tasks for embedding evidence collection and integrity verification in their workflows.

A task in Azure DevOps is a packaged unit of work that you can create, schedule, and manage from a central location, giving you an easy way to automate common workflows and activities in Azure DevOps. The Valint task provides several actions for generating SBOMs from various sources. The usage examples on this page demonstrate several use cases of SBOM collection (an SBOM from a publicly available Docker image, from a Git repository, and from a local directory), as well as several ways of uploading the evidence, either to the Azure DevOps pipeline (as artifacts) or to the Scribe Service.

Installation

The Valint task can be found in the Azure Marketplace.
Follow install-an-extension to add the extension to your organization.
Once the extension is installed, you can use the task in your pipeline.

Usage

  - job: scribe_azure_job
    displayName: scribe azure job
  
    pool:
      vmImage: 'ubuntu-latest'

    steps:
    - task: ScribeInstall@0
    - task: ValintCli@0
      displayName: SBOM image `busybox:latest`.
      inputs:
        command: bom
        target: busybox:latest
        outputDirectory: $(Build.ArtifactStagingDirectory)/scribe/valint

Target types - [target]


Targets are the types of artifacts produced and consumed by your supply chain. Using the supported targets, you can collect evidence and verify compliance for a range of artifacts.

Format

[scheme]:[name]:[tag]

| Source | Target type | Scheme | Description | Example |
|--------|-------------|--------|-------------|---------|
| Docker Daemon | image | docker | Use the Docker daemon | docker:busybox:latest |
| OCI registry | image | registry | Use the Docker registry directly | registry:busybox:latest |
| Docker archive | image | docker-archive | Use a tarball from disk for archives created with `docker save` | docker-archive:path/to/yourimage.tar |
| OCI archive | image | oci-archive | Use a tarball from disk for OCI archives | oci-archive:path/to/yourimage.tar |
| Remote git | git | git | Remote git repository | git:https://github.com/yourrepository.git |
| Local git | git | git | Local git repository | git:path/to/yourrepository |
| Directory | dir | dir | Directory path on disk | dir:path/to/yourproject |
| File | file | file | File path on disk | file:path/to/yourproject/file |
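
The scheme prefix goes directly into the task's target input. A minimal sketch using two of the schemes above (the archive and directory paths are illustrative placeholders, not real paths):

steps:
- task: ValintCli@0
  displayName: SBOM from a Docker archive
  inputs:
    command: bom
    target: docker-archive:path/to/yourimage.tar    # tarball created with `docker save`
    outputDirectory: $(Build.ArtifactStagingDirectory)/scribe/valint

- task: ValintCli@0
  displayName: SBOM from a local directory
  inputs:
    command: bom
    target: dir:path/to/yourproject                 # directory path on disk
    outputDirectory: $(Build.ArtifactStagingDirectory)/scribe/valint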

Evidence Stores

Each evidence store can be used to store, find, and download evidence. Unifying all supply chain evidence in a single system is an important part of being able to query any subset of it for policy validation.

| Type | Description | Requirement |
|------|-------------|-------------|
| scribe | Evidence is stored on the Scribe Service | Scribe credentials |
| OCI | Evidence is stored on a remote OCI registry | Access to an OCI registry |

Scribe Evidence store

The Scribe evidence store allows you to store evidence using the Scribe Service.

Related flags:

  • scribeClientId
  • scribeClientSecret
  • scribeEnable

Before you begin

Integrating Scribe Hub with your environment requires the following credentials, which can be found on the Integrations page of your Scribe Hub:

  • Client ID
  • Client Secret
  • Add the credentials to your Azure environment according to Azure DevOps - Set secret variables (a variable-group alternative is sketched below).

  • Open your Azure DevOps project and make sure you have a YAML file named azure-pipelines.yml.

  • Use the Scribe custom task as shown in the example below.

Note: Refer to this Azure document to configure runners.
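
As an alternative to pipeline-level secret variables, the credentials can come from a variable group. A minimal sketch, assuming a variable group named scribe-credentials that holds CLIENTID and CLIENTSECRET as secret variables (the group and variable names are placeholders, not fixed names):

variables:
- group: scribe-credentials              # assumed variable group holding the Scribe credentials

steps:
- task: ScribeInstall@0

- task: ValintCli@0
  inputs:
    command: bom
    target: busybox:latest
    outputDirectory: $(Build.ArtifactStagingDirectory)/scribe/valint
    scribeEnable: true
    scribeClientId: $(CLIENTID)          # resolved from the variable group
    scribeClientSecret: $(CLIENTSECRET)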

Usage

trigger:
  branches:
    include:
    - main

jobs:
- job: scribe_azure_job
  displayName: 'Scribe Azure Job'
  pool:
    name: {Update pool name here}		# Example: Mikey
    agent: {Update agent name here}		# Example: azure-runner-ubuntu

  variables:
    imageName: 'pipelines-javascript-docker'

  steps:
  - task: scribeInstall@0

  - task: ValintCli@0
    inputs:
      command: bom
      target: nginx
      format: statement
      outputDirectory: $(Build.ArtifactStagingDirectory)/scribe/valint
      scribeEnable: true
      scribeClientId: $(CLIENTID)
      scribeClientSecret: $(CLIENTSECRET)

  - task: ValintCli@0
    inputs:
      command: verify
      target: nginx
      inputFormat: statement
      outputDirectory: $(Build.ArtifactStagingDirectory)/scribe/valint
      scribeEnable: true
      scribeClientId: $(CLIENTID)
      scribeClientSecret: $(CLIENTSECRET)

Alternative evidence stores

You can learn more about alternative stores here.

OCI Evidence store

Valint supports both storage and verification flows for attestations and statement objects, using an OCI registry as an evidence store.

Using an OCI registry as an evidence store allows you to upload, download, and verify evidence across your supply chain in a seamless manner.

Related flags:

  • oci - Enable OCI store.
  • ociRepo - Evidence store location.

Before you begin

Evidence can be stored in any accessible registry.

  • Write access is required for upload (generate).
  • Read access is required for download (verify).

You must first log in to your registry with the required access privileges before calling Valint, for example with the docker login command.

Usage

- job: scribe_azure_job
  pool:
    vmImage: 'ubuntu-latest'

  variables:
    imageName: 'pipelines-javascript-docker'

  steps:
  - script: echo "$DOCKER_PASSWORD" | docker login -u "$DOCKER_USERNAME" --password-stdin [my_registry]

  - task: scribeInstall@0

  - task: ValintCli@0
    inputs:
      commandName: bom
      target: [target]
      format: [attest, statement, attest-slsa (deprecated), statement-slsa (deprecated), attest-generic, statement-generic]
      outputDirectory: $(Build.ArtifactStagingDirectory)/scribe/valint
      oci: true
      ociRepo: [oci_repo]

  - task: ValintCli@0
    inputs:
      commandName: verify
      target: [target]
      inputFormat: [attest, statement, attest-slsa, statement-slsa, attest-generic, statement-generic]
      outputDirectory: $(Build.ArtifactStagingDirectory)/scribe/valint
      oci: true
      ociRepo: [oci_repo]

Basic examples

Public registry image (SBOM)

Create SBOM for remote busybox:latest image.

- task: ValintCli@0
  displayName: Generate cyclonedx json SBOM
  inputs:
    commandName: bom
    target: busybox:latest
    outputDirectory: $(Build.ArtifactStagingDirectory)/scribe/valint
    force: true
NTIA Custom metadata (SBOM)

Attach custom NTIA metadata to the SBOM.

trigger:
  branches:
    include:
    - main

jobs:
- job: scribe_azure_job
  displayName: 'Scribe Azure Job'
  pool:
    name: {Update pool name here}		# Example: Mikey
    agent: {Update agent name here}		# Example: azure-runner-ubuntu

  variables:
    imageName: 'pipelines-javascript-docker'
    # SBOM Author meta data - Optional
    AUTHOR_NAME: John-Smith
    AUTHOR_EMAIL: john@thiscompany.com
    AUTHOR_PHONE: 555-8426157
    # SBOM Supplier meta data - Optional
    SUPPLIER_NAME: Scribe-Security
    SUPPLIER_URL: www.scribesecurity.com
    SUPPLIER_EMAIL: info@scribesecurity.com
    SUPPLIER_PHONE: 001-001-0011

  steps:
  - task: scribeInstall@0

  - task: ValintCli@0
    inputs:
      command: bom
      target: nginx
      format: statement
      outputDirectory: $(Build.ArtifactStagingDirectory)/scribe/valint
      scribeEnable: true
      scribeClientId: $(CLIENTID)
      scribeClientSecret: $(CLIENTSECRET)
      author-name: $(AUTHOR_NAME)
      author-email: $(AUTHOR_EMAIL)
      author-phone: $(AUTHOR_PHONE)
      supplier-name: $(SUPPLIER_NAME)
      supplier-url: $(SUPPLIER_URL)
      supplier-email: $(SUPPLIER_EMAIL)
      supplier-phone: $(SUPPLIER_PHONE)

  - task: ValintCli@0
    inputs:
      command: verify
      target: nginx
      inputFormat: statement
      outputDirectory: $(Build.ArtifactStagingDirectory)/scribe/valint
      scribeEnable: true
      scribeClientId: $(CLIENTID)
      scribeClientSecret: $(CLIENTSECRET)
Public registry image (SLSA)

Create SLSA for remote busybox:latest image.

- task: ValintCli@0
  displayName: Generate SLSA provenance
  inputs:
    commandName: slsa
    target: busybox:latest
    outputDirectory: $(Build.ArtifactStagingDirectory)/scribe/valint
    force: true
Docker built image (SBOM)

Create SBOM for the locally built Docker image image_name:latest.

- task: ValintCli@0
  displayName: Generate cyclonedx json SBOM
  inputs:
    commandName: bom
    target: image_name:latest
    outputDirectory: $(Build.ArtifactStagingDirectory)/scribe/valint
    force: true
Docker built image (SLSA)

Create SLSA for the locally built Docker image image_name:latest.

- task: ValintCli@0
  displayName: Generate SLSA provenance
  inputs:
    commandName: slsa
    target: image_name:latest
    outputDirectory: $(Build.ArtifactStagingDirectory)/scribe/valint
    force: true
Private registry image (SBOM)

Create SBOM for an image hosted on a private registry.

Use docker login task to add access.
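One way to do this is the built-in Docker@2 task with its login command; a minimal sketch, assuming a Docker registry service connection named my-registry-connection already exists in the project:

- task: Docker@2
  displayName: Login to private registry
  inputs:
    command: login
    containerRegistry: my-registry-connection   # assumed service connection name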

- task: ValintCli@0
  displayName: Generate cyclonedx json SBOM
  inputs:
    commandName: bom
    target: scribesecurity.jfrog.io/scribe-docker-local/example:latest
    outputDirectory: $(Build.ArtifactStagingDirectory)/scribe/valint
    force: true
Private registry image (SLSA)

Create SLSA for an image hosted on a private registry.

Use docker login task to add access.

- task: ValintCli@0
  displayName: Generate SLSA provenance
  inputs:
    commandName: slsa
    target: scribesecurity.jfrog.io/scribe-docker-local/example:latest
    outputDirectory: $(Build.ArtifactStagingDirectory)/scribe/valint
    force: true
Custom metadata (SBOM)

Custom metadata added to SBOM.

- job: custom_bom
  displayName: Custom bom

  variables:
    - name: test_env
      value: test_env_value

  pool:
    vmImage: 'ubuntu-latest'

  steps:
  - task: ValintCli@0
    displayName: Generate cyclonedx json SBOM - add metadata - labels, envs, name
    inputs:
      commandName: bom
      target: 'busybox:latest'
      outputDirectory: $(Build.ArtifactStagingDirectory)/scribe/valint
      force: true
      env: test_env
      label: test_label
Custom metadata (SLSA)

Custom metadata added to SLSA.

- job: custom_slsa
  displayName: Custom slsa

  variables:
    - name: test_env
      value: test_env_value

  pool:
    vmImage: 'ubuntu-latest'

  steps:
  - task: ValintCli@0
    displayName: Generate SLSA provenance - add metadata - labels, envs, name
    inputs:
      commandName: slsa
      target: 'busybox:latest'
      outputDirectory: $(Build.ArtifactStagingDirectory)/scribe/valint
      force: true
      env: test_env
      label: test_label
Save as artifact SBOM

Use the input variable outputDirectory or outputFile to export the evidence as an artifact.

Use input variable format to select between supported formats.

- task: ValintCli@0
  displayName: SBOM image `busybox:latest`.
  inputs:
    command: bom
    target: busybox:latest
    outputDirectory: $(Build.ArtifactStagingDirectory)/scribe/valint
    outputFile: $(Build.ArtifactStagingDirectory)/my_sbom.json
    force: true

# Using `outputDirectory` evidence cache dir
- publish: $(Build.ArtifactStagingDirectory)/scribe/valint
  artifact: scribe-evidence

# Using `outputFile` custom path.
- publish: $(Build.ArtifactStagingDirectory)/my_sbom.json
  artifact: scribe-sbom
Save as artifact SLSA

Use the input variable outputDirectory or outputFile to export the evidence as an artifact.

Use input variable format to select between supported formats.

- task: ValintCli@0
  displayName: SLSA image `busybox:latest`.
  inputs:
    command: slsa
    target: busybox:latest
    outputDirectory: $(Build.ArtifactStagingDirectory)/scribe/valint
    outputFile: $(Build.ArtifactStagingDirectory)/my_slsa.json
    force: true

# Using `outputDirectory` evidence cache dir
- publish: $(Build.ArtifactStagingDirectory)/scribe/valint
  artifact: scribe-evidence

# Using `outputFile` custom path.
- publish: $(Build.ArtifactStagingDirectory)/my_slsa.json
  artifact: scribe-slsa
Directory target (SBOM)

Create SBOM from a local directory.

- bash: |
    mkdir testdir
    echo "test" > testdir/test.txt

- task: ValintCli@0
  displayName: SBOM local directory.
  inputs:
    command: bom
    target: dir:testdir
    outputDirectory: $(Build.ArtifactStagingDirectory)/scribe/valint
    force: true
Directory target (SLSA)

Create SLSA from a local directory.

- bash: |
    mkdir testdir
    echo "test" > testdir/test.txt

- task: ValintCli@0
  displayName: SLSA local directory.
  inputs:
    command: slsa
    target: dir:testdir
    outputDirectory: $(Build.ArtifactStagingDirectory)/scribe/valint
    force: true
Git target (SBOM)

Create SBOM for mongo-express remote git repository.

- task: ValintCli@0
  displayName: SBOM remote git repository.
  inputs:
    command: bom
    target: git:https://github.com/mongo-express/mongo-express.git 
    outputDirectory: $(Build.ArtifactStagingDirectory)/scribe/valint
    force: true

Create SBOM for local git repository.

When using implicit checkout, note that the Azure DevOps git strategy will affect the commits collected in the SBOM.

- checkout: self

- task: ValintCli@0
  displayName: SBOM local git repository.
  inputs:
    command: bom
    target: git:. 
    outputDirectory: $(Build.ArtifactStagingDirectory)/scribe/valint
    force: true
Git target (SLSA)

Create SLSA for mongo-express remote git repository.

- task: ValintCli@0
  displayName: SLSA remote git repository.
  inputs:
    command: slsa
    target: git:https://github.com/mongo-express/mongo-express.git 
    outputDirectory: $(Build.ArtifactStagingDirectory)/scribe/valint
    force: true

Create SLSA for local git repository.

When using implicit checkout, note that the Azure DevOps git strategy will affect the commits collected in the evidence.

- checkout: self

- task: ValintCli@0
  displayName: SLSA local git repository.
  inputs:
    command: slsa
    target: git:. 
    outputDirectory: $(Build.ArtifactStagingDirectory)/scribe/valint
    force: true

Resources

If you're new to Azure Pipelines, these links should help you get started:

  • What is Azure Pipelines?
  • Key concepts for new Azure Pipelines users.
  • Getting started with Azure Pipelines.
  • Create your first Azure pipeline.