Publish and link your build artifacts

You can download your artifacts directly from the pipeline result view, but Pipelines only keeps them for 14 days. If you need to access your artifacts for longer than that, you can upload them to third-party storage and link to them from your commit view using the Bitbucket build status API.

Once published and linked via the build status API, your artifact links will appear on your Bitbucket commit.

Step 1: Create an API token

API tokens created to access Bitbucket APIs or perform Git commands must have scopes.

  1. Select the Settings cog in the upper-right corner of the top navigation bar.

  2. Under Personal settings, select Atlassian account settings.

  3. Select the Security tab on the top navigation bar.

  4. Select Create and manage API tokens.

  5. Select Create API token with scopes.

  6. Give the API token a name (usually related to the application that will use it) and an expiry date, then select Next.

  7. Select Bitbucket as the app and select Next.

  8. Select the scopes (permissions) the API token needs and select Next. For detailed descriptions of each permission, see: API Token permissions. Note: This step is required for your API token to access Bitbucket APIs or perform Git commands.

  9. Review your token and select the Create token button. The page displays your new API token.

  10. Copy the generated API token and store it securely, or paste it directly into the application you want to give access to.

The token is only displayed once and can't be retrieved later.
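If you want to confirm the token works before wiring it into Pipelines, you can call the Bitbucket API with HTTP Basic authentication, using your Atlassian account email as the username and the API token as the password. The short Python sketch below is only an illustration; it assumes the requests library and a token that includes the account read scope, and the placeholder values are hypothetical.

import requests

EMAIL = "you@example.com"      # placeholder: your Atlassian account email
API_TOKEN = "your-api-token"   # placeholder: the token copied in step 1

# Bitbucket Cloud accepts Basic auth in the form <email>:<api-token>
response = requests.get("https://api.bitbucket.org/2.0/user", auth=(EMAIL, API_TOKEN))
response.raise_for_status()
print("Authenticated as:", response.json()["display_name"])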

Step 2: Create a Pipelines variable with the authentication token

Define a new secure variable in your Pipelines settings:

  • Parameter name: BB_AUTH_STRING

  • Parameter value: <email>:<api-token> (your Atlassian account email and the API token from step 1)

You can define this variable at either the repository or account level.

Step 3: Publish your artifacts to AWS

If you are new to AWS or S3, follow the instructions in our example S3 integration to create an S3 bucket and configure the relevant authentication variables in Bitbucket Pipelines. You can then upload your artifact with a command like:

python s3_upload.py <bucket-id> <artifact-file> <artifact-key>

Otherwise, you can use your existing AWS tooling to upload the artifact to an appropriate location.
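If you would rather write your own upload step, the sketch below shows the general shape of such a script using boto3. It is only an illustration, not the s3_upload.py from the example: the argument order mirrors the command above, and it assumes your AWS credentials are already available to boto3 (for example via AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY).

import sys

import boto3

def main():
    # Arguments mirror the example command: <bucket-id> <artifact-file> <artifact-key>
    bucket, artifact_file, artifact_key = sys.argv[1:4]
    s3 = boto3.client("s3")
    with open(artifact_file, "rb") as artifact:
        # The key is the path you will link to from your commit's build status.
        s3.put_object(Bucket=bucket, Key=artifact_key, Body=artifact)
    print("Uploaded {0} to s3://{1}/{2}".format(artifact_file, bucket, artifact_key))

if __name__ == "__main__":
    main()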

With the secure variable in place and your artifact published to S3, you can now use curl in your build script to link your artifact's S3 URL to your Bitbucket commit via the build status REST API:

export S3_URL="https://${S3_BUCKET}.s3.amazonaws.com/${S3_KEY_PREFIX}_${BITBUCKET_COMMIT}"
export BUILD_STATUS="{\"key\": \"doc\", \"state\": \"SUCCESSFUL\", \"name\": \"Documentation\", \"url\": \"${S3_URL}\"}"
curl -H "Content-Type: application/json" -X POST --user "${BB_AUTH_STRING}" -d "${BUILD_STATUS}" "https://api.bitbucket.org/2.0/repositories/${BITBUCKET_REPO_OWNER}/${BITBUCKET_REPO_SLUG}/commit/${BITBUCKET_COMMIT}/statuses/build"
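If you prefer to set the build status from a script rather than inline shell, the same POST can be written in Python. The sketch below is just an equivalent alternative, not a required step; it assumes the requests library and the same environment variables used in the shell snippet above.

import os

import requests

# Split the secure variable back into the Basic auth pair (<email>:<api-token>).
user, token = os.environ["BB_AUTH_STRING"].split(":", 1)

s3_url = "https://{0}.s3.amazonaws.com/{1}_{2}".format(
    os.environ["S3_BUCKET"], os.environ["S3_KEY_PREFIX"], os.environ["BITBUCKET_COMMIT"])
api_url = "https://api.bitbucket.org/2.0/repositories/{0}/{1}/commit/{2}/statuses/build".format(
    os.environ["BITBUCKET_REPO_OWNER"], os.environ["BITBUCKET_REPO_SLUG"], os.environ["BITBUCKET_COMMIT"])

payload = {"key": "doc", "state": "SUCCESSFUL", "name": "Documentation", "url": s3_url}

# requests serializes the payload to JSON and sets the Content-Type header for us.
response = requests.post(api_url, json=payload, auth=(user, token))
response.raise_for_status()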

Example bitbucket-pipelines.yml

Below is an example combining all the pieces in a sample Python project. You should adjust all the parameters in the examples to match your repository, and make sure you have all the necessary variables (including AWS authentication tokens) defined.

bitbucket-pipelines.yml

image: python:3.5.1

pipelines:
  branches:
    main:
      - step:
          script:
            - pip install boto3==1.3.0 # required for s3_upload.py
            - python run_tests.py
            - python s3_upload.py "${S3_BUCKET}" documentation.html "${S3_KEY_PREFIX}_${BITBUCKET_COMMIT}" # upload docs to S3
            - export S3_URL="https://${S3_BUCKET}.s3.amazonaws.com/${S3_KEY_PREFIX}_${BITBUCKET_COMMIT}"
            - export BUILD_STATUS="{\"key\": \"doc\", \"state\": \"SUCCESSFUL\", \"name\": \"Documentation\", \"url\": \"${S3_URL}\"}"
            - curl -H "Content-Type: application/json" -X POST --user "${BB_AUTH_STRING}" -d "${BUILD_STATUS}" "https://api.bitbucket.org/2.0/repositories/${BITBUCKET_REPO_OWNER}/${BITBUCKET_REPO_SLUG}/commit/${BITBUCKET_COMMIT}/statuses/build"

You can check your bitbucket-pipelines.yml file with our online validator.
