
Pushing an Image to ECR from Bitbucket

Recently, while updating my portfolio, I wanted to push a Docker image to AWS ECR (Elastic Container Registry) using a Bitbucket pipeline. It's something I regularly do at work, but I hadn’t set it up for a while, so I decided to document the process for easy reference.

The process is straightforward, but it involves a few key steps, including setting up authentication and configuring a pipeline. Here's how you can do it:

Step 1: Authenticate Bitbucket to AWS ECR

To push an image to AWS ECR from Bitbucket, you'll first need to authenticate Bitbucket to AWS. This requires creating an IAM user with the necessary permissions to access and push images to ECR.

Create an IAM Policy

Start by creating a custom IAM policy that allows specific actions on ECR. Here's an example of the permissions you need:


{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": [
                "ecr:CompleteLayerUpload",
                "ecr:GetAuthorizationToken",
                "ecr:UploadLayerPart",
                "ecr:InitiateLayerUpload",
                "ecr:BatchCheckLayerAvailability",
                "ecr:PutImage"
            ],
            "Resource": "*"
        }
    ]
}

Create an IAM User

Next, create an IAM user and attach the policy above to give the user permission to push images to ECR. After that, generate an access key for the user, which you'll need for authentication.
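If you prefer the command line, the whole of Step 1 can be scripted with the AWS CLI. Here's a minimal sketch, where ecr-push-policy, bitbucket-ecr-user, the policy.json file name, and the account ID in the ARN are placeholders for this example:

# Create the policy from the JSON document above (saved locally as policy.json)
aws iam create-policy \
    --policy-name ecr-push-policy \
    --policy-document file://policy.json

# Create the user and attach the policy (replace 123456789012 with your account ID)
aws iam create-user --user-name bitbucket-ecr-user
aws iam attach-user-policy \
    --user-name bitbucket-ecr-user \
    --policy-arn arn:aws:iam::123456789012:policy/ecr-push-policy

# Generate the access key pair you'll add to Bitbucket in the next step
aws iam create-access-key --user-name bitbucket-ecr-user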

Step 2: Add Access Keys to Bitbucket

Now, head over to Bitbucket and add the newly created access key (AWS Access Key ID and AWS Secret Access Key) as repository variables. This lets your pipeline authenticate with AWS without hard-coding credentials in the repository.

To do this:

1. Go to your Bitbucket repository.

2. Navigate to Repository Settings > Pipelines > Repository variables.

3. Add the following variables, marking the access key pair as Secured so the values are masked in pipeline logs:

AWS_ACCESS_KEY_ID

AWS_SECRET_ACCESS_KEY

AWS_DEFAULT_REGION
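
Before adding them, it's worth checking that the key pair works locally. A quick sanity check with the AWS CLI, assuming the three values are exported in your shell (the region here is only an example):

export AWS_ACCESS_KEY_ID="<access key id from Step 1>"
export AWS_SECRET_ACCESS_KEY="<secret access key from Step 1>"
export AWS_DEFAULT_REGION="eu-west-1"

# Should print the ARN of the IAM user created in Step 1
aws sts get-caller-identity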

Step 3: Set Up the Bitbucket Pipeline

The final step is to create the Bitbucket pipeline. You'll need to add a bitbucket-pipelines.yml file to your repository. This file defines the steps Bitbucket will take to build and push your Docker image to ECR.

Here’s a sample pipeline configuration:


# Default image for pipeline steps (the ECR step below supplies its own)
image: python:3.7.2


definitions:
  steps:
    # Build and push the image to ECR
    - step: &build-push-ecr
        name: Push to ECR
        # a Docker image that can be used as a build environment in Bitbucket Pipelines
        image: atlassian/default-image:2
        services:
          - docker
        script:
          # Tag the image with the short (7-character) commit hash
          - export BITBUCKET_COMMIT_SHORT="${BITBUCKET_COMMIT::7}"
          # Use the classic builder; BuildKit needs extra setup in Pipelines
          - export DOCKER_BUILDKIT=0
          - docker build -t my-app:"${BITBUCKET_COMMIT_SHORT}" .
          # Official Atlassian pipe that authenticates to ECR and pushes the image
          - pipe: atlassian/aws-ecr-push-image:2.4.2
            variables:
              AWS_ACCESS_KEY_ID: $AWS_ACCESS_KEY_ID
              AWS_SECRET_ACCESS_KEY: $AWS_SECRET_ACCESS_KEY
              AWS_DEFAULT_REGION: $AWS_DEFAULT_REGION
              IMAGE_NAME: "my-app"
              TAGS: "${BITBUCKET_COMMIT_SHORT}"


pipelines:
  default:
    - step: *build-push-ecr


Explanation of the Pipeline

Docker build step: Builds the Docker image using the docker build command and tags it with a shortened version of the Bitbucket commit hash.

AWS ECR push step: Uses the atlassian/aws-ecr-push-image pipe (version 2.4.2) to push the image to your AWS ECR repository. The necessary AWS credentials and region are passed via environment variables.
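
For context, the pipe performs roughly the equivalent of the following login, tag, and push sequence. This is an illustrative sketch of the flow rather than the pipe's actual implementation; the account ID is a placeholder:

# Exchange the AWS credentials for a temporary ECR token and log Docker in
aws ecr get-login-password --region "$AWS_DEFAULT_REGION" \
    | docker login --username AWS --password-stdin \
      "123456789012.dkr.ecr.${AWS_DEFAULT_REGION}.amazonaws.com"

# Re-tag the locally built image with the full ECR repository URI and push it
docker tag "my-app:${BITBUCKET_COMMIT_SHORT}" \
    "123456789012.dkr.ecr.${AWS_DEFAULT_REGION}.amazonaws.com/my-app:${BITBUCKET_COMMIT_SHORT}"
docker push "123456789012.dkr.ecr.${AWS_DEFAULT_REGION}.amazonaws.com/my-app:${BITBUCKET_COMMIT_SHORT}"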

Additional Resources

For more details on the atlassian/aws-ecr-push-image pipe, check out its official repository on Bitbucket (bitbucket.org/atlassian/aws-ecr-push-image).

By following these steps, you should be able to push Docker images to AWS ECR from your Bitbucket pipelines quickly and efficiently.

