Managing Terraform infrastructure with an AWS CodePipeline pipeline
Introduction
This post describes how to use an AWS-hosted CloudFormation template to provision infrastructure with Terraform through a CodePipeline pipeline. The AWS Console provides a ready-to-use option for building such pipelines. My post on managing CloudFormation nested stacks with a CodePipeline pipeline is available here.
This post focuses on provisioning a basic test infrastructure (an S3 bucket and an S3 bucket policy) using two Terraform code structures: modular and simple. The pipeline can be created directly from the Console by navigating to Developer Tools → CodePipeline → Pipelines → Create pipeline → selecting “Terraform Deploy To AWS” → specifying the source and connection → creating the pipeline based on the CloudFormation template. The template code can be copied for customization and manual deployment as a CloudFormation stack. The setup is designed to use one pipeline per environment.
Project Structure
The CloudFormation template provisions the following resources:
- A CodePipeline pipeline with Source and Deploy stages.
- An S3 bucket for storing artifacts and Terraform backend state.
- An IAM role with policies required for resource provisioning and remote Git repository connectivity.
- A CodeConnection resource for Git-based source control.
The Deploy stage uses the Compute category to execute shell commands. The commands are dynamically configured to define:
- The Terraform working directory.
- Terraform backend parameters based on environment variables.
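Because the backend parameters are supplied at init time, each backend.tf only needs a partial backend block; terraform init completes it with the -backend-config flags shown in the pipeline configuration below. The repository's actual files are not reproduced in this post, so the following is only a minimal sketch:

# backend.tf (illustrative sketch; the bucket, key, and region are
# intentionally omitted and injected by the pipeline via -backend-config)
terraform {
  backend "s3" {}
}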
The project structure:
├── codepipeline
│   └── codepipeline_pipeline.yaml
├── infrastructure
│   ├── modular-infra
│   │   └── development
│   │       ├── backend.tf
│   │       ├── main.tf
│   │       ├── modules
│   │       │   ├── s3_bucket
│   │       │   │   ├── main.tf
│   │       │   │   ├── outputs.tf
│   │       │   │   └── variables.tf
│   │       │   └── s3_bucket_policy
│   │       │       ├── main.tf
│   │       │       └── variables.tf
│   │       ├── outputs.tf
│   │       └── variables.tf
│   └── simple-infra
│       └── development
│           ├── backend.tf
│           ├── main.tf
│           └── variables.tf
Improved pipeline configuration in the CloudFormation template:
Resources:
  CodePipeline:
    Type: AWS::CodePipeline::Pipeline
    Properties:
      ArtifactStore:
        Location: !Ref CodePipelineArtifactsBucket
        Type: S3
      ExecutionMode: QUEUED
      Name: !Ref CodePipelineName
      PipelineType: V2
      RoleArn: !If
        - CreatePipelineRole
        - !GetAtt CodePipelineRole.Arn
        - !Ref PipelineRoleArn
      Stages:
        - Name: Source
          Actions:
            - Name: CodeConnections
              ActionTypeId:
                Category: Source
                Owner: AWS
                Provider: CodeStarSourceConnection
                Version: '1'
              Configuration:
                ConnectionArn: !GetAtt GitLabConnection.ConnectionArn
                FullRepositoryId: !Ref FullRepositoryId
                BranchName: !Ref BranchName
              OutputArtifacts:
                - Name: SourceOutput
              RunOrder: 1
          OnFailure:
            Result: RETRY
        - Name: Deploy
          Actions:
            - Name: Terraform
              ActionTypeId:
                Category: Compute
                Owner: AWS
                Provider: Commands
                Version: '1'
              Commands:
                - export release=AmazonLinux
                - dnf install -y dnf-plugins-core
                - dnf config-manager --add-repo https://rpm.releases.hashicorp.com/$release/hashicorp.repo
                - dnf install -y terraform
                - !Sub 'export TF_BACKEND_BUCKET=${CodePipelineArtifactsBucket}'
                - !Sub 'export AWS_REGION=${AWS::Region}'
                - cd $CODEBUILD_SRC_DIR
                - !Sub 'cd infrastructure/${InfrastructureFolder}/${Environment}'
                - !Sub 'terraform init -input=false -backend-config="bucket=$TF_BACKEND_BUCKET" -backend-config="key=backend/${InfrastructureFolder}/${Environment}/terraform.tfstate" -backend-config="region=$AWS_REGION"'
                - terraform plan -input=false
                - terraform apply -auto-approve -input=false
              InputArtifacts:
                - Name: SourceOutput
              RunOrder: 1
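The Commands above change into infrastructure/${InfrastructureFolder}/${Environment} before running Terraform. For the modular variant, that directory's main.tf typically just wires together the two modules from the project tree. The module interfaces are not shown in the post, so the variable and output names below are assumptions:

# main.tf (illustrative sketch for modular-infra/development)
module "s3_bucket" {
  source      = "./modules/s3_bucket"
  bucket_name = var.bucket_name
}

module "s3_bucket_policy" {
  source     = "./modules/s3_bucket_policy"
  bucket_id  = module.s3_bucket.bucket_id   # assumed output of the s3_bucket module
  bucket_arn = module.s3_bucket.bucket_arn  # assumed output of the s3_bucket module
}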
Prerequisites
Ensure the following prerequisites are met:
- An AWS account with the necessary permissions.
- AWS CLI installed locally.
- A remote Git repository containing Terraform configuration.
Deployment
1. Clone the repository locally and push it to the remote Git repository.
git clone https://gitlab.com/Andr1500/terraform-from-pipeline.git
2. Fill in all necessary Parameters in the CloudFormation template, and create the CloudFormation stack.
aws cloudformation create-stack \
--stack-name pipeline-terraform-deployment \
--template-body file://codepipeline/codepipeline_pipeline.yaml \
--capabilities CAPABILITY_NAMED_IAM --disable-rollback
3. Authorize the CodeConnection.
Open the AWS Console and go to CodePipeline → Settings → Connections → select the created connection (in Pending status) → Update pending connection. Depending on your provider, complete the authorization and grant the necessary access to the repository.
4. Push Terraform resources update.
Pushing changes to the remote repository triggers the pipeline automatically. The pipeline initializes the backend, plans, and applies the Terraform configuration. If the configuration hasn't changed, no changes are applied, thanks to the backend state tracking.
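For example, a change as small as adding tags to the bucket resource (reusing the hypothetical sketch from earlier) produces a non-empty plan and a new apply on the next push:

# Adding tags to the (hypothetical) test bucket; committing and pushing this
# change triggers a new pipeline execution whose plan shows an in-place update.
resource "aws_s3_bucket" "test" {
  bucket = var.bucket_name

  tags = {
    Environment = "development"
    ManagedBy   = "codepipeline-terraform"
  }
}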
5. Delete Terraform resources (Optional).
CodePipeline does not delete resources automatically. Deletion can be handled by:
- Removing resources from Terraform files and pushing the changes, or
- Replacing terraform apply with terraform destroy -auto-approve in the pipeline's Deploy stage.
6. Cleanup (Optional).
To delete the artifacts and the CloudFormation stack:
aws s3 rm s3://<pipeline-bucket-name> --recursive
aws cloudformation delete-stack \
--stack-name pipeline-terraform-deployment
Conclusion
AWS-hosted CloudFormation templates allow fast setup of CodePipeline pipelines through the AWS Console. They are useful for quick tests and demos. However, this setup is not well suited for production deployments:
- The CodeConnection must be pre-configured and authorized manually.
- The Terraform code is assumed to live in a fixed folder structure.
- There is no native support for Terraform backends beyond the dynamic configuration.
- There is no built-in delete mechanism; cleanup has to be handled manually.
- IAM role permissions must be granted explicitly and carefully.