
Bootstrap the environment

This guide is outdated

This guide is outdated because we have moved from the PoC setup to micro stacks. The new documentation is not ready yet.

If you are setting up something new, please contact team Kjøremiljø first so we can do this together ❤ For more up-to-date information about how to set up your infrastructure, you can take a sneak peek at our reference app pirates-iac.

This guide can still be used for troubleshooting purposes by teams that have not yet adopted the micro stack structure.

Now that we have created the environment definition file, we can bootstrap the environment.

Terraform state is a file that contains information about the resources that Terraform has created. By default, this file is stored locally on the machine it is run from. The bootstrap command will create the necessary S3 bucket and DynamoDB table that will instead be used to store Terraform state remotely.

After these steps, you will have a foundation for all the other stacks you create inside this environment using the ok tool.

Run the bootstrap command:

ok bootstrap

Now we need to tell Terraform to initialize this stack and apply it to create the S3 bucket and DynamoDB table:


If you haven't already added an SSH key to your GitHub account (or configured gh, the GitHub CLI), see the GitHub SSH key guide for more information. This is required for Terraform to be able to fetch the modules from the golden-path-iac repository.

cd remote_state/
terraform init
terraform apply

Verify that the S3 bucket for storing remote Terraform state was created:

aws s3 ls

You should see a bucket with a name in the following format:

ok-iac-config-<aws-account-id>-<region>-<team-name>-<environment>
For example:

2021-09-01 10:00:00 ok-iac-config-12345678910-eu-west-1-my-team-dev
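Judging by the example, the name is assembled from the AWS account ID, region, team name, and environment. A small sketch of that assumed convention (the values below match the example and are illustrative only):

```shell
# Assumed naming convention, inferred from the example output above:
#   ok-iac-config-<account-id>-<region>-<team>-<environment>
ACCOUNT_ID="12345678910"
REGION="eu-west-1"
TEAM="my-team"
ENV="dev"
BUCKET="ok-iac-config-${ACCOUNT_ID}-${REGION}-${TEAM}-${ENV}"
echo "${BUCKET}"
```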

Move your current local Terraform state to the S3 backend

You now have a place to store Terraform state for all your future stacks. However, the stack you're currently working on is already using a local state file (terraform.tfstate).

The next step is to transfer this local state file to the S3 bucket you just set up.

To be able to transfer the state, we need to delete the file named

Now you need to tell Terraform where the remote state files should be placed. Navigate back to the environment directory and run ok scaffold to regenerate the remote state configuration:

cd ..
ok scaffold remote_state
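After scaffolding, the remote_state stack has a backend configuration pointing at the new bucket and lock table. As a rough sketch of what such a generated file might look like, here the file name, bucket, key, and table names are all illustrative assumptions, not the actual generated values:

```shell
# Illustrative only: the kind of S3 backend block the scaffolding produces.
# Bucket, key, region, and dynamodb_table values are assumptions for this sketch.
cat > backend.tf <<'EOF'
terraform {
  backend "s3" {
    bucket         = "ok-iac-config-12345678910-eu-west-1-my-team-dev"
    key            = "remote_state/terraform.tfstate"
    region         = "eu-west-1"
    dynamodb_table = "my-team-dev-lock"
  }
}
EOF
```

With a backend block like this in place, terraform init detects that the configured backend differs from the current local one and offers to migrate the existing state.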

Go back into the remote_state directory to initialize the new configuration:

cd remote_state/
terraform init

You will see the following message:

Do you want to copy existing state to the new backend?
  Pre-existing state was found while migrating the previous "local" backend to the
  newly configured "s3" backend. No existing state was found in the newly
  configured "s3" backend. Do you want to copy this state to the new "s3"
  backend? Enter "yes" to copy and "no" to start with an empty state.

  Enter a value:

Type yes and press Enter.

Verify that everything is working by running terraform plan:

terraform plan

You can now delete the local terraform.tfstate, since the state is stored in the S3 bucket instead. This ensures that the next time Terraform runs, it updates the remote state rather than creating a new local file.

rm terraform.tfstate

Committing your files

At this stage it might be a good idea to commit your files.
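Before committing, it can be wise to make sure local Terraform artifacts stay out of version control, since the state now lives in S3. A sketch (the ignore patterns are a common Terraform convention, not something the ok tool generates for you):

```shell
# Keep local Terraform artifacts out of git; the real state now lives in S3.
cat >> .gitignore <<'EOF'
*.tfstate
*.tfstate.backup
.terraform/
EOF
# Then commit the environment and stack files, for example:
#   git add .
#   git commit -m "Bootstrap remote Terraform state"
```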