05/05/2019

Import existing infrastructure AWS, GCP, and others - Terraform

Automation and Infrastructure as Code have become really important in the tech industry. In fact, they have become the de facto standard!

But in the real world, and especially in the startup world where you have limited resources, money, and time, it's almost impossible to automate everything from day zero! So at first many things are done manually, then automated later over time.

So now you have some infrastructure that was created manually (e.g. DNS records), and you want to start automating it. What to do? You need to decide which tool you are going to use, then capture what you already have (what was done manually), so you have control over all the data that has been created and will be created.

In the best-case scenario there would be an easy way to export what you already have and use that with your automation tool. Again, in the real world it's not always like that, even with popular automation tools like Terraform. So you need to do some extra work here.

At the time of writing this post, Terraform (0.12) can only import resources into the state; it does not generate TF files. Here comes the role of helper tools like Terraforming, which exports existing AWS resources to Terraform style (tf, tfstate).

Also, Terraform can only import one resource at a time, and some resources have a fairly complex structure, with data kept in an external place (like an EC2 instance ID).

TL;DR

To import existing infrastructure into Terraform, you simply need:

  • A tool to export and generate TF files, or you write the TF files yourself.
  • Another tool to import the resources into your tfstate, or to generate TF import IDs.

AWS

In case the infrastructure is on AWS, the mission is pretty easy! Terraforming can do both: it generates the TF files and also imports what's generated into the tfstate file.
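For example, importing existing Route53 records with Terraforming could look roughly like this (the r53r subcommand and the --tfstate/--merge flags are from Terraforming's docs; the file names are my own):

```shell
# Generate the TF file for the existing Route53 records
terraforming r53r > route53_records.tf

# Generate a fresh tfstate for those records...
terraforming r53r --tfstate > terraform.tfstate

# ...or merge them into an existing tfstate in place
terraforming r53r --tfstate --merge=terraform.tfstate --overwrite
```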

GCP

Also, in case the infrastructure is on GCP, the mission is easy too! Terraformer is there for GCP. The nice thing is that it actually lives under the GCP projects on GitHub, but it's still in beta.
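A sketch of what that could look like with Terraformer, assuming Cloud DNS is among the supported resources (flags as in its README; the project name is a placeholder):

```shell
# Generate TF files and tfstate for existing Cloud DNS resources
terraformer import google --resources=dns --projects=my-gcp-project --regions=europe-west1
```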

Others

For infrastructure that's not on AWS or GCP, it's a bit harder. First you need to write the TF files yourself (or find a way to generate them). Then you need to import the resources using the terraform import command (not all providers support import, BTW).
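For example, importing a single Route53 record by hand could look like this (the ZONEID_NAME_TYPE import ID format is the one documented by the AWS provider; the zone ID and names here are placeholders):

```shell
# The matching resource block must already exist in a .tf file
terraform import aws_route53_record.www Z123EXAMPLE_www.example.com_CNAME
```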

Actually, it's not that easy/straightforward to get import IDs for some resources, because some data lives in external sources like an API or so.

So, as an example, I created a Python script that generates TF import IDs for AWS Route53. There is actually no need for it with Terraforming, but it could be helpful as an example for other platforms.
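The core idea of such a script can be sketched in a few lines. Here the records are hard-coded for illustration; in a real script they would come from the Route53 API:

```python
# Sketch: build Terraform import IDs for Route53 records.
# The AWS provider expects the format ZONEID_RECORDNAME_TYPE.

def route53_import_id(zone_id, name, record_type):
    """Return the import ID for one Route53 record."""
    return f"{zone_id}_{name}_{record_type}"

# Hard-coded sample records (zone_id, name, type); a real script
# would fetch these from the Route53 API instead.
records = [
    ("Z123EXAMPLE", "example.com", "A"),
    ("Z123EXAMPLE", "www.example.com", "CNAME"),
]

for zone_id, name, record_type in records:
    print(route53_import_id(zone_id, name, record_type))
```

Each printed ID can then be fed to terraform import together with the matching resource address.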

Also, even the tools that generate TF files, like Terraforming for AWS, generate one big fat file with all the HCL. You may like to split that file into smaller files (e.g. a TF file per DNS zone). pyhcl helps with that: it reads the TF files so you can manipulate them as you like. (pyhcl is used in the script mentioned above.)
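To illustrate the splitting idea without pulling in pyhcl, here is a simplified stdlib-only sketch that cuts a generated .tf file into top-level resource blocks by counting braces (naive: it assumes no braces inside strings; pyhcl parses HCL properly):

```python
def split_blocks(hcl_text):
    """Yield top-level HCL blocks, e.g. one resource block at a time."""
    block, depth = [], 0
    for line in hcl_text.splitlines():
        if not block and not line.strip():
            continue  # skip blank lines between blocks
        block.append(line)
        depth += line.count("{") - line.count("}")
        if block and depth == 0:
            yield "\n".join(block)
            block = []

# A tiny sample of Terraforming-style output
sample = '''resource "aws_route53_record" "www" {
  zone_id = "Z123EXAMPLE"
  name    = "www.example.com"
  type    = "CNAME"
}

resource "aws_route53_record" "apex" {
  zone_id = "Z123EXAMPLE"
  name    = "example.com"
  type    = "A"
}'''

blocks = list(split_blocks(sample))
print(len(blocks))  # 2 top-level resource blocks
```

From there, each block can be written to its own file, grouped per zone or however you like.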

That's it :-)
