terraformer | Executes Terraform configuration as job/pod | Infrastructure Automation library
kandi X-RAY | terraformer Summary
Terraformer is a tool that can execute Terraform commands (apply, destroy and validate) and can be run as a Pod inside a Kubernetes cluster. The Terraform configuration and state files (main.tf, variables.tf, terraform.tfvars and terraform.tfstate) are stored as ConfigMaps and Secrets in the Kubernetes cluster and are retrieved and updated by Terraformer. To talk to the API server, Terraformer uses either the service account of its Pod or a provided kubeconfig (via the --kubeconfig flag). The names of the ConfigMaps and Secrets have to be provided via command line flags. The namespace of the objects can be specified via --namespace, or it defaults to the Pod's namespace. For more details and example Kubernetes manifests, see the example directory. Usually, terraformer apply|destroy|validate runs within a single Pod. Please note that running Terraformer as a Job is not recommended: the Job object will start a new Pod if the first Pod fails or is deleted (for example due to a Node hardware failure or a reboot), so you may end up with two Terraformer Pods running at the same time, which can fail with conflicts.
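To make the flag-based wiring concrete, a container command inside such a Pod might look roughly like the following. This is an illustrative sketch only: the object names are made up, and the exact flag spellings should be checked against the example manifests mentioned above.

```shell
# Hypothetical Terraformer invocation inside a Pod. The ConfigMap and
# Secret names are examples; --namespace may be omitted to default to
# the Pod's own namespace, and --kubeconfig to use the service account.
terraformer apply \
  --namespace=my-namespace \
  --configuration-configmap-name=example.infra.tf-config \
  --state-configmap-name=example.infra.tf-state \
  --variables-secret-name=example.infra.tf-vars
```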
Support
Quality
Security
License
Reuse
Top functions reviewed by kandi - BETA
- NewTerraformerCommand returns a new cobra command.
- fetchObject fetches an object from the state store.
- storeObject stores an object in the given directory.
- addSubcommand adds a subcommand to the given command.
- main is the main function.
- NewTerraformer creates a new Terraformer.
- NewErrorFormatFuncWithPrefix returns a multierror.ErrorFormatFunc that prefixes each error message.
- DefaultPaths returns the default paths for the environment variables.
- LogStateContentsToStdout writes the state file to stdout.
- exampleForCommand returns the example text for a command.
terraformer Key Features
terraformer Examples and Code Snippets
Community Discussions
Trending Discussions on terraformer
QUESTION
I recreated a state file for an existing infrastructure using a third-party tool, i.e. terraformer. Now I want to move the .tfstate to another azurerm back-end and manage it from there.

If I just copy the file, i.e. mystate.tfstate, from local to inside the storage account container, with the same file name/key as in the backend configuration, will it work, or do I need to do something else to achieve it?

I don't want to risk the state file or infrastructure by trying to do something that isn't sure to work as I expect.
...ANSWER
Answered 2022-Feb-09 at 14:32

"I don't want to risk the state file or infrastructure by trying to do something that isn't sure to work as I expect."

Make a backup of the state file first, and then you won't be risking the state file. As long as you aren't running an apply command, you won't be risking the infrastructure. And even if you are running an apply command, you will be able to review the plan before proceeding. So just (always) back up your state file, and always review a plan before applying it, and there is no risk.
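Following that advice, the safe sequence looks roughly like this. The file name comes from the question; the azurerm backend block itself is assumed to already be written, and `-migrate-state` is one way to let Terraform do the copy for you.

```shell
# 1. Back up the current state file before touching anything.
cp mystate.tfstate mystate.tfstate.backup

# 2. Either upload the file to the storage container under the same
#    key the backend block expects, or let Terraform migrate it:
terraform init -migrate-state

# 3. Review what Terraform thinks changed before ever applying.
terraform plan
```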
QUESTION
I'm managing an autoscaling cloud infrastructure in AWS.
Every time I run Terraform it wants to override the desired_count, i.e. the number of running instances. I would like this to not happen. How do I do that?

Constraints: I manage multiple different microservices, each of which sets up its running instances with a shared module, where desired_count is specified. I don't want to change the shared module such that desired_count is ignored for all my microservices. Rather, I want to be able to override or not on a service-by-service (i.e. caller-by-caller) basis. This rules out a straightforward use of lifecycle { ignore_changes = ... }. As far as I can tell, the list of changes to ignore cannot be given as arguments (my Terraform complains when I try; feel free to tell me how you succeed at this).

My next idea (if possible) is to read the value from the stored state, if present, and ask for a desired_count equal to its current value, or my chosen initial value if it has no current value. If there are no concurrent Terraform runs (i.e. no races), this should accomplish the same thing. Is this possible?

I'm no expert terraformer. I would appreciate it a lot if you give very detailed answers.
...ANSWER
Answered 2021-Jun-24 at 15:20

The lifecycle parameters affect how the graph is built, so they can't be parameterized. The Terraform team hasn't ruled out implementing this, but the issue has been open for a couple of years without it being done. What you could do is create two aws_ecs_service resources and switch between them:
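A minimal sketch of that two-resource switch is below. All names, the variable, and the referenced cluster/task-definition resources are hypothetical; a real service needs the rest of its arguments filled in.

```hcl
variable "manage_desired_count" {
  type    = bool
  default = false
}

# Variant that manages desired_count (e.g. for initial creation).
resource "aws_ecs_service" "with_count" {
  count           = var.manage_desired_count ? 1 : 0
  name            = "my-service"
  cluster         = aws_ecs_cluster.main.id
  task_definition = aws_ecs_task_definition.main.arn
  desired_count   = 2
}

# Variant that leaves desired_count to the autoscaler.
resource "aws_ecs_service" "without_count" {
  count           = var.manage_desired_count ? 0 : 1
  name            = "my-service"
  cluster         = aws_ecs_cluster.main.id
  task_definition = aws_ecs_task_definition.main.arn

  lifecycle {
    ignore_changes = [desired_count]
  }
}
```

Because the ignore_changes list lives on a concrete resource rather than being passed as an argument, this works around the restriction that lifecycle blocks can't be parameterized.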
QUESTION
I have a lambda named blog-dev-createArticle in the us-east-1 region. I am trying to use Terraformer to generate its terraform files.

I am unable to use filters and generate the terraform files for a specific lambda function. I have tried the following till now, but all of them either select all the lambdas & generate the .tf files for them, or select no lambda at all.
ANSWER
Answered 2021-Apr-30 at 12:34

AWS Lambda uses FunctionName as the attribute for the function's name, whereas Terraform uses function_name. So, using function_name as the attribute in the filter did the trick.
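Based on that answer, the working invocation would look something like this (the function name and region are taken from the question; the filter uses the Terraform attribute name, not the AWS API one):

```shell
# Filter on the Terraform attribute name (function_name),
# not the AWS API attribute (FunctionName).
terraformer import aws \
  --resources=lambda \
  --regions=us-east-1 \
  --filter="Name=function_name;Value=blog-dev-createArticle"
```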
QUESTION
I am using terraform to create a web-acl in aws and want to associate that web-acl with a CloudFront distribution. Here's what my code looks like:
...ANSWER
Answered 2021-Mar-17 at 22:12

When using WAFv2, you need to specify the ARN, not the ID, in web_acl_id on aws_cloudfront_distribution.

See the note at https://registry.terraform.io/providers/hashicorp/aws/latest/docs/resources/cloudfront_distribution#web_acl_id or this GitHub issue: https://github.com/hashicorp/terraform-provider-aws/issues/13902
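A minimal sketch of the fix follows. Resource names are hypothetical, and the CloudFront distribution has further required arguments that are omitted here for brevity.

```hcl
resource "aws_wafv2_web_acl" "this" {
  name  = "example-acl"
  scope = "CLOUDFRONT" # ACLs attached to CloudFront must use this scope

  default_action {
    allow {}
  }

  visibility_config {
    cloudwatch_metrics_enabled = false
    metric_name                = "example-acl"
    sampled_requests_enabled   = false
  }
}

resource "aws_cloudfront_distribution" "this" {
  # ... other required distribution arguments ...

  # For WAFv2 the attribute is still named web_acl_id,
  # but it must be given the ARN, not the ID.
  web_acl_id = aws_wafv2_web_acl.this.arn
}
```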
QUESTION
I am new to the world of terraform. I am trying to use terraformer on a GCP project, but keep getting plugin not found:
...ANSWER
Answered 2021-Feb-12 at 15:23

The daunting instructions worked!
QUESTION
I'm trying to create a Cloud Logging Sink with Terraform, that contains a regex as part of the filter.
...ANSWER
Answered 2020-Dec-08 at 18:10

The problem is not with your query, which is a valid query for searching Google Cloud Logging. I think it is due to the fact that you are deploying everything through Terraform, which transforms your string values and passes them to GCP as JSON. We ran into a similar issue and it caused me some headaches as well. What we came up with was the following:
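The original workaround snippet is not reproduced above; as a hedged illustration of the underlying escaping problem, a regex inside a sink filter has to survive both HCL string escaping and the Logging query syntax. The sink name, destination, and filter below are hypothetical.

```hcl
resource "google_logging_project_sink" "example" {
  name        = "regex-sink"
  destination = "storage.googleapis.com/example-sink-bucket"

  # In HCL, "\\d" yields the single-backslash sequence \d that the
  # Logging regex operator =~ expects; embedded quotes are escaped too.
  filter = "resource.type=\"gce_instance\" AND jsonPayload.message=~\"error\\d+\""
}
```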
QUESTION
Currently trying out a tool called Terraformer (a reverse-Terraform tool) - https://github.com/GoogleCloudPlatform/terraformer.
I have a simple GCP project called test-one which only has one resource, vm_instance (google_compute_instance). I ran Terraformer and managed to get the outputs:
...ANSWER
Answered 2020-Aug-25 at 23:05

You need to create a Terraform module which will deploy whatever environment you want and will take as few parameters as possible, only a name (e.g. "test-two") if possible.

Converting your current state to use a module is not the easiest task, but it is usually possible without destroying any resources when using terraform import.

I would also recommend watching this video
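A hedged sketch of what such a module call could look like (the module path and resource address below are hypothetical):

```hcl
# Wrap the generated resources in a reusable module that only
# needs an environment name as input.
module "test_two" {
  source = "./modules/gcp-environment"
  name   = "test-two"
}
```

Existing resources can then be adopted without recreation via something like `terraform import module.test_two.google_compute_instance.vm_instance <project>/<zone>/<instance-name>`, matching the import advice in the answer.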
QUESTION
I am currently trying to reverse-terraform existing infrastructure (on aws) using Terraformer. I have managed to import some resources via:
...ANSWER
Answered 2020-Aug-03 at 21:19

Have you written Terraform definitions, or are you just using the definitions produced by Terraformer? If you are just using the Terraformer definitions (which I assume is your situation), you can run
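The command itself is cut off above; presumably it showed the standard init/plan workflow run against Terraformer's generated output, roughly as follows (the directory layout shown is Terraformer's default output structure, stated here as an assumption):

```shell
# Terraformer writes definitions under generated/<provider>/<service>/
cd generated/aws/vpc
terraform init
terraform plan
```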
QUESTION
I am trying to import an existing AWS infra configuration using Google's Terraformer, and I am unsuccessful due to an AWS provider authentication problem. My AWS credentials are MFA-enabled and hence I have to use a session token. I failed to find options to make Terraformer use the AWS session token params.

Here are the debug logs for the terraformer program. Could someone help me with this, please? The below is generating empty tf files and states.
...ANSWER
Answered 2020-Jun-02 at 18:54

I managed to resolve the problem by explicitly setting the environment variable AWS_SHARED_CREDENTIALS_FILE=~/.aws/credential

Without the above additional env var my setup failed.
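In other words, something like the following. The credentials path is the one given in the answer (note the AWS default is usually ~/.aws/credentials), and the profile name is hypothetical — the session token would live in a profile inside that file.

```shell
# Point the AWS SDK at the shared credentials file explicitly.
export AWS_SHARED_CREDENTIALS_FILE=~/.aws/credential
export AWS_PROFILE=mfa-session   # hypothetical profile with the session token

terraformer import aws --resources=vpc --regions=us-east-1
```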
QUESTION
As per the documentation, with the gcloud CLI, if you run gcloud services list --available you get a list of the services that are enabled or available to be enabled for a Google Cloud project. What is the equivalent library/call to do this in Node? I've taken a look at the libs listed here and can't seem to find how to do this.

I'm using terraformer, which is running in a Node.js env, to programmatically crawl an account, but it will error out if certain services are not enabled for a project when you try to run it. Basically, before I run terraformer I want to get a list of what services are enabled and only import those services.
ANSWER
Answered 2020-May-15 at 00:59

The Google Cloud documentation is quite good and I would recommend a quick Google search in most cases. You can find several examples of what you are looking for here.

The actual HTTP request looks something like the following (this example does not show how to attach authentication information):
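The request body itself is not shown above. As a sketch, the underlying REST call is the Service Usage API's services.list method, which can be exercised directly; the project ID is a placeholder, and gcloud is used here only to mint an access token.

```shell
# List only the services that are enabled on the project.
curl -H "Authorization: Bearer $(gcloud auth print-access-token)" \
  "https://serviceusage.googleapis.com/v1/projects/PROJECT_ID/services?filter=state:ENABLED"
```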
Community Discussions, Code Snippets contain sources that include Stack Exchange Network
Vulnerabilities
No vulnerabilities reported
Install terraformer