Passing Environment Variables Not Working with Docker

Passing environment variables not working with Docker

OK, I got it. Instead of the following,

docker run -d enviro -e USERNAME='david'

it must be like this

docker run -d -e USERNAME='david' enviro 

No idea why Docker requires the environment variable before the image's name, though.
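The reason is that docker run treats everything after the image name as the command and arguments to run inside the container, so options such as -e have to appear before it. A quick way to see the difference (a minimal sketch using the public alpine image; any small image will do):

docker run --rm alpine env
# "env" runs inside the container and lists only the container's default environment

docker run --rm -e USERNAME='david' alpine env
# here -e is parsed by docker run, so the output now also includes USERNAME=david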

Environment variable to Docker container not working

The -e option in docker run commands is for passing environment variables, as -e MYVAR1 or --env MYVAR2=foo. If you need to pass a file that has the environment variables stored in it, create a file (say "env.list") with contents like this:

Variable1=Value1

Variable2=Value2

Then you can run docker run --env-file env.list <your-image>, which will set all the environment variables listed in the file.

If your original intention was to authenticate to your Google Cloud Project using the JSON, you should do so using gcloud auth:

gcloud auth activate-service-account [ACCOUNT] --key-file=KEY_FILE --project=PROJECT_NAME
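Alternatively, if the goal is simply to make the JSON key available to an application inside the container, a common pattern is to mount the key file and point GOOGLE_APPLICATION_CREDENTIALS at it (a sketch; the paths and image name are illustrative):

docker run -d \
  -v /path/on/host/key.json:/secrets/key.json:ro \
  -e GOOGLE_APPLICATION_CREDENTIALS=/secrets/key.json \
  your-image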

How do I pass environment variables to Docker containers?

You can pass environment variables to your containers with the -e flag.

An example from a startup script:

sudo docker run -d -t -i -e REDIS_NAMESPACE='staging' \
-e POSTGRES_ENV_POSTGRES_PASSWORD='foo' \
-e POSTGRES_ENV_POSTGRES_USER='bar' \
-e POSTGRES_ENV_DB_NAME='mysite_staging' \
-e POSTGRES_PORT_5432_TCP_ADDR='docker-db-1.hidden.us-east-1.rds.amazonaws.com' \
-e SITE_URL='staging.mysite.com' \
-p 80:80 \
--link redis:redis \
--name container_name dockerhub_id/image_name

Or, if you don't want the value on the command line (where it will be visible to ps, etc.), -e can pull the value in from the current environment if you just give it without the =:

sudo PASSWORD='foo' docker run  [...] -e PASSWORD [...]

If you have many environment variables and especially if they're meant to be secret, you can use an env-file:

$ docker run --env-file ./env.list ubuntu bash

The --env-file flag takes a filename as an argument and expects each line to be in the VAR=VAL format, mimicking the argument passed to --env. Comment lines need only be prefixed with #.
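For example, an env.list along these lines (variable names reused from above for illustration):

# lines starting with # are comments and are ignored
REDIS_NAMESPACE=staging
SITE_URL=staging.mysite.com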

Docker environment variable not making it through on run command to shell script

There are three ways to pass environment variables to Docker:



First way

Using the -e flag, e.g. -e ENV_NAME='ENV_VALUE'.

Example with one environment variable

docker run --name some-mysql -e MYSQL_ROOT_PASSWORD='secret' -d mysql:tag

Example with two environment variables

docker run --name some-mysql -e MYSQL_ROOT_PASSWORD='secret' -e MYSQL_DATABASE='mySchema' -d mysql:tag

Example with two environment variables and many options

docker run --name some-mysql -d -t -i -e MYSQL_ROOT_PASSWORD='secret' -e MYSQL_DATABASE='mySchema' mysql:tag

NOTE: You should pass the image name mysql:tag after options such as -e MYSQL_ROOT_PASSWORD='secret' -e MYSQL_DATABASE='mySchema'.



Second way

Using a .env file. Basically, you add the environment variables to a .env file and then pass its path to the docker run command, like docker run --env-file ./.env.

Example with one environment variable

Create .env file

MYSQL_ROOT_PASSWORD=secret

Then use it in docker command

docker run --name some-mysql --env-file ./.env -d mysql:tag

Example with two environment variables

Create .env file

MYSQL_ROOT_PASSWORD=secret
MYSQL_DATABASE=mySchema

Then use it in docker command

docker run --name some-mysql --env-file ./.env -d mysql:tag

Example with two environment variables and many options

Create .env file

MYSQL_ROOT_PASSWORD=secret
MYSQL_DATABASE=mySchema

Then use it in docker command

docker run --name some-mysql -d -t -i --env-file ./.env mysql:tag

NOTE: You shouldn't add single or double quotes around the values in the file; they would become part of the value.
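A quick way to see why (a sketch; the file name quoted.env and the alpine image are just for illustration):

echo "GREETING='hello'" > quoted.env
docker run --rm --env-file ./quoted.env alpine printenv GREETING
# prints 'hello' — the quotes are taken literally and end up inside the value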

NOTE: You should pass the image name mysql:tag after options such as --env-file ./.env.



Third way

Using Linux environment variables. First we need to explain how to set Linux environment variables; there are two types (local and global). You then reference them by name only, for example -e ENV_NAME.

  • Local (per terminal)

To add a local environment variable, just use $ export MY_NAME='ahmed'. Then try to retrieve it with $ printenv MY_NAME; the result will be ahmed.

NOTE: When you use $ export MY_NAME='ahmed', you can use MY_NAME in any command in the current terminal, but if you try to use it in another terminal it will not work.

  • Local (per command)

To set an environment variable for the current command only, just use $ MY_NAME='ahmed' my_command. For example, $ MY_NAME='ahmed' printenv MY_NAME prints ahmed, but if you then try to print MY_NAME again it will not work.

  • Global (for all terminals)

To add environment variables that work in all terminals, just open ~/.bashrc and add your environment variables with export, like

export MY_NAME='ahmed'
export ENV_NAME='ENV_VALUE'

Then open a new terminal (or run source ~/.bashrc) and try to print it using printenv MY_NAME; the result will be ahmed.

Let's follow the examples.

Example with one environment variable

export MYSQL_ROOT_PASSWORD='secret'
docker run --name some-mysql -e MYSQL_ROOT_PASSWORD -d mysql:tag

Example with two environment variables

export MYSQL_ROOT_PASSWORD='secret'
export MYSQL_DATABASE='mySchema'
docker run --name some-mysql -e MYSQL_ROOT_PASSWORD -e MYSQL_DATABASE -d mysql:tag

Example with two environment variables and many options

export MYSQL_ROOT_PASSWORD='secret'
export MYSQL_DATABASE='mySchema'
docker run --name some-mysql -e MYSQL_ROOT_PASSWORD -e MYSQL_DATABASE -d -t -i mysql:tag

NOTE: You should pass the image name mysql:tag after options such as -e MYSQL_ROOT_PASSWORD -e MYSQL_DATABASE.



Demo

Dockerfile

FROM debian

ENTRYPOINT ["printenv", "ENV_NAME"]

Try it out:

$ docker build --tag demo .

$ ENV_NAME='Hello World' docker run -e ENV_NAME demo:latest
Hello World

$ docker run -e ENV_NAME='Hello World' demo:latest
Hello World

Passing environment variable to container is not working

There are a couple of layers of things here:

  1. Your local shell expands the command;
  2. Which launches some Docker container;
  3. Which runs some other process (but not necessarily a shell).

In your first example

docker run ... echo $DB_HOST

your local shell catches the variable reference before it ever gets passed on to Docker.

If you explicitly single-quote it

docker run ... echo '$DB_HOST'

Docker will find /bin/echo (assuming it exists in the container) and launch that with the string $DB_HOST as an argument, but again, since no shell is involved on the Docker side, it dutifully prints out that string as is.

The immediate answer to your question is to force there to be a shell on the Docker side

docker run -e DB_HOST=thehost --rm my_application \
sh -c 'echo $DB_HOST'

At a slightly higher level:

  • If you're running a program in some other language and not just a shell command, they'll see the environment normally (Python's os.environ, Ruby's ENV, Node's process.env, etc.)
  • If you have anything even a little complex, writing it into a shell script, COPYing that into an image, and running that is probably more maintainable, and it implicitly involves a shell (the first line will say #!/bin/sh); see the sketch after this list
  • In your Dockerfile, if you say CMD some command, Docker will automatically wrap that in a shell; that is equivalent to CMD ["sh", "-c", "some command"]
  • The same is true for ENTRYPOINT, but it is probably a bug to use it that way
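A minimal sketch of that shell-script approach (the file and image names here are illustrative):

entrypoint.sh:

#!/bin/sh
# the shell expands $DB_HOST at run time, inside the container
echo "DB host is $DB_HOST"

Dockerfile:

FROM alpine
COPY entrypoint.sh /entrypoint.sh
RUN chmod +x /entrypoint.sh
CMD ["/entrypoint.sh"]

After docker build -t envdemo ., running docker run --rm -e DB_HOST=thehost envdemo prints "DB host is thehost".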

Pass environment variable from host to DockerOperator in Airflow

The first step is checking that the env variable exists on your host:

echo $GOOGLE_ADS_TOKEN

The second step is adding the env variable to the scheduler and all the worker containers; to do that, you need to update the docker-compose file:

version: '3'
x-airflow-common:
  &airflow-common
  # In order to add custom dependencies or upgrade provider packages you can use your extended image.
  # Comment the image line, place your Dockerfile in the directory where you placed the docker-compose.yaml
  # and uncomment the "build" line below, Then run `docker-compose build` to build the images.
  image: ${AIRFLOW_IMAGE_NAME:-apache/airflow:2.3.3}
  # build: .
  environment:
    &airflow-common-env
    AIRFLOW__CORE__EXECUTOR: CeleryExecutor
    ...
    _PIP_ADDITIONAL_REQUIREMENTS: ${_PIP_ADDITIONAL_REQUIREMENTS:-}
    GOOGLE_ADS_TOKEN: ${GOOGLE_ADS_TOKEN}

The last step is checking that the env variable exists in the worker container:

docker-compose exec airflow-worker bash -c 'echo "$GOOGLE_ADS_TOKEN"'

Docker-compose not passing environment variables to docker container

Try to follow the docs:

Compose supports declaring default environment variables in an
environment file named .env placed in the folder where the
docker-compose command is executed (current working directory)

Try to use ENTRYPOINT python /usr/src/app/myblumbot/main.py instead of RUN...
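The distinction matters because RUN executes at build time, before Compose applies the run-time environment, whereas ENTRYPOINT (or CMD) runs when the container starts and therefore sees the variables. A minimal sketch of such a Dockerfile (the base image is chosen for illustration):

FROM python:3
COPY . /usr/src/app
ENTRYPOINT python /usr/src/app/myblumbot/main.py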

Docker compose won't use .env variables [NodeJS, Docker]

Your environment variable file should be called .env, and you reference the variables with the syntax ${password}.

The way you've done it - with the env_file statement - is used when you want the variables set inside the container. You can't use them in the docker-compose file itself when you do it that way.

You can remove the env_file statements from the docker-compose file unless you also need the variables inside the containers. But it looks like you already pass them all in using the docker-compose file.
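A minimal sketch of that substitution setup (the service and variable names are illustrative):

# .env — in the folder where you run docker-compose
password=supersecret

# docker-compose.yml
version: '3'
services:
  app:
    image: my-node-app
    environment:
      - PASSWORD=${password}   # substituted by Compose from .env before the container starts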


