I have added code to a repo to build a Cloud Run service. The structure is like this:
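A sketch of the layout (reconstructed from the paths used below; the exact position of `api.py` is an assumption):

```
top/
├── b.py
└── a/
    └── cr/
        ├── Dockerfile
        └── api.py
```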
I want to run `b.py` in `cr`. Is there any way I can deploy `cr` without just copying `b.py` into the `cr` directory? (I don't want to do that, since there are lots of other folders and files that `b.py` depends on.) The problem is that the Dockerfile is unable to see folders above its own directory.

Also, how would e.g. `api.py` import from `b.py`?

TIA, you lovely people.
You have to build your container with the correct parameters, and therefore not use `gcloud run deploy --source=.`, which builds your container with default parameters.
With Docker, the `Dockerfile` is expected at `PATH/Dockerfile` by default. But you can override that default behavior with the `-f` parameter to indicate the `Dockerfile` location.

For example, you can do this:

```
cd top
docker build -f ./a/cr/Dockerfile .
```

Like that, you provide the docker build runtime with the build context (here `top`, the current path, represented by the dot `.` at the end), and you also specify the full path of the `Dockerfile` inside this context.
Consequently, you have to update your `Dockerfile`, because the `COPY . .` will no longer copy just the `cr` path, but the whole `top` directory.
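As a minimal sketch of what that updated `Dockerfile` could look like (assuming a Python base image, that `b.py` sits at the root of `top`, and that the service entrypoint is `a/cr/api.py`; adapt to your real layout):

```dockerfile
FROM python:3.11-slim
WORKDIR /app

# The build context is now `top`, so this copies the whole tree,
# including b.py and the a/cr directory.
COPY . .

# Make modules at the root of the tree (like b.py) importable
# from anywhere in the image.
ENV PYTHONPATH=/app

CMD ["python", "a/cr/api.py"]
```

With `PYTHONPATH` pointing at the root of the copied tree, `api.py` can then do a plain `import b`, which also answers your import question.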
EDIT 1
To validate my answer, I did exactly what you ask in your comment. I used `gcloud builds submit`, but:
- I ran the command from the `top` directory
- I created a `cloudbuild.yaml` file
```yaml
steps:
- name: 'gcr.io/cloud-builders/docker'
  entrypoint: 'bash'
  args:
  - -c
  - |
    docker build -f ./a/cr/Dockerfile -t <YOUR TAG> .
    docker push <YOUR TAG>
```
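You can then run the build from the `top` directory with:

```
gcloud builds submit --config cloudbuild.yaml .
```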
You can't perform a `gcloud builds submit --tag <YOUR TAG>` from the `top` directory if you don't have a `Dockerfile` in the root dir.
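Once the image is built and pushed, you can deploy it by image instead of by source, for example (the service name and region are placeholders):

```
gcloud run deploy <YOUR SERVICE> --image <YOUR TAG> --region <YOUR REGION>
```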