# prefect-community
**Scott Zelenka**
Has anyone had success using a large Docker base image? I'm trying to convert a Selenium script into a Prefect flow. Selenium requires a full install of Google Chrome, which makes the base image (`selenium/standalone-chrome`) around 700 MB with a ton of layers. But when I try to build through the CLI, it gives me trouble — it essentially spun the CPU fan for an hour, then eventually gave up with:
```
Traceback (most recent call last):
  File "example-selenium.py", line 473, in <module>
    parameters=dict(
  File "/opt/anaconda3/envs/fastapi-async-sqlalchemy/lib/python3.7/site-packages/prefect/core/flow.py", line 1419, in register
    no_url=no_url,
  File "/opt/anaconda3/envs/fastapi-async-sqlalchemy/lib/python3.7/site-packages/prefect/client/client.py", line 623, in register
    serialized_flow = flow.serialize(build=build)  # type: Any
  File "/opt/anaconda3/envs/fastapi-async-sqlalchemy/lib/python3.7/site-packages/prefect/core/flow.py", line 1228, in serialize
    storage = self.storage.build()  # type: Optional[Storage]
  File "/opt/anaconda3/envs/fastapi-async-sqlalchemy/lib/python3.7/site-packages/prefect/environments/storage/docker.py", line 282, in build
    self._build_image(push=push)
  File "/opt/anaconda3/envs/fastapi-async-sqlalchemy/lib/python3.7/site-packages/prefect/environments/storage/docker.py", line 312, in _build_image
    self.pull_image()
  File "/opt/anaconda3/envs/fastapi-async-sqlalchemy/lib/python3.7/site-packages/prefect/environments/storage/docker.py", line 520, in pull_image
    raise InterruptedError(line.get("error"))
InterruptedError: write /var/lib/docker/tmp/GetImageBlob079036145: no space left on device
```
**Chris White**
Hi Scott, usually when I see that error I have to clear my Docker data; for me (on a Mac) I run:
```
rm -rf ~/Library/Containers/com.docker.docker/Data/com.docker.driver.amd64-linux
```
or
```
rm -rf ~/Library/Containers/com.docker.docker/Data/*
```
**j**
@Scott Zelenka I think what @Chris White mentioned should work. Alternatively, I think the official way to do it is:
```
docker system prune --all
```
(That will remove all Docker data — images, containers, etc.) We use that when we get “no space left on device.”
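Before nuking everything with a prune, it can be worth checking how much space Docker actually thinks is reclaimable via `docker system df`. As a small sketch, here is how you might parse that command's output and sum the RECLAIMABLE column — the sample text below is hard-coded with invented numbers for illustration, so it runs without a Docker daemon:

```python
import re

# Sample `docker system df` output (values invented for illustration).
SAMPLE = """\
TYPE            TOTAL     ACTIVE    SIZE      RECLAIMABLE
Images          12        2         8.214GB   7.112GB (86%)
Containers      3         1         112MB     98MB (87%)
Local Volumes   5         2         1.4GB     900MB (64%)
Build Cache     40        0         2.1GB     2.1GB (100%)
"""

# Docker's CLI reports decimal (SI) units.
UNITS = {"B": 1, "kB": 1e3, "MB": 1e6, "GB": 1e9}

def reclaimable_bytes(df_output: str) -> float:
    """Sum the RECLAIMABLE column of `docker system df` output."""
    total = 0.0
    for line in df_output.splitlines()[1:]:  # skip the header row
        # The reclaimable figure is the size token just before the
        # trailing "(NN%)", e.g. "7.112GB (86%)".
        m = re.search(r"([\d.]+)\s*(B|kB|MB|GB)\s+\(\d+%\)\s*$", line)
        if m:
            total += float(m.group(1)) * UNITS[m.group(2)]
    return total

print(f"~{reclaimable_bytes(SAMPLE) / 1e9:.2f} GB reclaimable")
```

In practice you would feed it the real output, e.g. from `subprocess.run(["docker", "system", "df"], capture_output=True, text=True).stdout`, instead of the sample string.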
**Scott Zelenka**
Tried the prune-all approach, but it still churns for a really long time... I'm going to clean up the Dockerfile and try with a smaller image (hopefully). When running on the CLI, it seems to download the image again even though that image is already on my local system. Is there some flag to have it check the local repo before pulling?
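On the re-pull question: some Prefect 0.x releases expose a `local_image` flag on the `Docker` storage class that makes the build use a locally cached base image instead of pulling it again. Treat that flag as an assumption to verify against your installed version (e.g. `help(Docker)`), since it was added partway through the 0.x series. A sketch:

```python
# Sketch: Prefect 0.x Docker storage on a prebuilt Selenium base image.
# `local_image=True` is an assumption -- confirm it exists in your
# installed Prefect version before relying on it.
try:
    from prefect import Flow
    from prefect.environments.storage import Docker
except ImportError:  # prefect 0.x not installed; this remains a sketch
    Flow = Docker = None

if Docker is not None:
    storage = Docker(
        base_image="selenium/standalone-chrome",
        python_dependencies=["selenium"],
        local_image=True,  # skip pulling; use the image already on disk
    )
    with Flow("selenium-flow", storage=storage) as flow:
        pass  # tasks from the original Selenium script would go here
```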
**Chris White**
FWIW, I believe @Zachary Hughes was running into this the other day and had to take the more aggressive `rm -rf` approach.
**Zachary Hughes**
Can confirm that `docker system prune -a` did not do the trick for me. If you go the `rm -rf` route, friendly heads-up that you'll likely need a soft restart of your computer before Docker starts playing nicely for you.
**Scott Zelenka**
`rm -rf` and a restart of Docker allowed it to proceed.