# ask-community
n
Hi I am using DockerRun for my Flow's environment and I am running into some issues. The Docker image pulls and runs fine, but it uses pytorch and I am starting to run into some shared memory issues for my dataloader. Normally using Docker, people have solved this by passing arguments to increase the shared memory size to the docker run command. Is there any way to change the shared memory of the DockerRun on Prefect?
It seems like this should maybe be an option for the run config but isn't.
k
Hey @Nelson Griffiths, was looking at some stuff. So the Docker Agent uses create_container from the Docker SDK. I’m not seeing an option here immediately. How would you do it with the docker run command?
n
docker run -it --shm-size=256m
This would change it from the default of 64MB to 256MB.
It's strange the SDK wouldn't support it. Here it is in the Docker documentation: https://docs.docker.com/engine/reference/run/#runtime-constraints-on-resources
k
Yeah. Here is the right link. Not seeing anything. I also saw this with attaching GPUs, where it’s not available in the SDK.
I know it’s likely not what you are looking for, but do you think container limits on the build side can help you?
n
I have already tried that. It didn't do anything
It seems like the SDK supports it here: https://docker-py.readthedocs.io/en/stable/containers.html
k
Oh, it’s under create_host_config. Then that needs to go into create_container. I’ll look into this more.
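For reference, a rough sketch of how those two calls could fit together. This is not Prefect's or docker-py's actual code: the create_host_config/create_container names come from the docker-py docs linked above, but the ShmSize key and the size parsing below are my assumptions, modeled in plain Python so the snippet runs without a Docker daemon.

```python
# Hedged sketch (not Prefect's actual implementation): in docker-py,
# create_host_config accepts shm_size (an int in bytes or a string like
# "256m"), and the resulting host config dict is then passed to
# create_container. Only the size handling is modeled here so the
# snippet is runnable without a Docker daemon.

def parse_shm_size(value):
    """Convert a "256m"-style size string (or an int of bytes) to bytes."""
    if isinstance(value, int):
        return value
    units = {"b": 1, "k": 1024, "m": 1024 ** 2, "g": 1024 ** 3}
    value = value.strip().lower()
    if value and value[-1] in units:
        return int(float(value[:-1]) * units[value[-1]])
    return int(value)

# The shape of host config that would carry the setting into the container
# (the "ShmSize" key is an assumption about the Docker API's naming):
host_config = {"ShmSize": parse_shm_size("256m")}
print(host_config)  # {'ShmSize': 268435456}
```

So the missing piece on the Prefect side would be a way to thread an shm_size value from the run config down into that host config.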
shmsize is also exposed on the build side. Did you see that?
n
It is exposed on the build side only to provide shared memory to the intermediate build layers; it returns to the default once the container is running.
k
Gotcha. Thanks for clarifying. Ok will see how we can get this in
n
That would be awesome. Thank you!
I'm happy to work on any PRs that would help here too.
k
We have the host_config exposed so I think it’s doable. Just need to figure out the syntax.
n
Oh great. Thank you!
k
Actually, it doesn’t look like shm is configurable. I think we may need to open an issue.
n
Okay. Like I said, happy to help if needed
k
I can write it and then tag you in a bit.
n
Awesome. Thank you so much!
k
n
Great. I will leave a comment on it. Thank you!