Scaling up to 50 Docker containers

Anudha Mittal
3 min read · Oct 24, 2024


docker ps shows more than 50 containers running concurrently. They were launched as background processes with:

docker run -d --name $CONTAINER_NAME -p $((8080 + i)):80 -e PYTHON_PORT=$PYTHON_PORT my-nginx-python &

The '&' at the end of the command runs it as a background process. The status dashboard shows that the 50 containers were created 11 minutes ago, but the 'Up' time is only 6 minutes for a few of them. That is a gap of about 5 minutes between when the command was given and when the container actually started, which points to a system performance limit.

Approximately 40 containers launched within a minute of the command being given. Since it is not clear right now what stalled, the '&' was removed so the containers launch one at a time, roughly as in the sketch below.
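A minimal sketch of that serial launch loop, reusing the image and port scheme from the command above; the web$i naming and the count of 50 are assumptions:

# Serial launch: issue docker run for each container one at a time, no '&'
for i in $(seq 1 50); do
  CONTAINER_NAME="web$i"
  docker run -d --name "$CONTAINER_NAME" \
    -p $((8080 + i)):80 \
    -e PYTHON_PORT=$PYTHON_PORT \
    my-nginx-python
done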

Find a method to diagnose this.
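One way to diagnose the gap is to compare each container's creation time with its start time, and to watch the daemon's event stream while the launch runs. Both commands below use the standard Docker CLI; the fields shown are just the ones that seem relevant here:

# Gap between container creation and actual start, per container
docker ps -q | xargs docker inspect --format '{{.Name}} created={{.Created}} started={{.State.StartedAt}}'

# Watch create/start events live while the launch loop runs
docker events --filter type=container --filter event=create --filter event=start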

Check performance with GNU parallel for launching more than 50 containers, or fall back to launching in series, because at least the current state stays clear with the naive method of monitoring the launch: watching the terminal print which container was launched.
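If GNU parallel is installed, a launch along these lines would cap how many docker run commands are issued at once; a sketch, assuming PYTHON_PORT is exported and reusing the naming scheme above:

# Launch web1..web50 with at most 8 docker run invocations running at a time
seq 1 50 | parallel -j 8 'docker run -d --name web{} -p $((8080 + {})):80 -e PYTHON_PORT=$PYTHON_PORT my-nginx-python'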

docker: Error response from daemon: failed to create task for container: DeadlineExceeded: failed to start shim: start failed: failed to create TTRPC connection: dial unix /run/containerd/s/1aee2277fa5f3694422b3b55afe430a64517f1bdea84297a2ff24e38dbd547e6: i/o timeout: context deadline exceeded.
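That error comes from containerd failing to start the container's shim within the deadline. Assuming Docker and containerd run under systemd, their logs around the timeout can be pulled with journalctl:

# Daemon-side logs around the timeout
journalctl -u docker --since "15 minutes ago"
journalctl -u containerd --since "15 minutes ago"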

free -h while running 50 containers:

Stopped the 50 containers with this command:

docker ps -q | head -n 50 | xargs docker stop

Subsequently, docker ps shows no active containers. Nothing else was changed since the last screenshot of memory use, so the differences can be attributed to stopping the containers:

  • Used memory dropped from 850M to 466M, a 384M decrease, while available memory increased from 101M to 486M, a 385M increase. Are these two columns separate measurements, or are they calculated from the same measurement? (See the check below.)
  • Why did cached memory increase?

free -h after stopping the containers:
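On the question raised above: in recent procps versions, free reads everything from a single /proc/meminfo snapshot, but the 'available' column is the kernel's MemAvailable estimate rather than an arithmetic combination of the other columns, which is why the two deltas need not match exactly. The raw fields can be inspected directly:

# Raw kernel fields behind free -h; MemAvailable is the kernel's own estimate
grep -E 'MemTotal|MemFree|MemAvailable|Buffers|Cached' /proc/meminfo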

Adding an AI/ML backend to web applications served via Docker containers

It would be best to install all necessary Python libraries using a requirements.txt file, specifying the version of each library, when building the image. That means using a RUN instruction (not CMD) to install the Python packages with pip or another package manager, so they are baked into the image at build time rather than installed at container start.

requirements.txt

requests==2.25.1
numpy==1.21.0
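A simplified Dockerfile sketch of this approach; it shows a Python-only image rather than the nginx-plus-Python image used here, and the base image and entrypoint are assumptions:

# A simplified sketch: assumed base image and entrypoint
FROM python:3.9-slim
WORKDIR /app
# Copy the pinned requirements first so the install layer can be cached
COPY requirements.txt .
# Build-time install with RUN (not CMD), so packages are baked into the image
RUN pip install --no-cache-dir -r requirements.txt
COPY . .
# Assumed entrypoint script
CMD ["python", "app.py"]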

Test installing the packages via the Dockerfile with a faster Python package manager.
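One candidate (my pick, not named above) is uv, which offers a pip-compatible interface; the swap inside the Dockerfile might look like this:

# Install uv, then use its pip-compatible interface for the pinned requirements
RUN pip install --no-cache-dir uv
# --system installs into the image's Python instead of requiring a virtualenv
RUN uv pip install --system -r requirements.txt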

In terms of reproducibly building an AI/ML-based web application, there is now: 1) a Dockerfile, 2) a web server config file, 3) a text file listing the Python packages, and 4) custom code, plus optional content such as site images, a set that may not be large enough to warrant access via database queries.
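Gathered into one build context, those pieces might be laid out like this (the file names are illustrative), with a single docker build producing the image:

# Assumed build-context layout
#   Dockerfile          build steps for the image
#   nginx.conf          web server config
#   requirements.txt    pinned Python packages
#   app/                custom code
#   static/             optional images and other small site content
docker build -t my-nginx-python .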
