Failed to remove image, image is in use by container #7889

Closed · kafji opened this issue Oct 2, 2020 · 24 comments
Labels: kind/bug · locked - please file new issue/PR

Comments

@kafji commented Oct 2, 2020

/kind bug

Description

Failed to remove image with the error "image is in use by a container", but I have 0 containers running.

$ podman container list --all
CONTAINER ID  IMAGE   COMMAND  CREATED  STATUS  PORTS   NAMES
$ podman images
REPOSITORY                  TAG     IMAGE ID      CREATED        SIZE
<none>                      <none>  92038d6ed63a  9 minutes ago  1.25 GB
<none>                      <none>  2753208588ef  7 hours ago    1.25 GB
docker.io/library/postgres  latest  817f2d3d51ec  7 days ago     322 MB
<none>                      <none>  522d62996757  2 weeks ago    1.19 GB
docker.io/library/rust      latest  4050c19325e5  3 weeks ago    1.19 GB
docker.io/library/redis     latest  84c5f6e03bf0  3 weeks ago    108 MB
$ podman image rm 92038d6ed63a
Error: 1 error occurred:
	* image is in use by a container
$ podman image rm 2753208588ef
Error: 1 error occurred:
	* image is in use by a container
$ podman image rm 522d62996757
Error: 1 error occurred:
	* image is in use by a container

Steps to reproduce the issue:

Sorry, I don't have reproduction steps.

Describe the results you received:

Error: 1 error occurred:
	* image is in use by a container

Describe the results you expected:

Image successfully removed.

Additional information you deem important (e.g. issue happens only occasionally):

I don't know what caused this to happen. My guess is that I canceled an image build (Ctrl+C) and then ran podman system prune.

Output of podman version:

Version:      2.1.1
API Version:  2.0.0
Go Version:   go1.15.2
Built:        Thu Jan  1 07:00:00 1970
OS/Arch:      linux/amd64

Output of podman info --debug:

host:
  arch: amd64
  buildahVersion: 1.16.1
  cgroupManager: cgroupfs
  cgroupVersion: v1
  conmon:
    package: 'conmon: /usr/libexec/podman/conmon'
    path: /usr/libexec/podman/conmon
    version: 'conmon version 2.0.20, commit: '
  cpus: 8
  distribution:
    distribution: ubuntu
    version: "20.04"
  eventLogger: journald
  hostname: █████
  idMappings:
    gidmap:
    - container_id: 0
      host_id: 1000
      size: 1
    - container_id: 1
      host_id: 100000
      size: 65536
    uidmap:
    - container_id: 0
      host_id: 1000
      size: 1
    - container_id: 1
      host_id: 100000
      size: 65536
  kernel: 5.4.0-48-generic
  linkmode: dynamic
  memFree: 258670592
  memTotal: 16413179904
  ociRuntime:
    name: runc
    package: 'cri-o-runc: /usr/lib/cri-o-runc/sbin/runc'
    path: /usr/lib/cri-o-runc/sbin/runc
    version: 'runc version spec: 1.0.2-dev'
  os: linux
  remoteSocket:
    path: /run/user/1000/podman/podman.sock
  rootless: true
  slirp4netns:
    executable: /usr/bin/slirp4netns
    package: 'slirp4netns: /usr/bin/slirp4netns'
    version: |-
      slirp4netns version 1.1.4
      commit: unknown
      libslirp: 4.3.1-git
      SLIRP_CONFIG_VERSION_MAX: 3
  swapFree: 4291031040
  swapTotal: 4294963200
  uptime: 4h 7m 7.97s (Approximately 0.17 days)
registries:
  search:
  - docker.io
  - quay.io
store:
  configFile: /home/█████/.config/containers/storage.conf
  containerStore:
    number: 0
    paused: 0
    running: 0
    stopped: 0
  graphDriverName: vfs
  graphOptions: {}
  graphRoot: /home/█████/.local/share/containers/storage
  graphStatus: {}
  imageStore:
    number: 11
  runRoot: /run/user/1000/containers
  volumePath: /home/█████/.local/share/containers/storage/volumes
version:
  APIVersion: 2.0.0
  Built: 0
  BuiltTime: Thu Jan  1 07:00:00 1970
  GitCommit: ""
  GoVersion: go1.15.2
  OsArch: linux/amd64
  Version: 2.1.1

Package info (e.g. output of rpm -q podman or apt list podman):

podman/unknown,now 2.1.1~1 amd64 [installed]
podman/unknown 2.1.1~1 arm64
podman/unknown 2.1.1~1 armhf
podman/unknown 2.1.1~1 s390x

Have you tested with the latest version of Podman and have you checked the Podman Troubleshooting Guide?

Yes

Additional environment details (AWS, VirtualBox, physical, etc.):

Physical machine running Ubuntu 20.04. Podman binary from openSUSE Kubic.

openshift-ci-robot added the kind/bug label Oct 2, 2020
@kafji (Author) commented Oct 2, 2020

Is there a way to manually remove these dangling images?

@vrothberg (Member) commented:
Thanks for reaching out, @kafji!

Can you run podman ps --all --storage? Maybe there's a container created by Buildah?

@kafji (Author) commented Oct 2, 2020

@vrothberg Yep, there are a lot of them.

$ podman ps --all --storage
CONTAINER ID  IMAGE                          COMMAND  CREATED            STATUS   PORTS   NAMES
f85528d77bb1                                 buildah  About an hour ago  storage          92038d6ed63ab59384f1a19adb2dadf05eff4fc6a24cbab6d97c26d1c232be45-working-container-2
b5ae1425321b                                 buildah  About an hour ago  storage          d28fd9a52b4f60838fd4b6dace5db6ac81fbbcff423ba8ce03798348a9c08349-working-container-5
5438f1fed746                                 buildah  About an hour ago  storage          rust-fat-working-container-6
c27d6689708d                                 buildah  About an hour ago  storage          rust-fat-working-container-5
21e70ed6eac3                                 buildah  About an hour ago  storage          d28fd9a52b4f60838fd4b6dace5db6ac81fbbcff423ba8ce03798348a9c08349-working-container-4
70e657a1d319                                 buildah  About an hour ago  storage          rust-fat-working-container-4
bead1a84c44c                                 buildah  About an hour ago  storage          d28fd9a52b4f60838fd4b6dace5db6ac81fbbcff423ba8ce03798348a9c08349-working-container-3
04f8d6d4ef40                                 buildah  About an hour ago  storage          rust-fat-working-container-3
173ba634764d                                 buildah  About an hour ago  storage          d28fd9a52b4f60838fd4b6dace5db6ac81fbbcff423ba8ce03798348a9c08349-working-container-2
f6ebd6dde843                                 buildah  About an hour ago  storage          rust-fat-working-container-2
c4c954b7e481                                 buildah  2 hours ago        storage          92038d6ed63ab59384f1a19adb2dadf05eff4fc6a24cbab6d97c26d1c232be45-working-container-1
508866fa4cce                                 buildah  2 hours ago        storage          d28fd9a52b4f60838fd4b6dace5db6ac81fbbcff423ba8ce03798348a9c08349-working-container-1
9420afda8fe8                                 buildah  2 hours ago        storage          rust-fat-working-container-1
db9dfb2f5a2e                                 buildah  2 hours ago        storage          92038d6ed63ab59384f1a19adb2dadf05eff4fc6a24cbab6d97c26d1c232be45-working-container
ea91769cf3a9                                 buildah  2 hours ago        storage          d28fd9a52b4f60838fd4b6dace5db6ac81fbbcff423ba8ce03798348a9c08349-working-container
e132365a453f                                 buildah  9 hours ago        storage          2753208588efedaf50e3f90f20d8e7aeb313cb674d17db98cd31688e9b410b17-working-container
a66dd16c18b0                                 buildah  9 hours ago        storage          rust-fat-working-container
3fbe741387b3                                 buildah  26 hours ago       storage          522d62996757b4712bb0f93b68ce3611261125392030e30a5c212af49166be22-working-container-3
0fbad5dd91a3  docker.io/library/rust:latest  buildah  26 hours ago       storage          rust-working-container-3
2059f238f9bb                                 buildah  26 hours ago       storage          522d62996757b4712bb0f93b68ce3611261125392030e30a5c212af49166be22-working-container-2
3ed95c33000f  docker.io/library/rust:latest  buildah  26 hours ago       storage          rust-working-container-2
70402559bbc2                                 buildah  27 hours ago       storage          522d62996757b4712bb0f93b68ce3611261125392030e30a5c212af49166be22-working-container-1
193b9014de53  docker.io/library/rust:latest  buildah  27 hours ago       storage          rust-working-container-1
59769476b80d                                 buildah  2 weeks ago        storage          522d62996757b4712bb0f93b68ce3611261125392030e30a5c212af49166be22-working-container
35388b962d67  docker.io/library/rust:latest  buildah  2 weeks ago        storage          rust-working-container

podman rm doesn't work on those. Should I use buildah to remove those?

@vrothberg (Member) commented:
Thanks for coming back so quickly!

Should I use buildah to remove those?

Yes, you can do that. Running podman rmi --force will do that as well.

I am going to close the issue but feel free to continue the conversation.
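
(For reference, a minimal cleanup sketch based on the advice above, assuming buildah is installed; the image ID is the one from the original report:)

$ buildah rm --all                  # remove all Buildah working containers
$ podman rmi --force 92038d6ed63a   # or force-remove the stuck image directly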

@kafji (Author) commented Oct 2, 2020

Appreciate the help. Thanks @vrothberg.

@FilBot3 commented Dec 14, 2020

I also used buildah rm --all to clean up all the leftover storage listed in podman ps --all --storage.

@millerthegorilla commented:
I have the same issue, but don't have buildah installed. Is there no podman build command to remove intermediate containers left over from a failed build (assuming --force-rm=false)? Perhaps there should be?

@vrothberg (Member) commented:
@rhatdan WDYT?

@rhatdan (Member) commented Jan 6, 2021

podman rm will remove these containers, but we don't have a flag for it yet.

podman rm --external
or
podman container prune --external

would be the suggested commands to do this. I prefer the second.

@mheon (Member) commented Jan 6, 2021 via email

@vrothberg (Member) commented:
I have the same issue, but don't have buildah installed. Is there no podman build command to remove intermediate containers left over from a failed build (assuming --force-rm=false)? Perhaps there should be?

I wonder why Buildah isn't cleaning up intermediate containers in a failed build. @nalind @TomSweeneyRedHat do you know?

@nalind (Member) commented Jan 6, 2021

I wonder why Buildah isn't cleaning up intermediate containers in a failed build. @nalind @TomSweeneyRedHat do you know?

The default for --force-rm is false, which I'm guessing is intended to make it easier to iteratively troubleshoot a build when it hits an error partway through.

@rhatdan (Member) commented Jan 6, 2021

I thought we changed this to default to true for podman:

$ podman build --help | grep force-rm
  podman build --layers --force-rm --tag imageName .
      --force-rm                                     Always remove intermediate containers after a build, even if the build is unsuccessful. (default true)

So this should only happen if the user said to leave them around.
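
(If leftovers still appear, passing the flag from the help output above explicitly should clean up intermediate containers even when a build fails; "myimage" is a placeholder tag:)

$ podman build --layers --force-rm=true --tag myimage .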

@millerthegorilla commented:
man podman build states that the default for --force-rm is false.
Since the command to list the dangling containers is podman ps --all --storage, perhaps the command to remove leftovers after a failed build could be podman build --rm-storage?

@bridgesense commented:
Sorry to be a necro, but... podman rmi -f says it has deleted the image, but all it has managed to do is suppress the false "container in use" message. Listing containers still shows the dangling container. I ran podman ps --all --storage and saw the dangling processes that were created a few days ago. I have to run buildah rm --all before podman rmi -f succeeds.

So my question is: do we have a comparable podman command that does the same thing as buildah rm --all?

@millerthegorilla commented Apr 1, 2021

@bridgesense I have used podman image prune --all with success. It left a dangling container that reported it was being used, which I then removed with podman rm --force container_id.

@rhatdan (Member) commented Apr 2, 2021

The rm --all and prune --all commands will ONLY remove Podman containers, not Buildah containers. You can remove a Buildah container if you specify the container ID directly.
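
(A sketch of automating that with the commands already mentioned in this thread; whether --quiet can be combined with --storage is an assumption here:)

$ podman ps --all --storage --quiet | xargs -r podman rm --force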

@mwoehlke-kitware commented:
This bug seems to still exist in 3.3.0?

Interrupting a podman build results in leftovers that can only be removed individually via podman container rm --force. These should be removed by podman container prune, or possibly shouldn't exist in the first place.
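
(A minimal way to reproduce what is described above; "myimage" is a placeholder tag:)

$ podman build --tag myimage .                 # interrupt with Ctrl+C partway through
$ podman ps --all --storage                    # Buildah working containers are left behind
$ podman container prune                       # does not remove them
$ podman container rm --force <container-id>   # removes them one at a time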

@rhatdan (Member) commented Sep 7, 2021

Please open a new issue.

chmeliik added a commit to containerbuildsystem/ansible-role-osbs-remote-hosts that referenced this issue Sep 7, 2022
CLOUDBLD-10965

In exceptional cases [1], podman builds may leave behind intermediate
buildah containers. These containers are not managed by podman, so our
current pruner job ignores them. This can sometimes prevent the pruner
from removing images as well - it only removes images that do not have
associated containers.

Add a separate job for pruning buildah containers. It is implemented as a
custom script, because the `podman container prune` command does not
support removing buildah containers.

[1]: containers/podman#7889

Signed-off-by: Adam Cmiel <acmiel@redhat.com>
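
(The referenced script itself is not included here; a minimal sketch of such a pruner, assuming buildah is installed on the build hosts, might look like this:)

#!/bin/sh
# Remove leftover Buildah working containers, then images no longer held by any container.
set -eu
buildah rm --all || true
podman image prune --force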
chmeliik added a commit to containerbuildsystem/ansible-role-osbs-remote-hosts that referenced this issue Sep 8, 2022 (same commit message as above)
@smokeyhallow commented:
I am still getting this? Any pointers?

@rhatdan (Member) commented Jan 29, 2023

Does podman rmi --force work?

@rhatdan (Member) commented Jan 29, 2023

You could check to see if there are any buildah containers around by installing buildah.

@vrothberg (Member) commented:
I am still getting this? Any pointers?

Please open a new issue with a reproducer.

@successtheman commented Mar 17, 2023

I also used buildah rm --all to clean up all the leftover storage listed in podman ps --all --storage.

This worked perfectly

pyamsoft added a commit to pyamsoft/mousewatch that referenced this issue Jun 27, 2023
pyamsoft added a commit to pyamsoft/stonk that referenced this issue Jun 27, 2023
pyamsoft added a commit to pyamsoft/ride-roulette that referenced this issue Jun 27, 2023
github-actions bot locked this issue as resolved and limited conversation to collaborators on Aug 29, 2023