Allocated disk space is not used by container. #2449

Closed
JonathanMontane opened this issue Jan 17, 2018 · 13 comments

JonathanMontane commented Jan 17, 2018

Expected behavior

If the allocated disk space is 128GB, the container should be able to use 128GB (or close to it).

Actual behavior

Allocated disk space does not have any impact on container disk space, which is 18GB for some reason on my machine.

Information

  • Issue introduced in 17.12
  • Docker is using the Docker.raw format
  • macOS uses APFS

diagnostic:

Docker for Mac: version: 17.12.0-ce-mac47 (72b93a017350990850ddc37cd341bd16fce3e911)
macOS: version 10.13.2 (build: 17C205)
logs: /tmp/6988D841-12AF-4AFA-8CAF-A19BA3B3B432/20180117-135718.tar.gz
[OK]     db.git
[OK]     vmnetd
[OK]     dns
[OK]     driver.amd64-linux
[OK]     virtualization VT-X
[OK]     app
[OK]     moby
[OK]     system
[OK]     moby-syslog
[OK]     kubernetes
[OK]     env
[OK]     virtualization kern.hv_support
[OK]     slirp
[OK]     osxfs
[OK]     moby-console
[OK]     logs
[OK]     docker-cli
[OK]     menubar
[OK]     disk

commands run:

# on host: Avail: 225GB
df -h | head -n2 | awk '{ print $4}'

# on host: 128GB
ls -lsh ~/Library/Containers/com.docker.docker/Data/com.docker.driver.amd64-linux/Docker.raw | awk '{ print $6 }'

# inside container: Size: 18GB
df -h | head -n 2 | awk '{ print $2 }'

This size does not increase with disk usage, which results in errors like this:

ERROR:  could not extend file "base/16385/59070": No space left on device
HINT:  Check free disk space.

and a final df -h that looks like this:

df -h
Filesystem      Size  Used Avail Use% Mounted on
none             18G   17G     0 100% /

Steps to reproduce the behavior

  1. Increase allocated space to 128GB
  2. attach to a container
  3. check disk space inside container

Note: Since I haven't seen any other reports of this crop up, I'm guessing that there is a weirder/sneakier underlying issue, and I am at your full disposal to help resolve it.
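
For reference, steps 2-3 can be checked in one go with a throwaway container (alpine is just an example image here; any small image with df will do):

docker run --rm alpine df -h /   # prints the size of the root filesystem the container sees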

@JonathanMontane (Author)

Made some progress:

Using screen ~/Library/Containers/com.docker.docker/Data/com.docker.driver.amd64-linux/tty:

fdisk -l
Disk /dev/sda: 128 GiB, 137438953472 bytes, 268435456 sectors
Units: sectors of 1 * 512 = 512 bytes
Sector size (logical/physical): 512 bytes / 4194304 bytes
I/O size (minimum/optimal): 4194304 bytes / 4194304 bytes
Disklabel type: dos
Disk identifier: 0x00000000

Device     Boot   Start      End  Sectors   Size Id Type
/dev/sda1       1975995 40949684 38973690  18.6G 83 Linux
/dev/sda2            63  1975994  1975932 964.8M 82 Linux swap / Solaris

Partition 1 does not start on physical sector boundary.
Partition 2 does not start on physical sector boundary.
Partition table entries are not in disk order.
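
So the disk itself is 128 GiB, but the two partitions above only cover about 19.5 GiB of it. Rough arithmetic from the sector counts (assuming 512-byte sectors, as fdisk reports):

# sectors on the disk minus the last sector used by any partition, times 512 bytes
echo $(( (268435456 - 40949684) * 512 / 1024 / 1024 / 1024 ))   # ≈ 108 GiB left unpartitioned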

akimd (Contributor) commented Jan 17, 2018

Hi.

Please don't cut all the output from your commands. For a start, could you run df on the disk image itself (from the host), in case you are using different partitions?

Something like

$ df -h ~/Library/Containers/com.docker.docker/Data/com.docker.driver.amd64-linux/Docker.raw
Filesystem     Size   Used  Avail Capacity iused               ifree %iused  Mounted on
/dev/disk1s1  466Gi  430Gi   30Gi    94% 3181622 9223372036851594185    0%   /

JonathanMontane (Author) commented Jan 17, 2018

Sorry, I didn't want to make a massive blurb of irrelevant information. Here are the bits:

df -h ~/Library/Containers/com.docker.docker/Data/com.docker.driver.amd64-linux/Docker.raw
Filesystem     Size   Used  Avail Capacity iused               ifree %iused  Mounted on
/dev/disk1s1  466Gi  228Gi  224Gi    51% 1355714 9223372036853420093    0%   /
ls -lsh ~/Library/Containers/com.docker.docker/Data/com.docker.driver.amd64-linux/Docker.raw
36744888 -rw-r--r--  1 montane  staff   128G Jan 17 14:53 /Users/montane/Library/Containers/com.docker.docker/Data/com.docker.driver.amd64-linux/Docker.raw

and inside the container:

df -h
Filesystem      Size  Used Avail Use% Mounted on
none             18G   17G     0 100% /
tmpfs            64M     0   64M   0% /dev
tmpfs           7.9G     0  7.9G   0% /sys/fs/cgroup
osxfs           466G  229G  224G  51% /mnt/data
/dev/sda1        18G   17G     0 100% /etc/hosts
shm              64M  4.0K   64M   1% /dev/shm
tmpfs           7.9G     0  7.9G   0% /proc/scsi
tmpfs           7.9G     0  7.9G   0% /sys/firmware
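
Side note: if I'm reading the first column of the ls -lsh output right (macOS reports it in 512-byte blocks by default), Docker.raw only occupies about 17 GiB on the host even though its apparent size is 128G, i.e. it is a sparse file:

echo $(( 36744888 * 512 / 1024 / 1024 / 1024 ))   # ≈ 17 GiB actually allocated on disk for the sparse Docker.raw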

JonathanMontane (Author) commented Jan 17, 2018

Update: I did a complete reinstall of Docker, and now the size available inside my containers is 64GB (instead of the expected 128GB), but this is still much better than the 18GB I had pre-nuke.

Here's the new df -h from inside the container:

$ df -h
Filesystem      Size  Used Avail Use% Mounted on
overlay          63G  5.7G   54G  10% /
tmpfs            64M     0   64M   0% /dev
tmpfs           7.9G     0  7.9G   0% /sys/fs/cgroup
osxfs           466G  230G  232G  50% /mnt/data
/dev/sda1        63G  5.7G   54G  10% /etc/hosts
shm              64M     0   64M   0% /dev/shm
tmpfs           7.9G     0  7.9G   0% /proc/scsi
tmpfs           7.9G     0  7.9G   0% /sys/firmware

One key difference I noticed is that what used to be none is now overlay. Any clue as to what might have caused the change?

akimd (Contributor) commented Jan 18, 2018

Nope, sorry :/

Have you tried setting it to 128GB again? (Or rather, have you checked whether the disk is still declared to be 64GB in your settings?)

akimd (Contributor) commented Jan 18, 2018

Actually I can reproduce your problem. Thanks for the report!

If you want to enjoy the disk space, quit Docker, remove the disk image, and rerun Docker; that should do the trick (at the expense of killing all your images and containers) until this issue is fixed.

rm ~/Library/Containers/com.docker.docker/Data/com.docker.driver.amd64-linux/Docker.raw
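
For anyone scripting this, the whole cycle looks roughly like the sketch below (it assumes the default install path, and uses osascript/open as just one way to stop and start the app):

osascript -e 'quit app "Docker"'   # stop Docker for Mac so the image is no longer in use
rm ~/Library/Containers/com.docker.docker/Data/com.docker.driver.amd64-linux/Docker.raw
open -a Docker                     # on restart a fresh Docker.raw is created at the configured size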

gtardif (Contributor) commented Jan 19, 2018

We just released 17.12.0-ce-mac49 on the Stable channel, which should fix this (18.01 on the Edge channel also fixes it).

@pgayvallet

@JonathanMontane can you confirm this is fixed for you on the latest stable or edge?

@JonathanMontane (Author)

@pgayvallet I confirm that it's fixed! Sorry for the late confirmation.

@dparkerfmts

I'm still having this issue on Docker for Mac: Version 17.12.0-ce-mac49 (21995)
macOS High Sierra v10.13.3

In Container:

Filesystem      Size  Used Avail Use% Mounted on
none             18G   18G     0 100% /
tmpfs            64M     0   64M   0% /dev
tmpfs           2.0G     0  2.0G   0% /sys/fs/cgroup
/dev/sda1        18G   18G     0 100% /etc/hosts
shm              64M     0   64M   0% /dev/shm
tmpfs           2.0G     0  2.0G   0% /proc/scsi
tmpfs           2.0G     0  2.0G   0% /sys/firmware
df -h ~/Library/Containers/com.docker.docker/Data/com.docker.driver.amd64-linux/Docker.raw
Filesystem     Size   Used  Avail Capacity iused               ifree %iused  Mounted on
/dev/disk1s1  932Gi  290Gi  638Gi    32% 1994981 9223372036852780826    0%   /
ls -lsh ~/Library/Containers/com.docker.docker/Data/com.docker.driver.amd64-linux/Docker.raw
2858488 -rw-r--r--@ 1 name  staff    64G 23 Feb 10:27 /Users/davidparker/Library/Containers/com.docker.docker/Data/com.docker.driver.amd64-linux/Docker.raw

Restarting Docker did not help.
The only thing that fixed this issue was running:

rm ~/Library/Containers/com.docker.docker/Data/com.docker.driver.amd64-linux/Docker.raw

which kinda sucks as all my images/containers are gone now.
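
For anyone else about to do the same: images can at least be exported first and re-imported after the reset (a sketch; ~/docker-images.tar is just an example path, and it assumes the daemon is still responsive enough to run docker save):

# export every tagged image before removing Docker.raw
docker save -o ~/docker-images.tar $(docker image ls --format '{{.Repository}}:{{.Tag}}' | grep -v '<none>')
# ... quit Docker, remove Docker.raw, restart Docker ...
docker load -i ~/docker-images.tar   # re-import the saved images afterwards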

akimd (Contributor) commented Mar 5, 2018

Sorry about that. FTR, maybe resizing from the GUI would have addressed the issue.

rawrgulmuffins commented Feb 15, 2019

I think I'm hitting a similar (or the same) issue on 2.0.3

[screenshot: screen shot 2019-02-14 at 10 44 55 pm]

docker volume create --name example
example
docker build --no-cache -f Dockerfile \
            --build-arg SOURCE_IMAGE=python:3 \
            --build-arg UID=1095843692 \
            -t example_pytest .
Sending build context to Docker daemon  84.99kB
Error response from daemon: mkdir /var/lib/docker/tmp/docker-builder553623694: no space left on device
make: *** [env] Error 1
% ls -lah ~/Library/Containers/com.docker.docker/Data/vms/0/Docker.raw
-rw-r--r--@ 1 alexl  575902425    60G Feb 14 22:44 /Users/alexl/Library/Containers/com.docker.docker/Data/vms/0/Docker.raw
alexl@MBP000413 ~/git/example :)% docker image ls
REPOSITORY                                              TAG                 IMAGE ID            CREATED             SIZE
<none>                                                  <none>              b68ba6447529        8 hours ago         929MB
artifacts.dev.versive.com/docker-users-virtual/python   3                   ac069ebfe1e1        2 days ago          927MB
alexl@MBP000413 ~/git/example :)% docker image ls -a
REPOSITORY                                              TAG                 IMAGE ID            CREATED             SIZE
<none>                                                  <none>              58244b31cc25        8 hours ago         929MB
<none>                                                  <none>              b68ba6447529        8 hours ago         929MB
<none>                                                  <none>              0f198fea228d        8 hours ago         929MB
<none>                                                  <none>              69868a8eab93        8 hours ago         927MB
artifacts.dev.versive.com/docker-users-virtual/python   3                   ac069ebfe1e1        2 days ago          927MB
% docker info
Containers: 1
 Running: 0
 Paused: 0
 Stopped: 1
Images: 5
Server Version: 18.09.2
Storage Driver: overlay2
 Backing Filesystem: extfs
 Supports d_type: true
 Native Overlay Diff: true
Logging Driver: json-file
Cgroup Driver: cgroupfs
Plugins:
 Volume: local
 Network: bridge host macvlan null overlay
 Log: awslogs fluentd gcplogs gelf journald json-file local logentries splunk syslog
Swarm: inactive
Runtimes: runc
Default Runtime: runc
Init Binary: docker-init
containerd version: 9754871865f7fe2f4e74d43e2fc7ccd237edcbce
runc version: 09c8266bf2fcf9519a651b04ae54c967b9ab86ec
init version: fec3683
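
A generic sanity check worth running here (standard Docker CLI commands, not specific to this issue) is to see what is actually consuming space inside the VM and prune anything unused:

docker system df        # breakdown of space used by images, containers, local volumes and build cache
docker system prune     # remove stopped containers, dangling images and unused networks
docker builder prune    # drop the build cache (supported by the 18.09 engine shown above)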

@docker-robott (Collaborator)

Closed issues are locked after 30 days of inactivity.
This helps our team focus on active issues.

If you have found a problem that seems similar to this, please open a new issue.

Send feedback to Docker Community Slack channels #docker-for-mac or #docker-for-windows.
/lifecycle locked

@docker locked and limited conversation to collaborators Jun 27, 2020