No space on device, system drive showing as 100% used when it's only about 50% used

Brief description of the problem

So I was trying to work out how much space I was realistically going to need for my backups (I never got to do them before this happened), and so I ran du -hs * in /mnt2. This basically crashed the system, and when it finally came back I started getting error messages about the system being “out of space”.

I ran df -h to see what was going on, because I knew I had around 50GB of free space on the system drive:

Filesystem      Size  Used Avail Use% Mounted on
devtmpfs        7.5G     0  7.5G   0% /dev
tmpfs           7.5G     0  7.5G   0% /dev/shm
tmpfs           7.5G  761M  6.7G  10% /run
tmpfs           7.5G     0  7.5G   0% /sys/fs/cgroup
/dev/sde3       104G   53G     0 100% /
tmpfs           7.5G     0  7.5G   0% /tmp
/dev/sde3       104G   53G     0 100% /home
/dev/sde1       477M  199M  250M  45% /boot
tmpfs           1.5G     0  1.5G   0% /run/user/0

As you can see, the issue appears to be here:

/dev/sde3 104G 53G 0 100% /

Only 53GB used of 104GB, but no more space available.

This has now left me with no web UI and none of my pools/shares mounted.

What I am seeing is systemd-journald using a lot of CPU, and if I run journalctl -f I get this repeating at an unreadable rate:

Sep 03 03:21:53 hulk kernel: ------------[ cut here ]------------
Sep 03 03:21:53 hulk kernel: WARNING: CPU: 0 PID: 2991 at fs/btrfs/qgroup.c:2955 btrfs_qgroup_free_meta+0xde/0xf0 [btrfs]
Sep 03 03:21:53 hulk kernel: Modules linked in: ebtable_filter ebtables ip6table_filter ip6_tables iptable_filter nct6775 hwmon_vid sunrpc dm_mirror dm_region_hash dm_log dm_mod dax ppdev eeepc_wmi asus_wmi sparse_keymap rfkill edac_mce_amd kvm_amd kvm irqbypass crct10dif_pclmul crc32_pclmul ghash_clmulni_intel pcbc ext4 jbd2 mbcache aesni_intel crypto_simd glue_helper cryptd pcspkr k10temp joydev input_leds shpchp sg sp5100_tco i2c_piix4 snd_hda_codec_realtek snd_hda_codec_generic snd_hda_codec_hdmi parport_pc parport snd_hda_intel snd_hda_codec snd_hda_core snd_hwdep snd_seq snd_seq_device tpm_infineon snd_pcm snd_timer snd soundcore video wmi acpi_cpufreq ip_tables btrfs xor raid6_pq sd_mod crc32c_intel r8169 mii ahci libahci amdkfd libata amd_iommu_v2 radeon i2c_algo_bit drm_kms_helper syscopyarea sysfillrect
Sep 03 03:21:53 hulk kernel:  sysimgblt fb_sys_fops ttm drm uas usb_storage
Sep 03 03:21:53 hulk kernel: CPU: 0 PID: 2991 Comm: in:imjournal Tainted: G        W       4.12.4-1.el7.elrepo.x86_64 #1
Sep 03 03:21:53 hulk kernel: Hardware name: PC SPECIALIST System Product Name/A88XM-PLUS, BIOS 2903 03/10/2016
Sep 03 03:21:53 hulk kernel: task: ffff880409282d80 task.stack: ffffc90002fb0000
Sep 03 03:21:53 hulk kernel: RIP: 0010:btrfs_qgroup_free_meta+0xde/0xf0 [btrfs]
Sep 03 03:21:53 hulk kernel: RSP: 0018:ffffc90002fb3be0 EFLAGS: 00010206
Sep 03 03:21:53 hulk kernel: RAX: 0000000000000102 RBX: ffff880406120000 RCX: 0000000000000000
Sep 03 03:21:53 hulk kernel: RDX: 0000000000000000 RSI: 0000000000014000 RDI: ffff88040725e000
Sep 03 03:21:53 hulk kernel: RBP: ffffc90002fb3c08 R08: 0000000000000000 R09: 0000000000001400
Sep 03 03:21:53 hulk kernel: R10: 00000000ffffffe4 R11: 00000000000002a9 R12: ffff88040725e000
Sep 03 03:21:53 hulk kernel: R13: ffff880406120000 R14: 0000000000014000 R15: 0000000000014000
Sep 03 03:21:53 hulk kernel: FS:  00007f0885419700(0000) GS:ffff88041ec00000(0000) knlGS:0000000000000000
Sep 03 03:21:53 hulk kernel: CS:  0010 DS: 0000 ES: 0000 CR0: 0000000080050033
Sep 03 03:21:53 hulk kernel: CR2: 00007f85a5314000 CR3: 00000003f5dd0000 CR4: 00000000000406f0
Sep 03 03:21:53 hulk kernel: Call Trace:
Sep 03 03:21:53 hulk kernel:  start_transaction+0x378/0x440 [btrfs]
Sep 03 03:21:53 hulk kernel:  btrfs_start_transaction+0x1e/0x20 [btrfs]
Sep 03 03:21:53 hulk kernel:  btrfs_create+0x5a/0x220 [btrfs]
Sep 03 03:21:53 hulk kernel:  path_openat+0xf1f/0x13b0
Sep 03 03:21:53 hulk kernel:  ? unix_dgram_sendmsg+0x2b1/0x690
Sep 03 03:21:53 hulk kernel:  do_filp_open+0x91/0x100
Sep 03 03:21:53 hulk kernel:  ? __alloc_fd+0x46/0x170
Sep 03 03:21:53 hulk kernel:  do_sys_open+0x124/0x210
Sep 03 03:21:53 hulk kernel:  SyS_open+0x1e/0x20
Sep 03 03:21:53 hulk kernel:  entry_SYSCALL_64_fastpath+0x1a/0xa5
Sep 03 03:21:53 hulk kernel: RIP: 0033:0x7f0888cb277d
Sep 03 03:21:53 hulk kernel: RSP: 002b:00007f0885417c30 EFLAGS: 00000293 ORIG_RAX: 0000000000000002
Sep 03 03:21:53 hulk kernel: RAX: ffffffffffffffda RBX: 00007f0878000020 RCX: 00007f0888cb277d
Sep 03 03:21:53 hulk kernel: RDX: 00000000000001b6 RSI: 0000000000000241 RDI: 00007f0885417cd0
Sep 03 03:21:53 hulk kernel: RBP: 00007f0878000020 R08: 00007f0886d8ed8c R09: 0000000000000240
Sep 03 03:21:53 hulk kernel: R10: 0000000000000024 R11: 0000000000000293 R12: 00007f0885418db0
Sep 03 03:21:53 hulk kernel: R13: 00007f0875aa174b R14: 0000000000000020 R15: 000000000000000a
Sep 03 03:21:53 hulk kernel: Code: 48 8b 03 4d 89 f7 49 f7 df 48 8b 7b 08 48 83 c3 18 4c 89 fa 4c 89 e6 ff d0 48 8b 03 48 85 c0 75 e8 49 8b 84 24 38 03 00 00 eb 98 <0f> ff eb 86 0f 0b 66 90 66 2e 0f 1f 84 00 00 00 00 00 0f 1f 44
Sep 03 03:21:53 hulk kernel: ---[ end trace 905ef2e4814bc84a ]---

Any help will be greatly appreciated.

Thanks in advance
Matt

@MattWatson A belated welcome to the Rockstor community.


Unfortunately the df command in Linux is not yet fully btrfs aware. The command you want, to see the usage of a Pool (btrfs volume), is, for the “/” mount point:

btrfs filesystem usage /

This can be shortened to:

btrfs fi us /

That may help to clear this up. Essentially, df is not aware of snapshots and the like, so it can’t properly represent btrfs usage.

On to your pool-full problem. First confirm this is the issue via the ‘fi usage’ command, and then try to clear some space on the root drive so the system has breathing room. As you suspect, it could very well be that systemd is struggling from having no breathing room.

How you clear space depends on what’s going on. You may have used the system drive for a share, e.g. a popular one is the rock-ons-root share. See:

The Rock-ons root

But as btrfs is a copy-on-write file system and a delete counts as a write, you may have to start with small stuff and work your way up, creating progressively more breathing space.
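As a rough starting point only (the paths below are just examples, adjust to whatever is actually on your system), something like this can help you find the biggest offenders and trim an expendable file in place. Truncating an existing file is often suggested as needing less new metadata than an outright delete on a full copy-on-write filesystem, though there are no guarantees on a completely full pool:

# Find the largest files under /var/log (example directory), biggest first:
du -xah /var/log | sort -rh | head -20

# Truncate a large, expendable file in place (the filename here is purely hypothetical):
truncate -s 0 /var/log/some-huge-old.log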

The following forum posts may help:


You may well be able to clear some initial space by removing some systemd journal logs. The --vacuum-time and --vacuum-size options may work for you on this front:

journalctl --vacuum-time=2d

This should clear all but the last 2 days of logs, for example.
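If time-based trimming doesn’t free enough, a size cap can be used instead. Both --disk-usage and --vacuum-size are standard journalctl options; the 200M figure below is just an arbitrary example:

# See how much space the journal is currently taking:
journalctl --disk-usage

# Keep only roughly the most recent 200M of journal data (example figure):
journalctl --vacuum-size=200M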

And if you have used the system drive / pool as a rock-ons-root, then there will be lots of docker images and associated snapshots that can also be deleted, assuming you have, as suggested, created dedicated shares (btrfs subvols) for the individual Rock-ons’ config and data. If so, deleting the docker-associated snapshots means you will have some additional repair ahead of you: you will have to force-delete the Rock-ons database entries before you can re-install them. But first things first: clear some space on the system drive, if the ‘fi usage’ command confirms this to be the cause. The ‘out of space’ messages are likely just that.
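As a very rough sketch only (the mount point below assumes the usual Rockstor layout of shares under /mnt2, and deleting these subvolumes throws away the docker images/containers stored there, so only do this if that data is expendable and you are prepared to re-install the Rock-ons afterwards):

# List the docker layer subvolumes living on the system pool:
btrfs subvolume list / | grep 'rockon_root/btrfs/subvolumes'

# Delete an individual layer subvolume; the path assumes the share is
# mounted at /mnt2/rockon_root, and <layer-id> stands for a real entry
# from the list above:
btrfs subvolume delete /mnt2/rockon_root/btrfs/subvolumes/<layer-id>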

Finally you may want to read this forum thread:

Where there are references to some balance commands that may also help here.
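In case those threads are hard to reach right now, the usual approach (a sketch, not a guaranteed fix) is to ask balance to reclaim nearly-empty chunks first, since those passes need little or no free space to run:

# Reclaim completely empty data and metadata chunks first (usually cheap):
btrfs balance start -dusage=0 -musage=0 /

# Then progressively rebalance mostly-empty chunks to return space to “unallocated”:
btrfs balance start -dusage=5 /
btrfs balance start -dusage=10 /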

Hope that helps. And if anyone else can jump in here with advice then please do so, as I’m currently otherwise engaged unfortunately.


Hello @phillxnet

Thank you for your reply, and thank you for the belated welcome.

So I forgot to include the output of the btrfs fi us / command, which is what makes it all the more strange to me.

root@hulk # btrfs filesystem usage /
Overall:
    Device size:		 103.86GiB
    Device allocated:		 103.86GiB
    Device unallocated:		     0.00B
    Device missing:		     0.00B
    Used:			  52.87GiB
    Free (estimated):		  50.86GiB	(min: 50.86GiB)
    Data ratio:			      1.00
    Metadata ratio:		      1.00
    Global reserve:		 129.05MiB	(used: 16.00KiB)

Data,single: Size:102.85GiB, Used:51.99GiB
   /dev/sde3	 102.85GiB

Metadata,single: Size:1.01GiB, Used:902.16MiB
   /dev/sde3	   1.01GiB

System,single: Size:4.00MiB, Used:16.00KiB
   /dev/sde3	   4.00MiB

Unallocated:
   /dev/sde3	     0.00B

Regarding the other threads, I did read through them and I did try the balance commands etc., but unfortunately they didn’t improve my situation.

This is the output of the subvolumes on /, which seems a little big:

root@hulk # btrfs sub list /
ID 257 gen 63378138 top level 5 path home
ID 258 gen 88827809 top level 5 path root
ID 260 gen 28 top level 258 path var/lib/machines
ID 290 gen 2805446 top level 5 path sonnar_config
ID 324 gen 63378228 top level 5 path media
ID 342 gen 2805436 top level 5 path sabnzbd_config
ID 347 gen 64278172 top level 5 path rockon_root
ID 348 gen 63404526 top level 5 path plex_config
ID 349 gen 63443128 top level 5 path plex_transcode
ID 350 gen 2744984 top level 347 path rockon_root/btrfs/subvolumes/8ce6a918c7ca96d10972ee90b3efbf737e606309c8acad474405acdeed2684b9
ID 351 gen 2744984 top level 347 path rockon_root/btrfs/subvolumes/d0e1ae5628ddd69f059fb2ee25a55f4e9fd30693db97cd1f5db4b04545dff361
ID 352 gen 2744984 top level 347 path rockon_root/btrfs/subvolumes/a785d435d7774efbd9afc418b6703209ad10acb6b16c873aee89dfe8fcc1b810
ID 353 gen 2744985 top level 347 path rockon_root/btrfs/subvolumes/01b10c40a70a75ff0e908d7cfdf6fd42cc2bb0220008ff1ef15105ef7657751e
ID 354 gen 2744985 top level 347 path rockon_root/btrfs/subvolumes/810adee6fdb8ef866d9eab9c76dbb1994d9003be90e305b5dc9fcb2bfde15c64
ID 355 gen 2744985 top level 347 path rockon_root/btrfs/subvolumes/0eb9ddaad13f58b1c0f847374cd7a5448465811f8044e5129b8c86f4f64e6db4
ID 356 gen 2744985 top level 347 path rockon_root/btrfs/subvolumes/ec6189cc8b8451beacbace67c05952781072c3af93cd52179417d9cc5cc62b01
ID 357 gen 2744985 top level 347 path rockon_root/btrfs/subvolumes/0662d6b97d44730f4d509f44dee5971f0e6bb8272324b0397e4ba5198556893f
ID 366 gen 2744986 top level 347 path rockon_root/btrfs/subvolumes/71fb508ee3852fe1731056b7cc60b42dc935b9ce369b3b324fc4eb462240e498
ID 367 gen 2805442 top level 347 path rockon_root/btrfs/subvolumes/1cada4a10550a06001d9606010c62858f4ae92fa70e16316cf370dc1d8a2b577
ID 368 gen 2805442 top level 347 path rockon_root/btrfs/subvolumes/bcecb0c2a9896ebf95f658c86917868a2f69ca22c9ffb397beff405099c0bca5-init
ID 369 gen 2805442 top level 347 path rockon_root/btrfs/subvolumes/bcecb0c2a9896ebf95f658c86917868a2f69ca22c9ffb397beff405099c0bca5
ID 370 gen 63378375 top level 5 path nzbhydra_config
ID 379 gen 2746910 top level 347 path rockon_root/btrfs/subvolumes/adfcfa993b1fcefe8a77b3cee0093e4e43fc28ae53b5b89f1f8a17c8f1a65681
ID 380 gen 2805442 top level 347 path rockon_root/btrfs/subvolumes/87290b569c306db8331c317319d31d90659e5a0fda70a1eb017320c8bfffaf7e
ID 381 gen 2805442 top level 347 path rockon_root/btrfs/subvolumes/ec6a089b03d77c1a9f21fd0a4c997f38a277de1ddc92f84ffcc5e8b72855531e
ID 382 gen 2805442 top level 347 path rockon_root/btrfs/subvolumes/26c6286f59a76aa97ac17e2d2ba6a3ce2fdbd1e25c5a1f11095762b5e5640ac8-init
ID 383 gen 2805442 top level 347 path rockon_root/btrfs/subvolumes/26c6286f59a76aa97ac17e2d2ba6a3ce2fdbd1e25c5a1f11095762b5e5640ac8
ID 398 gen 64296631 top level 5 path couchpotato_config
ID 409 gen 2742012 top level 5 path pihole_config
ID 426 gen 63378276 top level 5 path nextcloud_config
ID 507 gen 2805442 top level 347 path rockon_root/btrfs/subvolumes/2e02929cec32dfa1456954e74bce8f6b498248f6fd39158624d75b8bd8489e10
ID 508 gen 2805442 top level 347 path rockon_root/btrfs/subvolumes/af7a6f33b5da5e1a976cc9a61aeafd7b962feeafccdb0a9d974f7210a901e0bb
ID 509 gen 2805442 top level 347 path rockon_root/btrfs/subvolumes/42cbc2094619126ec3baf4e48f2ab27f5466d303831140e0e443d7ac681b9bd3
ID 512 gen 2805442 top level 347 path rockon_root/btrfs/subvolumes/d47313e6e77d978a98a8ffbf5ec8243ccb3a8c2fec52819f440f2e934d04af33-init
ID 513 gen 2805442 top level 347 path rockon_root/btrfs/subvolumes/d47313e6e77d978a98a8ffbf5ec8243ccb3a8c2fec52819f440f2e934d04af33
ID 514 gen 2805442 top level 347 path rockon_root/btrfs/subvolumes/7ccc2e3a57d178273aadff531bf0f3967178aa470c743579cbe41ee265dbb5cf
ID 538 gen 2805442 top level 347 path rockon_root/btrfs/subvolumes/61d203fdcaab7f5d6d64c7fbccea94bec8a0c1118b1a67b0bcc70875dd2b4aa4
ID 555 gen 2805442 top level 347 path rockon_root/btrfs/subvolumes/85dc1274d156e2a92186b0a17816e8e19c3b276638391c9d0eb0c10a01fea69f
ID 556 gen 2805442 top level 347 path rockon_root/btrfs/subvolumes/42e8062be88b17beb66b6dc8187a1c6b94fcba6c7eb951810908e9f6d51b2472
ID 557 gen 2805442 top level 347 path rockon_root/btrfs/subvolumes/bd318d13d2ffff0364fe13c870cf141aea009f1656c9c523c6d2da419ffe117a
ID 558 gen 2805442 top level 347 path rockon_root/btrfs/subvolumes/cd2df75ece03b7f453390021de8ab1b9e6ed26dfb4b874ad278989638525692d
ID 559 gen 2805442 top level 347 path rockon_root/btrfs/subvolumes/c4a23b4a229ff941483dc279ee8de260bfc130388463bca2c47e76d67656731f
ID 560 gen 2805442 top level 347 path rockon_root/btrfs/subvolumes/4198c557143a5064b51346351ddf0380125a46b49260426a4f7bba4ece86f13c
ID 561 gen 2805442 top level 347 path rockon_root/btrfs/subvolumes/70381d04f39346a0d9cb0fb146dff31e127c7ec52e54e45c950d128d4a8ea039-init
ID 562 gen 2805442 top level 347 path rockon_root/btrfs/subvolumes/70381d04f39346a0d9cb0fb146dff31e127c7ec52e54e45c950d128d4a8ea039
ID 571 gen 2805442 top level 347 path rockon_root/btrfs/subvolumes/ac0c2ddfcb4ddd90f12b910f22382eb001b7baea6fc3256384cc16f779949126
ID 572 gen 2805442 top level 347 path rockon_root/btrfs/subvolumes/12cbcab880869f60ee03f2b3f3ef8d25ad1bf88193566fd4134049c07b58d8f5
ID 573 gen 2805442 top level 347 path rockon_root/btrfs/subvolumes/907e610c93c12f4f385bb86c98cc7fb005c9f952ff1be969544f68fb5dde550e
ID 574 gen 2805442 top level 347 path rockon_root/btrfs/subvolumes/d99891d62d2df0de8614bcd0ea27a10c1a5016109beb60405264c7eac24782d9
ID 575 gen 2805442 top level 347 path rockon_root/btrfs/subvolumes/d8406e07c113f2385919bb4b7a8580f9936a308f4b23802f74bb7c7db7b4e969
ID 576 gen 2805442 top level 347 path rockon_root/btrfs/subvolumes/5661a373e16c9eaf8baaec796e46a02b124ced44ef8177a8f76cac3afd022bfb
ID 579 gen 2805442 top level 347 path rockon_root/btrfs/subvolumes/00b37e60a7cd2718ccb70dbde24f6d8514d06130fb8c74e9913b8061a298cf20
ID 585 gen 2805442 top level 347 path rockon_root/btrfs/subvolumes/99973f74c01612f5e034a1c6cb92096269335dedb32d840aefc1ec62e6a02365
ID 611 gen 2805442 top level 347 path rockon_root/btrfs/subvolumes/a72b9a9eccbb18808b95f6443d3c795127fdebec7584b6e7b8a6fceefab65943
ID 616 gen 2805442 top level 347 path rockon_root/btrfs/subvolumes/2a7bd9afe86622813273f285459aa655a22ba98643459903b9927199026b9c7b
ID 697 gen 2805442 top level 347 path rockon_root/btrfs/subvolumes/e4f796e46932ca894bfa0152b04508c428a8920578b98e65c5455e409e48d881-init
ID 698 gen 2805442 top level 347 path rockon_root/btrfs/subvolumes/e4f796e46932ca894bfa0152b04508c428a8920578b98e65c5455e409e48d881
ID 699 gen 2805442 top level 347 path rockon_root/btrfs/subvolumes/e3fa8b31429a1e4b147bea954953a3688f44feced0d979ef3db91d3920ce3f2c-init
ID 700 gen 2805442 top level 347 path rockon_root/btrfs/subvolumes/e3fa8b31429a1e4b147bea954953a3688f44feced0d979ef3db91d3920ce3f2c
ID 705 gen 2805442 top level 347 path rockon_root/btrfs/subvolumes/54d256b013affa316c875c83d94ca8cae35adfc86359f44aa111c0daacd51504
ID 706 gen 2805442 top level 347 path rockon_root/btrfs/subvolumes/a6141c1e3c374acd80ba89e08a01e1006b3cac5fd766b3d3839b72baebe8a549
ID 707 gen 2805442 top level 347 path rockon_root/btrfs/subvolumes/9feef7149d5bb906f5cd00e366655bd70eeb2257d667e1430ded78bc80aaa4de
ID 708 gen 2805442 top level 347 path rockon_root/btrfs/subvolumes/6c53280ad6ad6f5d9f9d0331dfbce82a6a57c63a75b8f133cc10100da0718b90
ID 709 gen 2805442 top level 347 path rockon_root/btrfs/subvolumes/8ecfdb646a7502b495e84f60f8015f7275948c9d084729a32e2a60144f85fa85
ID 710 gen 2805442 top level 347 path rockon_root/btrfs/subvolumes/4552e823de775c1e86e5d07fbc325f7dc555eb3c322e85c10bca8c27e1aabf46
ID 713 gen 2805442 top level 347 path rockon_root/btrfs/subvolumes/6a7df487f71b48f4dc2d7b77c06ebefe517506171f7d0fae730ebaf6abb9536b
ID 714 gen 2805442 top level 347 path rockon_root/btrfs/subvolumes/3ec96a79153e4bf236474a7e8adaf13a88208180057b5cfbcecf11983214f3db
ID 715 gen 2805442 top level 347 path rockon_root/btrfs/subvolumes/3d83ed44098b9b01170dd52cef86e6d40a0c933201c7f8d8c678dc2f5aab4d60
ID 716 gen 2805442 top level 347 path rockon_root/btrfs/subvolumes/4dea014c0547a37c7caca3e13d9fde223b6dbef1bab07fc67de5881241fb02e7
ID 717 gen 2805442 top level 347 path rockon_root/btrfs/subvolumes/e6846cf7bb348e9db0fe50a686b2ae9596c99af89e913dcf045fb1839b1d4cea
ID 718 gen 2805442 top level 347 path rockon_root/btrfs/subvolumes/8c25fafa2a7b163174ddb7857ab02ad57e9d3613ac44233a69a8850d8fb5c652
ID 719 gen 2805442 top level 347 path rockon_root/btrfs/subvolumes/2e71e27c8f55396a59d14f464b53bc7a8cf295862671cdb892bc3b24195eb99e
ID 720 gen 2805443 top level 347 path rockon_root/btrfs/subvolumes/435758ee19fc7983952eedd6d8625a701c7b6988dcbcf0f6b2da1573557d4462
ID 721 gen 2805443 top level 347 path rockon_root/btrfs/subvolumes/90b4c5c006d8e928292f84484963d3d53dfc3955976cb7e3bc2cec880c145e8b
ID 722 gen 2805443 top level 347 path rockon_root/btrfs/subvolumes/8d212e815547b607bf104f383550c3913fbd6e2de0a96c979ccd1e8625c492a8
ID 723 gen 2805443 top level 347 path rockon_root/btrfs/subvolumes/e94b7d95d6b7bb86c661aa9dcdebe0e8e4b9dc5a26f7d15e6caa749d6b3af218
ID 724 gen 2805443 top level 347 path rockon_root/btrfs/subvolumes/dd74a92e84099c43e0a95893385f5f13cd23d7f7b0d84308a6ee49e375c2386a
ID 725 gen 2805443 top level 347 path rockon_root/btrfs/subvolumes/84b4a5b64416190ced567232c945945bab9a11e17d2adfcccc1d1ca111c889dd
ID 726 gen 2805443 top level 347 path rockon_root/btrfs/subvolumes/244bbe8e6603f151e782f29095f482a2707679890152c0500bd40a6cabf58d90
ID 727 gen 2805443 top level 347 path rockon_root/btrfs/subvolumes/ec988cc57c90c16a8ce61a9432a6da7ca08683c1c2d136b940537ba96f2a421b
ID 728 gen 2805443 top level 347 path rockon_root/btrfs/subvolumes/c922867045d7767923ae4e77ae60ccac7faa106ccf7246f0ba5f4176b2d40255
ID 729 gen 2805443 top level 347 path rockon_root/btrfs/subvolumes/d3a374e6a36ec96bc0ffbd1ccdc48dfde5942e6cf7d9dd096ed57cb79062ff14
ID 730 gen 2805443 top level 347 path rockon_root/btrfs/subvolumes/10dd4eef60f9afab5db4f98953fdc3a855736a84a274b28a2c501f30a5405832-init
ID 731 gen 2805443 top level 347 path rockon_root/btrfs/subvolumes/10dd4eef60f9afab5db4f98953fdc3a855736a84a274b28a2c501f30a5405832
ID 736 gen 2805443 top level 347 path rockon_root/btrfs/subvolumes/01aa874b1892b60b584b895e7064b49e01e06e0e6b468c2d0a02eac358a4d224-init
ID 737 gen 2805443 top level 347 path rockon_root/btrfs/subvolumes/01aa874b1892b60b584b895e7064b49e01e06e0e6b468c2d0a02eac358a4d224
ID 753 gen 2805443 top level 347 path rockon_root/btrfs/subvolumes/828963429d65c1489894ffc89a7e8f67beb67851483ab26f759b84b3bc10cf4c
ID 754 gen 2805443 top level 347 path rockon_root/btrfs/subvolumes/bc24c6e06c644acc040f5a95a1491ded469d78df97235edda0ae098caaff4e58
ID 755 gen 2805443 top level 347 path rockon_root/btrfs/subvolumes/137cf5c512d82a00f40ca659fb0ac30819e35455b5fb1c362cdad93a78b4077f
ID 756 gen 2805443 top level 347 path rockon_root/btrfs/subvolumes/90d029536e9d76e987034fb5908fc40a8c9d3ea21b37b1b5fcb741482c5f8e2c
ID 757 gen 2805445 top level 347 path rockon_root/btrfs/subvolumes/aecd1a2d2f0fa14f97ed2706f6774d8ee1fba07e560f3b27b100e84a870a3f5b
ID 758 gen 2805445 top level 347 path rockon_root/btrfs/subvolumes/817975d3ebc1e499bdfeab8b08abda21df75440253673db43e18013bc6f915de
ID 759 gen 2805445 top level 347 path rockon_root/btrfs/subvolumes/cabb7580f170a8ad9b4c6ec39c15d320f11c373083589bcb62f44628b1725201
ID 760 gen 2805445 top level 347 path rockon_root/btrfs/subvolumes/5696e6359c6a464deada988d7879e361a1693e8372273f942bc4890e749a61ee
ID 761 gen 2805445 top level 347 path rockon_root/btrfs/subvolumes/beacf399230a742259e63c97a540fd74736ecc6567e3f1ab2fb5c93fff4c5dd2
ID 762 gen 2805445 top level 347 path rockon_root/btrfs/subvolumes/3eb1717f8f077577f4a3f4fb139dffe647fbd5829ac50aefbceff1fcca83aa76
ID 878 gen 2805446 top level 347 path rockon_root/btrfs/subvolumes/2e75308bccfe4d650d95dc154c45b50cbdff2b2d1c4f599eb7fe37a2db02f791
ID 879 gen 2805446 top level 347 path rockon_root/btrfs/subvolumes/f1d4bd54fc372e4811d8ff8955063952ca9e3eaca2981423722a3b889a2b9942
ID 880 gen 2805516 top level 347 path rockon_root/btrfs/subvolumes/8a5567f5acbdcef66cac9dcc2087051893c1c73a3b877065db8ec5c70ab9345f
ID 881 gen 2706213 top level 347 path rockon_root/btrfs/subvolumes/09cd31f4060a90e5c058d933669aec5607a23097b85f7471a98800a3cede02ec
ID 882 gen 2706216 top level 347 path rockon_root/btrfs/subvolumes/3e8a90f66ee6b9fd753014b0f5ca0d3c7b27582563492b63c43895a534a5caf3
ID 883 gen 2783746 top level 347 path rockon_root/btrfs/subvolumes/67b8a3a8edc92064f259fe9d640c2f859d634960a63303d1684c934af03760ce
ID 891 gen 2706236 top level 347 path rockon_root/btrfs/subvolumes/ba9b73fb52ac4a789f57b565b08873bdd01e856b62a0375470606b8c5ef7ff37
ID 892 gen 2706240 top level 347 path rockon_root/btrfs/subvolumes/6d1b5ea356128cacbb6820a7170e0ec5f3e0ae6de82fd29504c4caf0c40e27ed
ID 893 gen 2706242 top level 347 path rockon_root/btrfs/subvolumes/800417e00bfa9292fc316f826fa24f4a9db5568fe40e88074bd16b9af278bdbe
ID 894 gen 2706244 top level 347 path rockon_root/btrfs/subvolumes/491fb9bd3f114b1c0415a3b8fa40c37aa807b600b6a71620f99172fabb71851d
ID 895 gen 2706246 top level 347 path rockon_root/btrfs/subvolumes/aa6d488e84333f266a5ab70dc0f61faae16776e893c9468d12edf5821dbe1ed3
ID 896 gen 2802347 top level 347 path rockon_root/btrfs/subvolumes/4d03cde8dfe8396030f70fd698e7e7db890ad8c9a2e9052c7ec136d17f8b2975

Not sure if that’s what it should look like or not, with all those subvolumes inside rockon_root/btrfs?

@phillxnet will be the expert on disk usage, but I just wanted to answer that one: yes. The rockon_root share is where all docker images and containers are stored.
http://rockstor.com/docs/docker-based-rock-ons/overview.html#the-rock-ons-root

Docker images are extracted as different layers, and each layer is a Btrfs subvolume (if I’m correct); this is why the number of Btrfs subvolumes in rockon_root can go up quite quickly as you docker pull a few images. As you’re looking into freeing space, and in case it applies to you, I would look into deleting old or unused images and containers.

As you don’t have a Rockstor webUI at the moment (if I’m correct), you can look at the following:

  • run docker ps -a to list all containers, and docker rm <container-name> for any you’re not using/needing.
  • run docker images and delete all unused/old images using docker rmi <imageID>. Some of these can be quite big so it might help.
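If your docker version is recent enough, a single command can also sweep up stopped containers, dangling images and unused networks in one go. Treat this as an assumption about your setup rather than a guaranteed option; check docker --version first:

# Remove stopped containers, dangling images, unused networks, etc.:
docker system prune

# Add -a to also remove images not referenced by any container (more aggressive):
docker system prune -a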

Hi @Flox,

Thanks for clarifying my thoughts that this was the docker images.

Now the problem I have is that it’s not just the web service that isn’t running; it’s also btrfs, docker, and almost any other service that requires space on the root system partition.

So unfortunately, without the docker daemon running, I’m not able to issue any docker commands.