Rock-on stuck uninstalling

I’m trying to uninstall Emby (in order to update to the latest release), but the uninstall is stuck on “uninstalling”. Eventually it did produce this error:

Traceback (most recent call last):
  File "/opt/rockstor/src/rockstor/storageadmin/views/rockon.py", line 139, in post
    self._create_update_meta(r, rockons[r])
  File "/opt/rockstor/.venv/lib/python2.7/site-packages/django/utils/decorators.py", line 185, in inner
    return func(*args, **kwargs)
  File "/opt/rockstor/src/rockstor/storageadmin/views/rockon.py", line 317, in _create_update_meta
    handle_exception(Exception(e_msg), self.request)
  File "/opt/rockstor/src/rockstor/storageadmin/util.py", line 48, in handle_exception
    status_code=status_code, detail=e_msg, trace=traceback.format_exc()
RockStorAPIException: ['Cannot add/remove volume definitions of the container (embyserver) as it belongs to an installed Rock-on (Emby server). Uninstall it first and try again.', 'Traceback (most recent call last):\n  File "/opt/rockstor/.venv/lib/python2.7/site-packages/gunicorn/workers/sync.py", line 68, in run_for_one\n    self.accept(listener)\n  File "/opt/rockstor/.venv/lib/python2.7/site-packages/gunicorn/workers/sync.py", line 27, in accept\n    client, addr = listener.accept()\n  File "/usr/lib64/python2.7/socket.py", line 206, in accept\n    sock, addr = self._sock.accept()\nerror: [Errno 11] Resource temporarily unavailable\n']

I tried running /opt/rockstor/.venv/bin/delete-rockon embyserver but that returns “No Rock-On named embyserver was found in the database”. (Just in case, I also tried it with emby, Emby, “Emby Server” et al, all with the same result).
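For the record, the attempts looked roughly like this (each rejected with the same “not found” message):

/opt/rockstor/.venv/bin/delete-rockon embyserver
/opt/rockstor/.venv/bin/delete-rockon emby
/opt/rockstor/.venv/bin/delete-rockon Emby
/opt/rockstor/.venv/bin/delete-rockon "Emby Server"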

The uninstalling state remained after stopping/starting the Rock-on service as well as after a system reboot.

I’m current on the stable updates channel: Rockstor 4.6.1-0 and Leap 15.4.

Any thoughts or suggestions?

Thank you in advance.

Some follow up:

I think this is at the heart of the problem:

[10/Sep/2023 18:26:41] ERROR [system.osi:244] non-zero code(1) returned by command: ['/usr/bin/docker', 'rm', 'embyserver']. output: [''] error: ['Error response from daemon: container 141a02b37e4b142beea55d04e1896de8797f64bae10306de54383bc320ac944d: driver "btrfs" failed to remove root filesystem: Failed to destroy btrfs snapshot /mnt2/Rockons/btrfs/subvolumes for d19e0c527a04110aa1614afb7907dcc72ad6b433a45dbb0ce542173fd9b76d60: invalid argument', '']
[10/Sep/2023 18:26:41] INFO [storageadmin.tasks:64] Task [uninstall], id: 24f87d71-474f-4941-91ab-a685dbddfb17 completed OK

Both of the directories referenced there exist in “containers”

# ls -l /mnt2/Rockons/containers
total 2340
drwx--x--- 1 root root 200 Sep 11 13:36 141a02b37e4b142beea55d04e1896de8797f64bae10306de54383bc320ac944d
drwx--x--- 1 root root 254 Sep 11 13:50 25e4c392faa04d930daacf6a895c593ba343aac9607b02563386b28a589dd5e6
drwx--x--- 1 root root 172 Sep 11 13:36 cadbeb3e509fc1ac24ff513d3a9be7c9dd1c1f75e2c990ae574dd18f384a3f61
drwx--x--- 1 root root 172 Sep 11 13:36 e28fa7355a4ecdc2577df139737a6649ca73d6e611c29d2009230ed563bc560d

But they don’t exist in …/btrfs/subvolumes:

rockstor:/mnt2/Pool1/Rockons/btrfs/subvolumes # ls -l
total 0
drwxr-xr-x 1 root root 154 Dec 23  2022 0c20c4c70409a8c11a6ea6fcd1d6b977d2bfadd76efd11b156d5dd2476196370
drwxr-xr-x 1 root root 128 Jan 31  2023 413264ffc853e9d4ffc1b194ef6593e065f352509b36b713e1680daaabe11bd7
drwxr-xr-x 1 root root 164 Jan 31  2023 518f3f51e3d9a151589238033c7d7b9ca3f6e8b8b6fbd0075567929c6582f1fe
drwxr-xr-x 1 root root 164 Jan 31  2023 518f3f51e3d9a151589238033c7d7b9ca3f6e8b8b6fbd0075567929c6582f1fe-init
drwxr-xr-x 1 root root 144 Jan 31  2023 59f19d4ace2fbb87de3183199002477cf6e9536f55c5c3a4f617aebf5f5811c5
drwxr-xr-x 1 root root 186 Dec 23  2022 60def5f2c50f8309fd6d15802d6e832e0559c91ea7f010afe24952c6b6719604
drwxr-xr-x 1 root root 174 Dec 23  2022 60def5f2c50f8309fd6d15802d6e832e0559c91ea7f010afe24952c6b6719604-init
drwxr-xr-x 1 root root 132 Dec 23  2022 6b1d981915e05ec1fa6944f92eb86ea2d8b48d1285af7b885d4fdea9c0416109
drwxr-xr-x 1 root root 114 Jan 31  2023 6c7d47a6cabed9a39801e8d29b804ea96c79a09ac0c21368fa99d4076fb52ec2
drwxr-xr-x 1 root root  84 Dec 23  2022 6f5713441f9c62bd1827050202b67ebc0701e4fff59435c18a59eb3606450072
drwxr-xr-x 1 root root 112 Dec 23  2022 76b45f684af186d854e73b0af2d22bb0cd27a29eb75d145ff915c35730f7baf8
drwxr-xr-x 1 root root 154 Dec 23  2022 8868e2ac27cced898e1892b8bc35f0a3b3823fc7f4ce5def63f619713d49cf44
drwxr-xr-x 1 root root 154 Dec 23  2022 916aea235122feaef84d804573dd865334c89cbe6d9bdf032c25e5dc1dcbf980
drwxr-xr-x 1 root root 206 Jan  1  2023 98d52c423ec965abf994bfe5e5624d88b2a06179633b82d0f3ee5651f5d4bf33
drwxr-xr-x 1 root root 206 Jan  5  2023 a2cdae2ed07c8ff0d82256dcfb032e2931172a7145bcd5deede43f9d0d048382
drwxr-xr-x 1 root root 268 Jan  5  2023 a4403080deadf73eac5db2005aa3519ec9fb48e286c22cdec6d1400b4f5087b0
drwxr-xr-x 1 root root 226 Jan  5  2023 a4403080deadf73eac5db2005aa3519ec9fb48e286c22cdec6d1400b4f5087b0-init
drwxr-xr-x 1 root root 132 Dec 23  2022 aa78e35692a9ca7b5b91bf6f0e12144a1560be043de32305d699b17f8184e900
drwxr-xr-x 1 root root 154 Dec 23  2022 afb57e4974af5d01b99f857f63fb06c50e51b937a1733c39515d2b7e71bf9b15
drwxr-xr-x 1 root root 150 Jan  1  2023 b06b718d3be66bec83ed47f1fc34eb373708548f5c5548ae6880f8182f1f2151
drwxr-xr-x 1 root root 206 Jan  1  2023 b8a712e044679c80e7ba7bc4e0e615d521efd03a9ff0c3a9d5ca11f2189d270d
drwxr-xr-x 1 root root  52 Dec 23  2022 b90e4eedcfb2f531f4517eb178734c2ae2879f14c53909a1f5f510cc3f983173
drwxr-xr-x 1 root root 206 Jan  1  2023 b9950e56d967eb89fed65cfd4f239dbfc191429b21843aa98e1b03a64864230b
drwxr-xr-x 1 root root 114 Jan 31  2023 bea3a43c6a06b238e69d1ed4accc12f6d66eaa942a6e11c7f85ce0f4ebdc4fbf
drwxr-xr-x 1 root root 206 Jan  5  2023 bf72ad795fadd33896169268782e2725b303a39112a99a2042c55a0ffa729281
drwxr-xr-x 1 root root 184 Jan  1  2023 c74e5e7db4963d1edf984a2c1680f717d5dc4382260e23688e1dd6c7902cdac6
drwxr-xr-x 1 root root 206 Jan  5  2023 cf0c707d93b67dcc89cb67c607c8114b3ef860833e13f01d65c95f8ec15e0575
drwxr-xr-x 1 root root 234 Dec 23  2022 d19e0c527a04110aa1614afb7907dcc72ad6b433a45dbb0ce542173fd9b76d60
drwxr-xr-x 1 root root 168 Dec 23  2022 d19e0c527a04110aa1614afb7907dcc72ad6b433a45dbb0ce542173fd9b76d60-init
drwxr-xr-x 1 root root 154 Dec 23  2022 e82062427f78d0666265c2815fec46c6bac80b434100be4cb5c603126c2fb9f9
drwxr-xr-x 1 root root 112 Dec 23  2022 ebb2fd0de5a2a7e0e2fa5e5d379faa5fab06ee3adc20b2abade71eeef8da874e
drwxr-xr-x 1 root root 154 Dec 23  2022 f037cb56f45dde71b258cb012cc83b007f5958fd3dea2c789702079b5574b225
drwxr-xr-x 1 root root 134 Dec 23  2022 f30a394e0f14c8d190e8421b6a5b9cdb1be0a6a047b4ec36ae8f4db37acfc7d6
drwxr-xr-x 1 root root 154 Dec 23  2022 f357df010e4b07103147656fc1c617a301aef64c9208871b3d7d8f7e7f21d476
drwxr-xr-x 1 root root 206 Jan  1  2023 f961600215713d31a05f7858c0d6bf87aef24b8c6c817efedb8f29aa380b13b5
rockstor:/mnt2/Pool1/Rockons/btrfs/subvolumes #
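As a cross-check (my assumption being that docker’s btrfs storage driver creates one subvolume per image layer, and using the mount path from the error message), btrfs itself can be asked which of these entries are real subvolumes, for example:

# list subvolumes under the Rock-ons pool mount and filter for the ID from the error
btrfs subvolume list /mnt2/Rockons | grep d19e0c52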

So one thought is to uninstall all Rock-ons, stop the Rock-on service, recreate the /mnt2/Pool1/Rockons share (empty), restore the important Emby files from backup, and reinstall the Rock-ons. I’ve only got four Rock-ons installed, so this is feasible, but I’m blocked (I think) by the inability to delete Rock-ons from the UI.

Or can I simply stop the Rock-on service, delete /mnt2/Pool1/Rockons/containers/141a02b37e4b142beea55d04e1896de8797f64bae10306de54383bc320ac944d (yes, this is the Emby container), and restart the Rock-on service? In that case, Emby should be magically gone from the UI list and could be installed again.
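If I go the surgical route, I imagine the sequence would be something like this (an untested sketch; the container ID is the Emby one from the error above):

# with the Rock-on service stopped, force-remove the stuck container
docker rm -f 141a02b37e4b142beea55d04e1896de8797f64bae10306de54383bc320ac944d
# only once docker no longer lists it, remove the leftover directory
rm -rf /mnt2/Pool1/Rockons/containers/141a02b37e4b142beea55d04e1896de8797f64bae10306de54383bc320ac944d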

The risk in either case: I don’t know what metadata Rockstor keeps elsewhere that might need to be addressed for this approach to work.

(Still hoping for comments here - thanks).


@wdc, it should actually be “Emby server” (capital “E” and lower-case “s”), the same spelling as you see on the Rock-ons page.

Fundamentally (though of course no fun if you have other containers already installed), you can stop the Rock-on service, delete the Rock-on root, recreate it, and reconfigure/restart the Rock-on service … but it seems you want a more “surgical” approach.

You could also check at the command line: docker ps to see whether the container continues to run. If it doesn’t, you could then see whether you can remove the container and image manually with docker rm <container name> and docker rmi <container image name or id>.
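Something along these lines (a sketch; embyserver is the container name taken from your error, and the image name or ID can be looked up with docker images):

docker ps -a           # is the container still known to docker?
docker rm embyserver   # remove the container by name
docker images          # look up the Emby image name or ID
docker rmi <image id>  # remove the image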


Sometimes we are our own worst enemies. Yes, the “s” worked. I still have some more work to do on it, but work and football got in the way.

Thanks, @Hooverdan


Working again. I ended up rebuilding the whole Rockstor share.


Since this is the first result in Google when one has a perpetual “Uninstalling…” Rock-on, I will dump my solution here:

Newer versions use poetry, so the script has to be called a little differently:
/opt/rockstor # poetry run delete-rockon <rockon_name>

Afterwards, you need to press the Update button in the Rock-ons > All tab.
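Putting it together for this thread’s case (the name has to match the Rock-ons page exactly, so quoting is needed because of the space):

cd /opt/rockstor
poetry run delete-rockon "Emby server"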
