Released in testing channel updates on 19th October 2023
Apparently this is our 100th GitHub Release
Overview
DEVELOPERS & TESTERS - we are now around mid-cycle in our current testing phase.
Thanks to @Flox (co-maintainer) & @Hooverdan (forum moderator and all-around helper).
Here we have a continuation of our technical debt resolution. Bit by bit.
We also have some improvements re Rock-on definition retrieval. Thanks to @Hooverdan & @Flox, working with forum members @greven and @jimla1965, for helping to highlight and investigate this issue. Retrieval should now take around 1/3 of the time, and we run this task in a more appropriate thread type. If you had this problem before, please report here on the forum if it is now fixed for you.
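For the curious, a minimal sketch of the general technique involved in this kind of speed-up: fetching many small definition files concurrently in a worker pool rather than serially, so the network waits overlap. This is purely illustrative; the URL, file names, and function names below are placeholders, not Rockstor's actual endpoints or code.

```python
# Illustrative only: fetch many JSON definition files concurrently.
# BASE and NAMES are placeholders, not Rockstor's real endpoints.
import json
from concurrent.futures import ThreadPoolExecutor
from urllib.request import urlopen

BASE = "https://example.com/rockon-definitions/"  # placeholder URL
NAMES = ["plex.json", "syncthing.json", "jellyfin.json"]  # placeholder list

def fetch_definition(name: str) -> dict:
    """Retrieve and parse a single definition file."""
    with urlopen(BASE + name, timeout=10) as resp:
        return json.load(resp)

def fetch_all(names: list[str]) -> dict[str, dict]:
    """Fetch all definitions with a small worker pool: overlapping the
    network waits is what yields a multiple-times speed-up over a
    serial loop when fetching many small files."""
    with ThreadPoolExecutor(max_workers=5) as pool:
        return dict(zip(names, pool.map(fetch_definition, names)))
```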
And … drum roll … @Flox has done our initial Tailscale integration.
This one will definitely need some domain expertise (that I don’t have), so if you use Tailscale, please give this new service a try. It will not work without first, by hand, adding the upstream Tailscale repository and installing the package. See the following comment from an issue in our rockstor-installer for install instructions: Pre-install Tailscale · Issue #142 · rockstor/rockstor-installer · GitHub
Released in testing channel updates on 16th January 2024
1st Stable Release Candidate (RC1)
As can be seen by these changes, and those preceding them in our ongoing 5.0.* testing phase, we are in the process of updating our base dependencies, and hopefully fixing all that ‘falls’ in the process. This release is marked RC1 as we now need wider testing to begin our push towards a more production-ready/tested release that we can promote as Stable. Our last such release was 4.6.1-0 (preceded by 4.1.0-0), and our next intended Stable release version will be 5.1.0-0, if our plans pan out.
Overview
We now run on Python 3.11 and Django 4.2 LTS, and have updated all our Python dependencies to their latest versions as of each update’s implementation. This was the primary, and rather ambitious, goal of this current testing phase.
There are inevitably regressions with such fundamental updates, and one particularly gnarly one has been our replication code: it contained many Py2.7isms, and we paid for this in the delay getting to this release. Thanks to forum member @niko for their reporting on this regression. With this release we expect to have restored the replication function of our prior stable version. There will inevitably be improvements to be made, but initially we needed, at a minimum, what we had in our last stable release.
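As a generic example of the class of Py2.7ism involved (not Rockstor's actual replication code): Python 2 let text and bytes mix freely, whereas Python 3 rejects this, so anything touching the wire needs explicit encode/decode steps.

```python
# Illustration of a common Py2 -> Py3 fix, not Rockstor's actual code.
# Python 2 allowed passing str straight to byte-oriented socket layers;
# Python 3 requires explicit conversion at the boundary.

def frame_message(share_name: str, payload: bytes) -> list[bytes]:
    """Build a multi-part wire message: text fields must be encoded
    to bytes explicitly under Python 3."""
    return [share_name.encode("utf-8"), payload]

def parse_header(header: bytes) -> str:
    """The reverse direction: bytes off the wire must be decoded before
    string comparisons ('snap' == header would silently be False)."""
    return header.decode("utf-8")
```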
As usual I would like to thank all testers, developers, & reporters, and specifically @Flox and @Hooverdan for their stellar efforts across the project.
Caveats
With this release we have now adopted a proper secrets management system. As such, the default/predefined/special-case cliapp credentials had to be redefined. This will break all existing replication, and is unavoidable given our other ongoing update-oriented goals. Replication in this testing phase has, however, been broken for some time now. Those intending to update to this release who rely on, or are willing to test, replication will need to re-establish any existing replication configuration: just start over to be sure of a clean start.
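To illustrate why re-established pairings are unavoidable, here is a minimal sketch, using only the Python standard library, of how a per-install random credential might be generated; Rockstor's actual secrets management may differ in detail.

```python
# A minimal sketch of per-install credential generation with the
# standard-library secrets module; not Rockstor's actual implementation.
import secrets

def generate_credential(nbytes: int = 32) -> str:
    """Return a URL-safe random token suitable as an internal service
    password. Because such a token is unique per generation, any peer
    that stored the old predefined credential (e.g. a replication
    pairing) can no longer authenticate and must be re-established."""
    return secrets.token_urlsafe(nbytes)

print(generate_credential())  # different on every invocation
```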
Released in testing channel updates on 29th January 2024
2nd Stable Release Candidate (RC2)
Note: there were a few surprised reporters here on the forum when our TESTING-channel-only RC1 (5.0.6-0) rpm failed to update from a number of older releases. This is entirely normal. Every rpm we publish is tested before release for its fresh-install capability (no .initrock, so DB wipe), but updates from any prior install (regardless of channel) are in no way guaranteed/tested/indicated-as-working. That said, as we enter the last stage of each testing phase (the RC* releases) we begin, resources/contributions/time allowing, to make best efforts on updating from the last Stable release only (currently 4.6.1-0). Given the extensive changes we have made, and usually make in each testing run, this often also ensures updates from any prior testing version in the current phase: but only later on in said phase.
Overview
With the above as context, we got much appreciated high quality feedback from forum-engaged testers/developers to further our collective effort. Special thanks to @KarstenV for their patient and productive persistence engaging with our stalwart Open Collective team members @Flox & @Hooverdan. The result diagnosed some subtle issues (cause, not effect) around our very first Poetry update, combined (happenstance, looking on the bright side) with yet another Django update, and a PostgreSQL format incompatibility.
We have now, in this testing phase alone, updated virtually every dependency we have; some, like our Python and Django versions, as many as 3 times each. This step-by-step approach was taken to maintain fresh-install rpm capability throughout all these changes, greatly easing community engagement/feedback on our priorities and progress, and further testing our developing update mechanisms. The testing channel is our route to production quality releases; consequently the first stable release rpm is always also published in testing first (at the end), as a result of our existing methods, unchanged for the lifetime of the project.
All 3 functional changes in this release pertain to testing-channel-only rpm update regressions introduced/reported/uncovered in our last testing rpm. Thanks also to @hmete1, @Warbucks, and @phemmy22 for their positive engagement on these and other issues here on the forum. We still have some ‘5.1.X-X Stable release’ Milestone issues to address, and are open to adding smaller ones if well defined.
What’s Changed
Failure to remove legacy poetry version in 5.0.6-0 rpm #2782 by @phillxnet in #2784
Released in testing channel updates on 12th February 2024
3rd Stable Release Candidate (RC3)
Thanks to @Mark93, @Warbucks, @DrHolzer, @Tex1954, @McFaul for some excellent reporting, diagnostic, & confirmation efforts here on the forum with the venerable @Flox & @Hooverdan. We have here a specific fix for an SMB export regression introduced in 5.0.6-0 and persisting through the 5.0.7-0 testing channel release. The fix rolls in a more centralised approach to managing our environment variables under Poetry; the root cause of the issue was a lack of universal access to a required environment variable associated with our new-in-5.0.6-0 dedicated secrets management.
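A minimal sketch of the failure mode and the fail-fast style of remedy: a required environment variable that is visible from one entry point but not another. The variable name below is hypothetical, not Rockstor's actual one.

```python
# Illustration of the failure mode: an env var set in one execution
# context (e.g. a systemd unit) but absent in another (e.g. a Poetry
# venv invocation). Variable name is hypothetical.
import os

def get_required_env(name: str) -> str:
    """Fail fast with a clear message rather than limping on and
    producing a confusing downstream error (here, a broken SMB export)."""
    value = os.environ.get(name)
    if value is None:
        raise RuntimeError(
            f"{name} is unset: ensure it is exported in every context "
            "(systemd unit, Poetry venv, cron) that runs this code."
        )
    return value

# Usage (hypothetical variable name):
#   secret_path = get_required_env("ROCKSTOR_SECRET_FILE")
```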
We also have an unburdening of our prior excessive use of rpm’s %posttrans scriptlet: we had rather over-stretched its intended use, I think. Hence the adoption of a new rockstor-build.service to more easily surface issues associated with our build.sh script, in turn associated with a number of update failures experienced in this and the last testing phase. Future such issues should now be easier to identify and resolve on testing instances. There has also been an improvement in the debug content of the now well-appreciated poetry-install.txt file!
For all those involved in the current SMB export investigations: please also note the following ongoing (open) issue:
Released in testing channel updates on 19th April 2024
4th Stable Release Candidate (RC4)
As usual: thanks to all those who have helped here on the forum, in GitHub issues (e.g. GitHub user ubenmackin’s extensive reporting/interactions re “Disk miss-attribution to ROOT”), and in side-channel chats/support email.
The main fixes here hopefully address the last remaining issues re our multiple Python updates within this single testing phase (now on Py3.11), a revamp/extensive clean-up/simplification of our lowest level of drive/pool management, and a lot of background maintenance updates to our first- and higher-order dependencies. Plus a few Web-UI tidies.
An interesting, and long awaited (by me), improvement that has come from the low-level disk management re-do is our new stance on the ROOT (OS/System) Pool. We no longer recommend that this is ever imported/used. Its import option is now marked as “Advanced Users Only”: however, this will only be evident on a completely fresh install. I’ll try to cook up a DB command to return existing installs to this state, as it brings some significant Web-UI speed/usability improvements. The ROOT pool has never really been managed by Rockstor’s Web-UI, and that has to remain the case: we should strongly discourage its use from now on. This new stance further enforces a separation of OS from Data. The ‘ROOT’ pool that we inherit from upstream is actually pretty complex, given we enforce a boot-to-snapshot capability, and our ‘view’ of it has always been very simplified. Again, existing installs updating to this version will maintain their existing ‘ROOT’ Pool managed status; but plan on this not being the case for fresh installs resulting from all future installers.
Released in testing channel updates on 17th June 2024
5th Stable Release Candidate (RC5)
Our usual thanks to all those who have tested and reported here on the forum to get us ever closer to our first 5.1.x-x Stable release. Special mention this time to our intrepid forum moderator and all-around contributor @Hooverdan, who found a rather prominent bug that crept in at 5.0.6-0 and remained through to 5.0.9-0: see changelog below for details. As always, we also have @Flox to thank for yet more investigations & interpretations.
Due to the imminent Stable release, and the fact that our recently refreshed downloadable installers (including one based on openSUSE Leap 15.6) contain this bug, where an upstream default was not applied, I’ll get to publishing quickly.
Released in testing channel updates on 3rd July 2024
6th Stable Release Candidate (RC6)
Thanks to all who have helped us to get this far. Special thanks to forum members @coffax, @McFaul, & @mpidala for their forum interactions helping to pin down the bugs that are now fixed in this release. As always, thanks also to @Flox, co-maintainer, and @Hooverdan, forum moderator & all-around core contributor; both of whom helped with the fixes here too.
What’s Changed
5.0.9-0 & 5.0.10-0 lsblk whitespace only values - not enough values to unpack #2853 by @phillxnet in #2855
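For context, the class of bug named in this issue can be sketched as follows: columnar lsblk output with a whitespace-only field collapses under str.split(), leaving too few tokens for a fixed-arity unpack. The remedy shown (parsing lsblk's KEY="value" pairs format) is one common approach, not necessarily Rockstor's exact fix.

```python
# Sketch of the bug class, not Rockstor's actual code.
import re

# SERIAL here is whitespace-only, so split() yields 3 tokens, not 4:
line = "sda   FakeModel        465.8G"
# name, model, serial, size = line.split()
# -> ValueError: not enough values to unpack (expected 4, got 3)

# One common remedy: ask lsblk for unambiguous KEY="value" pairs
# (`lsblk -P`) and parse those, so empty values survive intact.
pair_line = 'NAME="sda" MODEL="FakeModel" SERIAL="   " SIZE="465.8G"'
PAIR_RE = re.compile(r'(\w+)="([^"]*)"')

def parse_pairs(line: str) -> dict[str, str]:
    """Map KEY -> value for one line of `lsblk -P` output."""
    return dict(PAIR_RE.findall(line))

fields = parse_pairs(pair_line)
assert fields["SERIAL"] == "   "  # whitespace-only value preserved
```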
Released in testing channel updates on 11th July 2024
7th Stable Release Candidate (RC7)
Thanks to all who have helped us to get this far. Special thanks to @Flox co-maintainer, and @Hooverdan forum moderator & all-around core contributor.
Bar some recent forum reports that have yet to be resolved fully to the GitHub issue stage, we have now cleared our next Stable Milestone. Alas, we have an outstanding Pi4 service failure on first reboot (noted on the downloads page), though the service works if then started by hand! And a curious Web-UI ‘scrub status unknown’ issue: but the scrub is nonetheless executed OK! As always: bit by bit.
Released in testing channel updates on 17th July 2024
8th Stable Release Candidate (RC8)
As always: special thanks to @Flox co-maintainer, and @Hooverdan forum moderator & all-around core contributor.
@Tex1954 thanks for the “unknown” scrub reporting: this should be fixed now. We had a hiccup in btrfs-progs version sensing, now improved to be more robust.
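A sketch of what more robust version sensing can look like: extracting the numeric version from `btrfs version` output with a pattern match rather than assuming a fixed layout, and raising loudly on unrecognised output rather than mis-reporting. Not Rockstor's actual implementation.

```python
# Illustrative version sensing, not Rockstor's actual code.
import re

def parse_btrfs_progs_version(output: str) -> tuple[int, ...]:
    """Accept e.g. 'btrfs-progs v6.5.1' and return (6, 5, 1); raise if
    no version-like token is present rather than guessing 'unknown'."""
    match = re.search(r"v(\d+(?:\.\d+)+)", output)
    if match is None:
        raise ValueError(f"unrecognised btrfs version output: {output!r}")
    return tuple(int(part) for part in match.group(1).split("."))

assert parse_btrfs_progs_version("btrfs-progs v6.5.1") == (6, 5, 1)
```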
@Stevek thanks for reporting your findings re the Share resize: it related to an artifact of an imported pool, but all good, as we were due for some quota support improvements.
I would like to draw attention to the ‘13’ minor version designation here: please test extra thoroughly.
Released in testing channel updates on 27th August 2024
9th Stable Release Candidate (RC9)
As always: special thanks to @Flox co-maintainer, and @Hooverdan forum moderator & all-around core contributor.
@andyz Thanks for the detailed report re the system reserved home share issue, now fixed in this release.
@MyOwn Thanks for the Pi4 reboot confirmation. We finally tracked this down to extra systemd config being required due to the lack of a persistent real-time clock. That was a tricky one.
Counter to expectations, our prior 5.0.13-0 release was actually drama-free: so we are definitely getting there. Only minor/cosmetic/maintenance changes in this release, but we do have some long-awaited multi-container Rock-on fixes that we squeezed in to accommodate a specific Rock-on in-the-works. More on this when the time comes.
The remaining goals for the current testing channel phase are now minimal. Only show-stoppers will be entertained, beyond some more cosmetics, before we have our long-awaited 5.1.x-x Stable tag.
A special testing request goes out to those familiar with our Share replication subsystem. Our “Routine update of dependencies” introduced a new major version of pyzmq. This is as-yet untested: but the prior version worked just dandy.
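For anyone wanting a quick sanity check of the new pyzmq major version before exercising replication proper, a minimal smoke test such as the following (a REQ/REP echo over an in-process transport) confirms the basic send/receive path. This is not Rockstor's replication protocol, just a generic pyzmq exercise.

```python
# Minimal pyzmq smoke test: REQ/REP echo over inproc transport.
import zmq

ctx = zmq.Context.instance()
rep = ctx.socket(zmq.REP)
rep.bind("inproc://smoke-test")  # bind before connect for inproc
req = ctx.socket(zmq.REQ)
req.connect("inproc://smoke-test")

req.send_string("ping")              # REQ always sends first...
assert rep.recv_string() == "ping"
rep.send_string("pong")              # ...REP must reply before the next request
assert req.recv_string() == "pong"

req.close()
rep.close()
ctx.term()
```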
Released in testing channel updates on 19th November 2024
10th Stable Release Candidate (RC10)
As always: special thanks to @Flox co-maintainer, and @Hooverdan forum moderator & all-around core contributor.
@cellisten Thanks for a follow-up fix re leading spaces in lsblk values: hopefully we are now back where we were on this front, but with far more maintainable code following the major rework of that whole area during the current testing phase.
@simon-77 Thanks for the detailed report re our NFS exports restoring as read-only. That led, in turn, to a few improvements in our NFS code, including input validation improvements and a revamp of our test coverage, which had fallen into disrepair.
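As a hedged sketch of the sort of input validation involved: checking that an NFS client/host field looks like an IP, a CIDR network, or a (possibly wildcarded) hostname before it reaches /etc/exports. The helper below is hypothetical and deliberately simplified; it is not Rockstor's actual validator.

```python
# Hypothetical, simplified NFS client-spec validation sketch.
import ipaddress
import re

# Deliberately permissive hostname pattern: letters, digits, dots,
# hyphens, and '*' wildcards; enough to reject shell metacharacters.
HOSTNAME_RE = re.compile(r"^[A-Za-z0-9*][A-Za-z0-9*.-]*$")

def valid_nfs_client(spec: str) -> bool:
    """Accept an IP address, a CIDR network, or a wildcarded hostname;
    reject anything else before it is written to /etc/exports."""
    try:
        ipaddress.ip_network(spec, strict=False)  # covers IPs and CIDRs
        return True
    except ValueError:
        pass
    return bool(HOSTNAME_RE.match(spec))

assert valid_nfs_client("192.168.1.0/24")
assert valid_nfs_client("*.example.com")
assert not valid_nfs_client("bad host;rm -rf /")
```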
@Stevek Thanks for your patient testing and reporting re a newer btrfs issue that seems to have thrown our replication. All in all, in this release we have resolved one of the issues found; but a quota-related one remains without a reliable reproduction, so I was unable to reproduce it in order to prove a fix. I suspect it requires a non-trivial quota arrangement. Irrespective, we now have one fixed, so the remaining issue, if it still surfaces, should now be easier to narrow down. Leap 15.6 came with a far newer kernel, so we have had to adapt to some newer btrfs behaviour there. Bit by bit.
This release has a lot of entries, but many are cosmetic in nature, along with additional dependency pinning, ready for the next Stable release. We have also now addressed the confusing and contradictory Rock-on pre- & post-config summary tables. That one has been hanging around for some time now. As always there are more improvements to be had, but again, more eyes and fingers needed here.
And again: a special testing request goes out to those familiar with our Share replication subsystem, re the issues generated from @Stevek’s testing reports.