I’m at a complete loss and have tried everything, but for some reason I just cannot get Rockstor installed on this PC.
I have searched the internet and the forums but have not found anything related to this.
The installation finished successfully, but after reboot I get the error “No bootable device found”.
If I select BIOS boot options and specify the installation disk, then it boots, but obviously this is not ideal.
I have tried:
Both legacy and UEFI installations
I have even removed all the other hardware and disks, leaving only the installation disk present
I have switched SATA cables and ports on the motherboard, no dice
I have cleared the CMOS multiple times by now
It is almost like GRUB is not fully installed.
Could anyone assist me with some more troubleshooting tips please?
Installation media: Rockstor Leap 15.3 Generic
Some more info:
Intel DP67BA motherboard
120GB Patriot Burst Elite SSD
And as you see from that issue, it works just fine on the potentially idealised UEFI-only KVM proof of concept. But this does not appear to affect all systems; some UEFI systems seem to boot just dandy.
Essentially we rely on our upstream installer builder, kiwi-ng, for this functionality. And our own kiwi-ng config here:
looks to be good, and as mentioned, we don’t appear to have had many reports of this failure. One of the core Rockstor developers, @Flox, has a machine that is similarly affected. I have access only to the KVM UEFI setup where our installers work just fine. All a bit of a mystery that has had similar reports from completely different OS builds in the kiwi-ng upstream repository. But it still may be that we are doing it wrong. For example, you could try building your own installer from the rockstor-installer repo and try other settings, for example with this:
firmware="efi"
changed to:
firmware="uefi"
Which I think adds secure boot functionality. Again, this is all handled by our installer builder and so is in their hands. I also suspect they prove UEFI functionality via the KVM method that works for me too.
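If you do want to experiment along those lines, the build step itself is just a kiwi-ng invocation run against a checkout of the rockstor-installer config. The sketch below is illustrative only: the profile name, description path and target directory are assumptions, not the exact documented Rockstor build command.

    # hedged sketch: rebuild the installer after editing firmware="efi" in the profile
    # (run as root; profile name, paths and target dir are example values only)
    kiwi-ng --profile Leap15.3.x86_64 system build \
        --description ./rockstor-installer \
        --target-dir /tmp/rockstor-installer-build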
Take a look at that issue, as the original reporter found a workaround of simply re-installing GRUB before the first reboot, at the very end of the installer. The command used was:
But you may find that no options are actually required.
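For illustration only, and not the exact command from that issue, a GRUB re-install from the installer’s shell before that first reboot would typically look something like the following; the EFI directory and bootloader id are assumptions based on a standard openSUSE layout.

    # hedged sketch: re-register GRUB for UEFI from the still-running installer,
    # assuming the installed system's EFI partition is mounted at /boot/efi
    grub2-install --target=x86_64-efi --efi-directory=/boot/efi --bootloader-id=opensuse
    grub2-mkconfig -o /boot/grub2/grub.cfg

For a legacy (BIOS) install the equivalent would simply be grub2-install /dev/sdX (with sdX being your system disk) followed by the same grub2-mkconfig step.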
Let us know how you get on. As stated, this is currently a major pain and we have been watching upstream for some time, as I suspect there is something more they need to do to cover some hardware quirks. I’ve linked to our own report of this in, for example, the following issue in their repository:
Debian OEM image is not rebootable · Issue #1991 · OSInside/kiwi · GitHub, which was on a Debian base of all things. It’s definitely quirky, but what else do we expect from UEFI; its code base is larger than the Linux kernel, you know!!!
And all manufacturers have their own interpretation of exactly how, or if, to register/boot/allow-boot the available options.
Have you tried disabling secure boot, by chance? Not always an option, but it was promised as such in the early days of UEFI’s introduction.
Let us know how you get on with that workaround. We are not UEFI/EFI experts and are basically awaiting a fix or workaround ourselves. And this may end up being a forced re-install at the end of the install, or some such hackery. But if it works for some setups (KVM proof in the issue) then why not others?
Incidentally, we have this on our “First v4 stable release” short-list of stuff to address in our installer repo here:
We have now reached v4 stable rpm (released yesterday and soon to be promoted to stable repo):
so hopefully we can tend to the remaining issues in the installer repo. But we do need help on how to sort this seemingly system-dependent UEFI failure.
Hope the above helps. And thanks for a nicely detailed report.
Just a quick update, unfortunately not a happy one.
I could not get the installer to correctly install GRUB, and I’m quite miffed at this since I really love Rockstor.
I now unfortunately have to let it be for a while, maybe wait for a new release to come out and then try again. In the meantime I’ll have to use a competitor…
Sorry to hear that!
Quick question, did you remove the install media after completion? I did run into a similar issue when I rebooted with the USB stick still plugged into the system.
The USB installation media is unplugged each time.
I would not fret about it too much; the hardware I’m trying to install it on is one of the very first Intel 1155 EFI motherboards, which I use for a SOHO solution. I have searched for a BIOS update, but unfortunately Intel does not support the board any more and I cannot find an archive of the motherboard’s BIOS anywhere. Maybe there was a BIOS fix to make the board play nice with EFI installation media; I will never know…
I know this thread is aging, but I’m having the same issue. I’ve tried reinstalling GRUB to no avail. Ironically, I had a v3 stick lying here and just for kicks ran the installer, and it was fine: it found the boot drive and booted as normal. I’m not sure what to do aside from ripping hair out. So however the installer was set up for the CentOS version, it seems to be fine, but no dice for the openSUSE version. It seems like every time I really want to try this NAS something like this happens and I wind up giving up on it. I’m looking for a reason to leave TrueNAS for various reasons, but it’s the only thing that seems to be stable for me. I may wind up throwing Unraid on it just to try it out. Maybe a future release will have this issue ironed out.
@jjairpants Welcome to the Rockstor community forum.
It would help to have a little more detail here on your v4 experience, i.e. how does your experience differ from that shown in the installer guide here:
Use cursor keys to highlight, then the “Enter” key to select. Only devices less than 5000 GB (5 TB) are shown. Larger disks are assumed to be data disks.
Does that tally with the info you have there but haven’t detailed here? Always best to pop in some details to help those here help you. If that is the case, then you will need to take GRUB up on its “edit” offer and remove that safeguard. The limit used to be much smaller, but it ended up catching far too many folks. We favour having a dedicated smaller system drive to encourage OS and data separation, plus we still don’t offer proper redundancy on the system drive, as per the following recent forum thread discussion:
The bit you would need to remove on the kernel command line would be the following from our installer config GitHub repository:
“rd.kiwi.oem.maxdisk=5000G”, found in the appropriate profile section. Removing it should also work via a live grub boot edit on any existing install, however.
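As a sketch of that live edit (standard GRUB key bindings, nothing Rockstor-specific): highlight the entry at the installer’s GRUB menu, press “e”, delete rd.kiwi.oem.maxdisk=5000G from the line beginning with “linux”, then press Ctrl-X to boot. Once a system is up, you can confirm whether the cap is on the running kernel command line with:

    # generic check: prints the cap if it is present on the current kernel command line
    grep -o 'rd.kiwi.oem.maxdisk=[^ ]*' /proc/cmdline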
Let us know if this was it. We need to add documentation on this front, I think. Safety measure or otherwise, it’s frustrating to come across it without knowing why. We could link to a short how-to from the section I referenced above. But that’s a mighty large system disk already (> 5 TB) just to hold a few GB of system OS files.
No, the boot drive shows up in the installer, sorry I was not clear about that. It installs just fine, and before the reboot I can reach the web interface and everything works as normal. I tried to install GRUB from that point after the install, when the command line is still up, but it didn’t seem to have any effect. As a last-ditch effort I copied the boot folder and the EFI folder from the installer to the boot drive, and I still could not get it to boot. Not sure why the installer will boot just fine, but when it’s done and you go to do the initial reboot it can’t find the boot device. I ran across other threads that showed adding some arguments/options to the grub2-install command; that didn’t work either. I have secure boot disabled on this machine. Like I mentioned before, the CentOS install boots fine after install. Not sure what changed between installers besides the shift to SUSE, but it doesn’t like it.
This is likely due to the BIOS allowing more lenient ‘removable’ device boot options, but applying more of the secure boot or EFI limitations on the final HDD boot. We have seen this before, and it’s actually an ongoing issue where EFI is causing some problems, yet we have proven our compatibility via a ‘factory’ example of EFI within QEMU. Take a look at the following issue, where there may be a workaround for you. As stated, we appear to be fine on some systems and not on others, depending on the particular UEFI implementation and a myriad of potential variables.
As you see from that repo, we use openSUSE’s kiwi-ng to build our new v4 “Built on openSUSE” installer. It’s what it was built for. But there remain some inconsistencies with UEFI in particular, and we likely have some shortfalls on our config side as well. But as you also see in that issue, we install and boot the resulting install just fine on a UEFI-only virtual machine, so in some settings at least we do work, hence the inconsistency claim here.
Have a read and let us know what worked for you. I know that upstream are looking at newer systemd boot capabilities that may help in this case, but alas this is not our expertise and we await further cross-hardware compatibility from our existing kiwi-ng config, or its improvement over time both on our side and upstream. Kiwi-ng has itself had similar reports, but again this is not our expertise.
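One generic thing worth checking on an affected machine (not a Rockstor-specific procedure, and the disk/partition values below are examples only) is whether the firmware ever registered a boot entry for the installed system:

    # list the UEFI boot entries the firmware knows about
    efibootmgr -v

    # if no entry exists for the installed disk, one can be created manually;
    # /dev/sda and partition 1 are assumptions, adjust to your EFI system partition
    efibootmgr --create --disk /dev/sda --part 1 \
        --loader '\EFI\opensuse\grubx64.efi' --label 'openSUSE'

Some firmware ignores NVRAM entries altogether and only boots the fallback path \EFI\BOOT\BOOTX64.EFI, which may be part of why some boards work and others don’t.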
Thanks for the extra, more exact info. It always helps with getting to where the fault may be and at least, hopefully, providing either workarounds or fixes. On that front, you may have a quick fix by switching to a more permissive boot option within the BIOS, although the installer may be sensitive to this, so a re-install may also be needed. UEFI boot is all rather a mess; did you know that typical UEFI code in the BIOS/motherboard firmware is now larger than a regular Linux kernel!! I should dig up the link for that one.
Yes, there is nothing in common really, bar the use of GRUB. We had a custom-rolled installer before, built via this now-legacy repo:
and now we use the excellent kiwi-ng from the openSUSE team, via the already-referenced repo of our config for this tool here:
Hope that helps. And likely you are on the correct path, at least for now, regarding using the initial installer’s later kexec transition to the installed kernel to effect a re-install/re-config of the installed GRUB. That is, as you surmised, the failing point here, but only on some, not all, UEFI systems. We have had no reports of old-style BIOS boot setups failing. But again, a re-install may be required if you change firmware boot options.
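If the machine has already been rebooted and no longer starts, the same re-install can be attempted from any live/rescue Linux by chrooting into the installed root first. This is a generic rescue sketch with assumed device names (sda2 root, sda1 EFI partition), not a documented Rockstor procedure, and a btrfs root may need the appropriate subvolume options on the first mount.

    # hedged sketch: device names are assumptions, adjust to your layout
    mount /dev/sda2 /mnt
    mount /dev/sda1 /mnt/boot/efi
    for d in dev proc sys; do mount --bind /$d /mnt/$d; done
    chroot /mnt grub2-install --target=x86_64-efi --efi-directory=/boot/efi
    chroot /mnt grub2-mkconfig -o /boot/grub2/grub.cfg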
Well, that’s interesting. What I may do is try to install it to an external drive and see if it cooperates; that helps in understanding what’s going on. If it’s more “lenient” on USB drives I may be able to get it going. I’m fine with it booting off an external, my current setup is that way. I just had a drive to spare in there, so I thought I’d use it. I’ll post back when I get a chance to dive into it a little more.
—Update: installed it to an external USB drive. Still no dice, even if I manually select it in the BIOS boot menu. Still does the same thing: no boot device found. I really don’t have the time to keep playing with it, I’ve already spent way too much time trying to sort this out. So I guess I’m going to have to move on to something else. I really wanted to take advantage of the way it utilizes btrfs and software RAID, but I guess the closest I’m going to come to that is Unraid.
The alternative, I guess, would be to try and install full-blown openSUSE and Rockstor on top of it. It won’t be as lightweight, but I could try that.
----Update 2: installing openSUSE gave me the same result. It boots from the flash drive but won’t boot from the newly created boot drive. I know it isn’t the drive or something like that, because I have an internal drive with Ubuntu on it and it boots fine. I’ve also tried it with several different drives, all with the same result. So I can only assume that whatever the issue is, openSUSE’s build is at the root of it. I’m officially done with it, I think. You guys can get after openSUSE to fix their installer.
The “it” in this case is the motherboard/system BIOS, not our installer. But you do say you tried manually booting via the optional boot menu, which is often the thing that allows other boot options.
OK, so we have a shared problem with openSUSE, not surprising. Why don’t you drop the UEFI boot, just for now, in the motherboard firmware, so that it does legacy boot on all devices? That should get you up and running until we and openSUSE account more for whatever quirks your motherboard has regarding qualifying as a valid boot option.
This may be just down to the more popular distro, Ubuntu in this case, using a certificate (though you said you have turned off secure boot) or some UEFI shim that your system recognises, while openSUSE is not recognised.
Or you could try adjusting your motherboard BIOS/UEFI firmware settings to be more compatible with openSUSE, i.e. try a legacy boot option (BIOS or whatever). Thanks for the feedback; frustrating, I know, and yes we have, in the cited issue, linked to kiwi-ng with our proof of boot in a UEFI-only setting alongside known failure reports. All down to UEFI’s non-standard, or varying, implementations by manufacturers, I’m afraid; massive complexity never helps on that front. We are following kiwi-ng’s expertise on this one as it’s really out of our depth, hence the adoption. But if your old v3 worked then our new one should also, as v3 was not UEFI compatible and so only installed as a BIOS boot. Hence my suggestion regarding adjusting your system’s boot compatibility within the motherboard options and doing a re-install. It may open up many other options for you also.
Hope that helps and good luck with your NAS adventures to come.
I wasn’t trying to get it to boot in UEFI mode. I had secure boot disabled and legacy boot turned on; that’s why Ubuntu and the older version of Rockstor booted fine. It’s just baffling as to why, even in legacy boot mode, I still get the “no boot device found”. That’s what I’ve been trying to get around. I did some poking around: I took the external drive and plugged it into an older machine I had, and it does indeed boot. So there is something odd going on with that motherboard’s BIOS, maybe? I’ll check HP and see if there is a firmware update for the BIOS, but other than that I’m probably at a loss with it. I only tried the UEFI option as a last-ditch effort to see if I could get it to boot. Wasn’t holding my breath there, but it was worth a shot.
---- Checked for updates to the BIOS; it was up to date. Went back into the BIOS, reset to factory defaults, then went back in, disabled secure boot and re-enabled legacy. Still no dice. Then I went in and checked the TPM settings, something I had not thought of. It was set to hidden, but what I didn’t realize is that if you set it to visible, another menu comes up that asks if you want to disable TPM. Very counter-intuitive. So I disabled TPM, then set it back to “hidden” and tried one more time to boot… and wouldn’t you know it, it actually booted from the UEFI USB. Seems to be up and working. But I thought I would update this for anyone else who has lost countless hours of sleep fooling with it! It may be different for others, but it seems it was that TPM setting that was causing it the whole time.