Folks,
I have a btrfs RAID 10 spread across 4 SSDs. The main drive is
sda. When I reboot, after the kernel loads I see "A start job is
running for dev-sda1.device". I've let this sit for 3 hours with no
progress in the boot, so it seems to be a race condition of some
sort.
If I restart around 20 times, one of those attempts will let the
system continue booting, though I've been lucky enough to have it
happen within 4 tries. I have the following systemd unit:
/etc/systemd/system/local-fs-pre.target.wants/btrfs-dev-scan.service
[Unit]
Description=Btrfs scan devices
Before=local-fs-pre.target
DefaultDependencies=false
[Service]
Type=oneshot
ExecStart=/sbin/btrfs device scan
[Install]
WantedBy=local-fs-pre.target
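For reference, I assume the ordering, and whether the scan actually
ran, can be checked from a successful boot with something like the
following (unit names are just what I'd expect systemd to generate
on this setup):
systemd-analyze critical-chain local-fs.target
systemctl list-dependencies --before btrfs-dev-scan.service
journalctl -b -u btrfs-dev-scan.service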
The other disks were joined to the filesystem on /dev/sda with
btrfs device add -f /dev/sdc /dev/sdd /dev/sde /
followed by
btrfs balance start -dconvert=raid10 -mconvert=raid10 /
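For completeness, I believe the resulting layout can be confirmed
with something like:
btrfs filesystem show /
btrfs filesystem df /
which should list all four member devices and the raid10 data and
metadata profiles.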
My fstab contains:
/dev/sdf1 /boot vfat defaults,noatime,discard 0 2
/dev/sda / btrfs defaults,noatime,discard,compress=zlib,autodefrag 0 1
/dev/sdb /home/hometheater btrfs defaults,noatime,discard,compress=zlib,autodefrag 0 2
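For reference, I understand a multi-device btrfs root is usually
referenced by its filesystem UUID rather than a single member device,
optionally with device= hints so the kernel can find every member
even if no scan has run. Something along these lines (the UUID is a
placeholder, and the device list just follows the layout above):
UUID=<fs-uuid> / btrfs defaults,noatime,discard,compress=zlib,autodefrag,device=/dev/sda,device=/dev/sdc,device=/dev/sdd,device=/dev/sde 0 1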
Any ideas would be appreciated. For what it's worth, I have smartd running.
Thank you,
--
Bearcat M. Şándor
Feline Soul Systems LLC
Voice: 872.CAT.SOUL (872.228.7685)
Fax: 406.235.7070