Recover from drive failure on raid10

Hello,

I played around a bit with btrfs and RAID levels, but now I'm stuck on
recovering when the first disk is missing.

First I tried removing the second disk -> works!

# mkfs.btrfs -m raid1 -d raid1 /dev/sdb /dev/sdc
# mount /dev/sdb /mnt
- copy some data
- shut down, remove /dev/sdc, add an empty /dev/sdc, reboot
# btrfs device scan
# mount -o degraded /dev/sdb /mnt
# btrfs device add /dev/sdc /mnt
# umount /mnt
# mount /dev/sdb /mnt -> EVERYTHING OK AGAIN
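For reference, the full replacement sequence sketched above would usually also drop the record of the absent device and re-replicate the data; a minimal sketch (assuming sdb is the surviving disk and sdc is the fresh replacement, as in the example) might look like:

```shell
# Mount the surviving RAID1 member read-write in degraded mode
mount -o degraded /dev/sdb /mnt

# Add the fresh replacement disk to the filesystem
btrfs device add /dev/sdc /mnt

# Remove the bookkeeping entry for the failed, now-absent device
btrfs device delete missing /mnt

# Rebalance so both copies of every chunk exist again
btrfs balance start /mnt
```

On newer kernels and btrfs-progs, `btrfs replace start` can do the add/delete/rebalance in one step, but the sequence above matches the tools available at the time of this thread.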

But now I removed the first disk (sdb), and the steps above don't work
anymore.

If I try:
# btrfs device scan   (found my raid array)
# mount -o degraded /dev/sdb /mnt
mount asks me to specify the filesystem type; this was expected, because
sdb is empty. Now trying with sdc:
# mount -o degraded /dev/sdc /mnt
I get a bad superblock message.
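Before assuming the data is lost, it may help to check whether sdc's superblock is actually readable; a sketch of such a check (command availability depends on the btrfs-progs version):

```shell
# List the btrfs filesystems and member devices the tools can see
btrfs filesystem show

# Dump the superblock on the surviving disk (newer btrfs-progs)
btrfs inspect-internal dump-super /dev/sdc
# Older btrfs-progs shipped this as a separate binary:
# btrfs-show-super /dev/sdc
```

If the superblock dumps cleanly, the "bad superblock" message from mount likely comes from mount probing the wrong filesystem type rather than from actual corruption.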

What should i do now?

Thanks!
Nino

--
To unsubscribe from this list: send the line "unsubscribe linux-btrfs" in
the body of a message to majordomo@xxxxxxxxxxxxxxx
More majordomo info at  http://vger.kernel.org/majordomo-info.html

