# Problem with RAID

## denudar

Hello,

 I installed Gentoo on a Dell PowerEdge T105 (http://www.dell.com/content/products/productdetails.aspx/pedge_t105?c=us&cs=04&l=en&s=bsd) with two SATA HDDs in RAID 1.

 I followed the Gentoo Software RAID guide on the wiki, and everything went smoothly. All partitions are mirrored (including swap), but there seems to be a problem when I reboot the machine: I get the message "unable to stop array md3 (that is, the root partition): device or resource busy".

 Here is my fstab:

```
/dev/md1                /boot           ext2            noauto,noatime  1 2
/dev/md3                /               xfs             noatime,nobarrier       0 1
/dev/md2                none            swap            sw              0 0
/dev/cdrom              /mnt/cdrom      auto            noauto,ro       0 0
#/dev/fd0               /mnt/floppy     auto            noauto          0 0
/dev/md4                /mnt/share      xfs             defaults,rw     0 0
```

Here is my /proc/mdstat:

```
Personalities : [raid1]
md1 : active raid1 sdb1[1] sda1[0]
      40064 blocks [2/2] [UU]

md2 : active raid1 sdb2[1] sda2[0]
      2000000 blocks [2/2] [UU]

md3 : active raid1 sdb3[1] sda3[0]
      45897600 blocks [2/2] [UU]

md4 : active raid1 sdb4[1] sda4[0]
      196201728 blocks [2/2] [UU]

unused devices: <none>
```

And this is my mdadm.conf:

```
ARRAY /dev/md1 level=raid1 num-devices=2 UUID=1910d34d:06bafa78:f08b425c:e12f1880
ARRAY /dev/md2 level=raid1 num-devices=2 UUID=11970775:79940c40:88974cbc:660f0820
ARRAY /dev/md3 level=raid1 num-devices=2 UUID=6446b4c6:66efa576:90a3fa26:d7f043ab
ARRAY /dev/md4 level=raid1 num-devices=2 UUID=7d0e6658:1092a13f:0dd2a9b6:29fb8558
```
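For reference, ARRAY lines like these don't need to be typed by hand; `mdadm` can emit them from the arrays that are currently running (a sketch, assuming the arrays are already assembled and you are root):

```shell
# Print one ARRAY line per running array, with level, device count and UUID:
mdadm --detail --scan

# Append them to the config file (back up /etc/mdadm.conf first):
mdadm --detail --scan >> /etc/mdadm.conf
```

Regenerating the file this way also guards against a stale UUID after a disk has been replaced and re-synced.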

And something I find weird in dmesg:

```
md: Autodetecting RAID arrays.
md: Scanned 8 and added 8 devices.
md: autorun ...
md: considering sdb4 ...
md:  adding sdb4 ...
md: sdb3 has different UUID to sdb4
md: sdb2 has different UUID to sdb4
md: sdb1 has different UUID to sdb4
md:  adding sda4 ...
md: sda3 has different UUID to sdb4
md: sda2 has different UUID to sdb4
md: sda1 has different UUID to sdb4
md: created md4
md: bind<sda4>
md: bind<sdb4>
md: running: <sdb4><sda4>
raid1: raid set md4 active with 2 out of 2 mirrors
md: considering sdb3 ...
md:  adding sdb3 ...
md: sdb2 has different UUID to sdb3
md: sdb1 has different UUID to sdb3
md:  adding sda3 ...
md: sda2 has different UUID to sdb3
md: sda1 has different UUID to sdb3
md: created md3
md: bind<sda3>
md: bind<sdb3>
md: running: <sdb3><sda3>
raid1: raid set md3 active with 2 out of 2 mirrors
md: considering sdb2 ...
md:  adding sdb2 ...
md: sdb1 has different UUID to sdb2
md:  adding sda2 ...
md: sda1 has different UUID to sdb2
md: created md2
md: bind<sda2>
md: bind<sdb2>
md: running: <sdb2><sda2>
raid1: raid set md2 active with 2 out of 2 mirrors
md: considering sdb1 ...
md:  adding sdb1 ...
md:  adding sda1 ...
md: created md1
md: bind<sda1>
md: bind<sdb1>
md: running: <sdb1><sda1>
raid1: raid set md1 active with 2 out of 2 mirrors
md: ... autorun DONE.
Filesystem "md3": Disabling barriers, not supported by the underlying device
XFS mounting filesystem md3
Ending clean XFS mount for filesystem: md3
```

There's also a problem when the `local` init script starts at boot: it takes some time (about 10 seconds).

Now that's a long post...

----------

## tobr

 *denudar wrote:*   

> I have all partitions mirrored (including swap) but it seems to be a problem when I reboot the machine, there is a message " unable to stop array md3 (aka, the root partition) device or resource busy"

 Well, I get the same with a similar setup for "/", and I've never had any problems with it.

I think the reason is simply that the root partition is in use until the very end of the shutdown process, so the system tries to unmount "/" before that is actually possible. As long as everything is sync(2)ed before the disks are powered down, there shouldn't be any problems.
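This behavior is easy to reproduce by hand: the kernel refuses to stop an md array while a filesystem on it is still mounted. A rough sketch, using the device names from this thread (run as root; the exact wording of the error varies with the mdadm version):

```shell
# Trying to stop the array backing the mounted root filesystem fails
# with a "busy" error, which is exactly what the shutdown scripts hit:
mdadm --stop /dev/md3

# An array whose filesystem has been unmounted stops cleanly:
umount /mnt/share        # /dev/md4 in the fstab above
mdadm --stop /dev/md4
```

Since the root filesystem can never be unmounted from a running system, the shutdown-time message about md3 is expected and harmless as long as "/" was remounted read-only and synced first.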

----------

## drescherjm

 *Quote:*   

> there is a message " unable to stop array md3 (aka, the root partition) device or resource busy" 

 

I see this a lot (on many machines), but in my opinion it is harmless.

----------

## djoe420

 *drescherjm wrote:*   

> 
> 
> I see this a lot (on many machines) but in my opinion it is harmless.

 

I'm having the same problem, and in my case it really is a problem: the next time I boot, the kernel can't detect the devices I use for LVM.

Then I reboot a couple of times and at some point it works again.

I'm using software RAID1 for /boot and /, plus a couple more partitions where LVM2 manages the space.

I know that this thread is quite old, but if anyone has any clue about this, I would appreciate it.
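When the arrays backing the LVM physical volumes don't come up at boot, they can usually be re-assembled and the volume group re-activated by hand, which at least narrows down whether md or LVM is at fault. A sketch (run as root; `vg0` is a hypothetical volume group name, substitute your own):

```shell
# Assemble any arrays listed in /etc/mdadm.conf that are not yet running:
mdadm --assemble --scan

# Re-scan for LVM physical volumes and volume groups on the md devices:
pvscan
vgscan

# Activate all logical volumes in the (hypothetical) volume group vg0:
vgchange -ay vg0
```

If this works by hand but not at boot, the usual suspect is ordering: LVM is being started before the md arrays it sits on have been assembled.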

----------

