vSphere 4.1 RDM unable to boot VM
First, we are evaluating Open-E in our lab.
We've created three logical volumes as shown here:
http://i1192.photobucket.com/albums/...6_13-53-52.png
The 247 GB volume is file based and is used as the datastore for the vSphere host; we have had no problems using it to store and run virtual machines and ISOs.
We wanted to benchmark and test RDM (raw device mapping) disks in VMware as both file- and block-based devices, so the two 40 GB logical volumes were created for this purpose, with lv0001 being block based and lv0002 file based.
I've successfully added these as RDM disks to guests in vSphere, but the guests will not boot; they hang at the VMware BIOS screen.
If I replace the RDM disk with a VMDK on the datastore, I no longer experience this failure.
This occurs even if I select "Force BIOS Setup" in the virtual machine's options screen.
It also occurs regardless of whether I am using the file-based or block-based device, or a physical or virtual RDM.
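For reference, I added the disks through the vSphere Client's Add Hardware wizard; as far as I understand it, the equivalent mapping files created from the host console would look roughly like this (the naa identifier and datastore paths below are only placeholders for my LUNs):

    # virtual compatibility RDM (placeholder device ID and paths)
    vmkfstools -r /vmfs/devices/disks/naa.600144f012345678 /vmfs/volumes/datastore1/rdmtest/rdmtest_virtual.vmdk

    # physical compatibility (pass-through) RDM
    vmkfstools -z /vmfs/devices/disks/naa.600144f012345678 /vmfs/volumes/datastore1/rdmtest/rdmtest_physical.vmdk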
Further, the guest can't be easily powered off once this has occurred, and I have to reboot the host to continue.
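Side note for anyone who finds this thread later: in theory the stuck VM can be killed from Tech Support Mode instead of rebooting the whole host; if I remember the ESXi 4.1 syntax correctly it is roughly:

    # list running VMs and note the World ID of the stuck guest
    esxcli vms vm list

    # force-kill that world (the World ID below is only an example)
    esxcli vms vm kill --type=force --world-id=1234

In my case I ended up rebooting anyway, and it obviously doesn't explain the hang itself.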
Any ideas?
Here is what I see when I power on the guest:
http://i1192.photobucket.com/albums/...6_14-34-20.png
Problems with RDMs in VMware VMs
We have exactly the same issue as the users above.
Environment:
Dell 2950 v3 with Open-E v6 latest build
Dell 2950 v3 VMware Hosts with VMware vSphere 4.1i Enterprise
Using a Raw Device Mapping hangs the VM that has the RDM attached (virtual and physical RDM, same issue).
Rebooting the VMware host releases the lock, and powering the VM with the RDM back on makes it hang again.
When we use the same volume and format it as a VMFS-3 datastore, there are no problems at all.
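For what it's worth, formatting that same LUN as VMFS-3 from the console would look roughly like this (the naa identifier and partition number are placeholders for our volume):

    # create a VMFS-3 datastore on the first partition of the LUN
    vmkfstools -C vmfs3 -S rdm_test_vmfs /vmfs/devices/disks/naa.600144f012345678:1

Either way, the problem only appears when the LUN is presented to the guest as an RDM.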
Some help from Open-E would be appreciated.
Thanks in advance