First, we are evaluating Open-E in our lab.
We've created three logical volumes as shown here:

The 247GB volume is file-based and is used as the datastore for the vSphere host; we have had no problems using it to store and run virtual machines and ISOs.

We wanted to benchmark and test RDM (raw device mapping) disks in VMware as both file and block devices, so the two 40GB logical volumes were created for this purpose: lv0001 is block-based and lv0002 is file-based.
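For anyone wanting to reproduce this, an RDM mapping file can also be created manually from the ESXi shell with vmkfstools; a sketch is below (the naa device ID and datastore paths are placeholders, not our actual values):

```shell
# Create a virtual-compatibility RDM mapping file (-r).
# Use -z instead for physical (pass-through) compatibility mode.
# The naa ID and paths below are placeholders for illustration only.
vmkfstools -r /vmfs/devices/disks/naa.xxxxxxxxxxxxxxxx \
    /vmfs/volumes/datastore1/testvm/testvm-rdm.vmdk
```

The resulting .vmdk mapping file is then attached to the guest as an existing disk.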

I've successfully added these as RDM disks to guests in vSphere, but the guests will not boot; they hang at the VMware BIOS screen.
If I replace the RDM disk with a VMDK on the datastore, the failure no longer occurs.

This occurs even if I select "Force BIOS Setup" in the virtual machine's options screen.
It also occurs regardless of whether I use the file-based or block-based device, and regardless of whether the RDM is physical or virtual.

Furthermore, the guest cannot easily be powered off once this has happened, and I have to reboot the host to continue.
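In principle a hung guest can be force-killed from the ESXi shell instead of rebooting the whole host; a sketch (the World ID shown is a placeholder for whatever the list command reports):

```shell
# List running VMs to find the stuck guest's World ID
esxcli vm process list

# Force-kill the hung VM by its World ID (123456 is a placeholder)
esxcli vm process kill --type=force --world-id=123456
```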

Any ideas?

Here is what I see when I power on the guest: