The procedures for setting up a campus cluster are nearly the same as those for local clusters, with the following differences:
A campus cluster has its nodes located in separate buildings. Therefore, the hardware setup requires SAN interconnects that span the sites.
In a campus cluster, each node has its own storage array rather than sharing a single storage array between the nodes.
Both local clusters and campus clusters use SFW dynamic disk groups and volumes, but in a campus cluster the volumes on each node's storage array are mirrors of one another.
Each disk group must contain the same number of disks on each site for the mirrored volumes.
More information is available on disk group and volume configuration.
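As a sketch of the balanced-disk-group requirement described above, the check below verifies that each site contributes the same number of disks to a disk group. The site and disk names are illustrative only; this is not SFW syntax or an SFW tool, just a model of the rule.

```python
# Hypothetical sketch: a dynamic disk group that is mirrored across
# sites must hold the same number of disks at each site.
# Site and disk names below are illustrative, not SFW identifiers.
from collections import Counter

def sites_balanced(disk_group):
    """Return True if every site contributes the same number of disks."""
    counts = Counter(site for site, _disk in disk_group)
    return len(set(counts.values())) == 1

# Two disks at each of two sites: balanced, so mirroring can proceed.
dg = [("SiteA", "Disk1"), ("SiteA", "Disk2"),
      ("SiteB", "Disk3"), ("SiteB", "Disk4")]
print(sites_balanced(dg))  # True
```

An unbalanced group (for example, three disks at one site and two at the other) would fail this check and could not support equal mirrors at both sites.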
Although a campus cluster setup with Microsoft clustering can work without InfoScale Storage, InfoScale Storage provides key advantages over using Microsoft clustering alone. By maintaining the quorum resource on a dynamic mirrored volume that spans multiple disks across multiple sites, SFW prevents the quorum resource from becoming a single point of failure in the cluster.
Most customers use hardware RAID to protect the quorum disk, but that does not help when a natural disaster takes down the primary node and its attached storage. If the quorum resource is lost, the cluster fails, because none of the cluster servers can gain control of the quorum resource and, with it, the cluster. Microsoft clustering alone cannot provide fault tolerance for the quorum disk.
The following figure shows a Microsoft campus cluster configuration with storage mirrored across the sites and a mirrored quorum resource. The 4-way mirrored quorum has an extra set of mirrors for added redundancy.
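The fault-tolerance argument above can be sketched as follows: with two quorum mirror plexes at each of two sites, the quorum volume remains available as long as at least one plex survives, so losing an entire site does not bring down the quorum. The site and plex names are hypothetical, chosen only to model the 4-way mirror in the figure.

```python
# Hypothetical model of the 4-way mirrored quorum: two mirror plexes
# at each of two sites. The quorum volume stays readable as long as
# at least one plex survives. Names are illustrative only.
quorum_plexes = {
    "SiteA": ["plex1", "plex2"],
    "SiteB": ["plex3", "plex4"],
}

def quorum_available(failed_sites):
    """Return True if any quorum plex remains after losing the given sites."""
    surviving = [plex
                 for site, plexes in quorum_plexes.items()
                 if site not in failed_sites
                 for plex in plexes]
    return len(surviving) > 0

print(quorum_available({"SiteA"}))           # True: SiteB's plexes remain
print(quorum_available({"SiteA", "SiteB"}))  # False: no plexes left
```

The extra pair of mirrors per site means that a single disk failure at the surviving site still leaves a usable plex, which is the added redundancy the figure illustrates.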