How to Avoid the I/O Connectivity Meltdown Caused by Server Virtualization
Virtualized servers place new demands on the server I/O infrastructure. Because these servers run multiple applications, they require more network bandwidth and need more network connections to handle the increased data flows. And because all resources are shared among multiple applications, ensuring the performance and availability of critical apps becomes more complex.
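
To make the bandwidth point concrete, here is a minimal sizing sketch in Python. The workload names, per-VM bandwidth figures, port speed, and headroom factor are all illustrative assumptions, not values from this article:

```python
# Hypothetical sizing sketch: estimate how many NIC ports a virtualization
# host needs once several formerly separate workloads share its I/O path.

# Assumed peak network demand per hosted application, in Gbit/s.
vm_peak_gbps = {
    "web-frontend": 2.0,
    "app-server": 3.5,
    "database": 4.0,
    "backup-agent": 6.0,
}

NIC_PORT_GBPS = 10.0   # assume 10 GbE ports
HEADROOM = 1.25        # assume 25% headroom for bursts and migration traffic

aggregate = sum(vm_peak_gbps.values()) * HEADROOM
ports_needed = -(-aggregate // NIC_PORT_GBPS)  # ceiling division

print(f"Aggregate demand: {aggregate:.1f} Gbit/s")
print(f"10 GbE ports needed: {int(ports_needed)}")
```

Four applications that each fit comfortably on a dedicated 1 GbE link can, once consolidated, demand multiple 10 GbE ports from a single host.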

In traditional server environments, these issues are resolved by resource segregation. Each server runs only one application, and each is provided with its own I/O resources. This ensures security as well as performance: each server connects only to the networks it actually needs. These physically distinct networks isolate devices from intrusion, denial-of-service attacks, and application failures on other servers.

This segregated model changes with the advent of server virtualization. With virtualization, IT managers create a flexible pool of resources that can be deployed as needed. Ideally, any server can run any application, which means that a single server now requires sufficient connectivity for every application it might host.
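
A small sketch of that connectivity requirement, again in Python. The VM names and network labels are hypothetical; the point is the set union:

```python
# Hypothetical sketch: if any host may run any application, each host must
# connect to the union of all networks its candidate workloads require.

vm_networks = {
    "web-frontend": {"dmz", "management"},
    "app-server": {"app-tier", "management"},
    "database": {"db-tier", "storage", "management"},
}

# In the old one-app-per-server model, a host needed only its own row.
for vm, nets in vm_networks.items():
    print(f"{vm}: {sorted(nets)}")

# In the pooled model, every host must reach every network in the pool.
required = set().union(*vm_networks.values())
print(f"Any-app host must connect to: {sorted(required)}")
```

The segregated model's three small connectivity footprints collapse into one large one that every host in the pool must carry.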

Virtualization, Connectivity, Virtual Server, Troubleshooting, Knowledgebase, Article