How to calculate cost/benefit of hardware expenditures? (move to virtualization, upgrades, etc.)
Preamble: As part of the acquisition of a smaller development company, we have acquired a new office. Many people were transferred to head office, but the 'new' office will retain a small team of 4-6 developers and business analysts, along with other miscellaneous administrative staff.
As part of that process we have ended up with several late-2005 model HP servers that are currently not being utilized, which I feel could function as a suitable platform for UAT, source control, etc. until next year, when budgets are adjusted and a strong case for virtualization can be made.
The Problem: The existing sysadmin is technology gun-shy and spends most of his time focusing on job security (i.e. doing as little as possible). He has been resistant to my suggestion that upgrading these boxes would provide a minimally suitable infrastructure for the development team for the next year (I revised this value from 1-2 years based on feedback I've received).
The sysadmin's position is that we would be better off running things (UAT, etc.) locally on our desktops, instead of putting in the work to perform the upgrades. This leaves the development team in limbo, with no guarantee of virtualization next year, because the sysadmin lacks the experience and confidence to implement a solution.
This seems irrational in the face of a minimal, one-time expenditure of under $1,000. The justification seems obvious to me as a short-term fix, but I want to be sure I'm not off base, or just desperate for a "fix".
The Question: What is your general process for calculating the cost/benefit for hardware expenditures, and how do you typically present this information to upper management to justify expenditures?
Additionally, how would you make a case for virtualization?
And, how much experience do you feel is required for a sysadmin to roll over to a VM environment?
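For context, the kind of back-of-the-envelope payback calculation I have in mind looks something like the sketch below. All of the figures are hypothetical placeholders (the loaded hourly rate and hours lost per week are guesses, not measurements); the point is the shape of the argument, not the numbers.

```python
# Hypothetical payback calculation for the one-time hardware upgrade.
# All figures below are illustrative assumptions, not real quotes.

upgrade_cost = 1000.0       # one-time upgrade cost, USD (the "< $1k" figure)
developers = 5              # mid-point of the 4-6 person team
hourly_rate = 50.0          # assumed loaded hourly cost per developer, USD
hours_lost_per_week = 2.0   # assumed time lost per developer running UAT locally

# Weekly productivity recovered by moving UAT off developer desktops
weekly_benefit = developers * hourly_rate * hours_lost_per_week

# How many weeks until the upgrade pays for itself
payback_weeks = upgrade_cost / weekly_benefit

print(f"Weekly productivity recovered: ${weekly_benefit:.0f}")
print(f"Payback period: {payback_weeks:.1f} weeks")
```

With these placeholder numbers the upgrade pays for itself in a couple of weeks, which is the sort of framing I'd expect to put in front of management, but I'd like to hear how others structure this.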