Funding Equation for University Computing
June 3, 2008 | Posted by gordonwatts in computers, physics.
It used to be that when you wanted some computing at your university, you bought the computers, set up a room somewhere, plugged in the network, hired some undergraduate to manage the things, and you were off to the races.
That was then. It is different now, and it is getting more different as time goes on.
Several factors are coming together. Perhaps the major driving force in university research is the overwhelming need for CPU cycles. Those cycles are no longer cheap, either. When you need a lot of computing power you can’t just buy that 500-buck computer off the shelf. You have to buy 1U pizza boxes, stack them into special racks, run special power feeds, install special room cooling, etc. This power and heat density is another driving factor. To do all this you need a room where you can install everything – with the power and the air conditioning. Very few buildings not designed for those sorts of loads can handle them – and retro-fitting buildings – even ones just 30 years old – is amazingly expensive. Your other choice is to host servers at a lights-out facility (a data center). No matter how you do it, it costs money. A lot more than it used to.
This is affecting how we do computing at the university. You can no longer just say “OK, we’ll give you 1 million dollars of CPUs in your startup fund” to attract new faculty. You also have to add “and the necessary $700K to install, power, cool, and network the computers.” That roughly-2/3rds figure comes from some recent discussions with people installing large amounts of computing. This is making it harder for universities to put attractive computing offers in their start-up packages. I don’t think anyone has blinked yet (the university usually swallows the cost somehow – but someone – the department? – has to pay for it in the end).
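To make the startup-package numbers concrete, here is a minimal back-of-the-envelope sketch. The 0.7 overhead factor is my reading of the $700K-per-$1M example above, not a quoted rule – the real ratio will vary by site:

```python
# Rough cost sketch: for every dollar of raw hardware, roughly another
# 70 cents (about 2/3rds) goes to installation, power, cooling, and
# networking. The overhead factor is an assumption drawn from the
# $700K-per-$1M figure in the post.

def total_computing_cost(hardware_dollars, infrastructure_overhead=0.7):
    """Estimate total cost: hardware plus infrastructure overhead."""
    infrastructure = hardware_dollars * infrastructure_overhead
    return hardware_dollars + infrastructure

# The startup-package example from the post:
hardware = 1_000_000
print(f"Hardware:       ${hardware:,.0f}")
print(f"Infrastructure: ${hardware * 0.7:,.0f}")
print(f"Total:          ${total_computing_cost(hardware):,.0f}")
```

In other words, a “$1M of computing” commitment is really a $1.7M commitment once the room, power, and cooling are counted.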
The other place this shows up is in grant applications. I’m not as familiar with the DOE, but the NSF always looks for an in-kind contribution from the host institution in a grant proposal. Frequent examples are things like reduced-rate shop time to build a proposed detector; when it comes to computing, it is things like supplying the infrastructure to run the machines. But as costs go up and up, universities – especially ones that don’t have super-large endowments – are having a harder and harder time avoiding passing the buck. The question is which university will blink first and refuse to support the complete cost? How quickly will the others follow? And what will that mean for future computing funding for research groups?
Or perhaps a change in the way we host computing?