Here is the text from http://www.weidai.com/black-holes.txt ... (Is it a brilliant idea?):

Advanced civilizations probably have extensive cooling needs. Computing and communication equipment both work better at lower temperatures. A cooler computer means a faster computer with lower energy needs, and a cooler transceiver has lower thermal noise. Since this equipment cannot operate with perfect efficiency, it will need to eliminate waste heat. It's not too difficult to cool a system down to the temperature of the cosmic background radiation. All you need to do is build a radiator in interstellar space with a very large surface area, and connect it to the system you're trying to cool with some high-thermal-conductance material. However, even at the cosmic background temperature of T = 3 K, erasing a bit still costs a minimum of k*T*ln 2 = 2.87e-23 J. What is needed is a way to efficiently cool a system down to near absolute zero.

I think the only way to do it is with black holes. Black holes emit Hawking radiation at a temperature of T = h*c^3/(16*pi^2*G*M*k). With the mass of the sun, the temperature of a black hole would be about 10^-8 K. At this temperature, erasing a bit costs only about 10^-31 J. If you build an insulating shell outside the event horizon of a black hole, everything inside the shell would eventually cool down to the temperature of the black hole. However, it would not be necessary to build a complete shell around a black hole in order to take advantage of its low temperature. For example, you can simply point the radiators of your black hole orbiter toward the black hole and insulate the side facing away from it.

If it's true that the only efficient way to cool material down to near absolute zero is with black holes, we should expect all sufficiently advanced civilizations to live near them. However, this prediction may be difficult to test, since they would have virtually no radiation signatures.

----end.

WDS comment: The trouble with Wei Dai's idea is: black holes are small. If you are only willing to live near them, then you miss out on an enormous fraction of all real estate. Also, at the present moment in the history of the universe and of galaxies, there is plenty of energy available from, e.g., stars, unfused hydrogen, etc. I happen to believe black holes will become very important in the future for "life," but right now other places seem more fun. And even if they are not, one would still have to ask: why refuse to exploit the ecological niche of all the non-black-hole real estate in those galaxies? In view of this, the recent astronomy study I posted about, claiming failure to see any galaxies that had been taken over by ultra-competent life, is, if valid, still rather meaningful.

Also, another problem. Wei Dai is saying a 6 solar mass black hole is a heat sink at 10^(-8) kelvin, fine for radiating your waste heat into so you can now, e.g., perform computations at an energy cost of around kB*T per bit, where T = 10^(-8) kelvin and kB = Boltzmann constant. As opposed to radiating into the universe at 3 kelvin. A huge win. Sounds great at first, but a minute later you realize that's totally lame. The trouble is that the RATE at which you can radiate waste heat from a 10 nanokelvin radiator is pathetically small: Power = Area*sigmaSB*T^4, where sigmaSB = Stefan-Boltzmann constant. If we use Area = 4*10^9 square meters (the total horizon area of a 6 solar mass hole) and T = 10 nanokelvins, we get Power = 2*10^(-30) watts.
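For concreteness, here is a quick numerical sanity check of the figures above, written as a short Python sketch (purely illustrative; the physical constants are standard values, and the 6 solar mass and 10 nanokelvin inputs are the ones used in the comment above):

import math

k_B   = 1.380649e-23    # Boltzmann constant, J/K
h     = 6.62607015e-34  # Planck constant, J*s
c     = 2.99792458e8    # speed of light, m/s
G     = 6.674e-11       # gravitational constant, m^3/(kg*s^2)
sigma = 5.670374419e-8  # Stefan-Boltzmann constant, W/(m^2*K^4)
M_sun = 1.989e30        # solar mass, kg

# Landauer cost of erasing one bit at the 3 K cosmic background temperature
print(k_B * 3 * math.log(2))              # ~2.9e-23 J

# Hawking temperature of a 1 solar mass black hole: h*c^3/(16*pi^2*G*M*k)
T_sun = h * c**3 / (16 * math.pi**2 * G * M_sun * k_B)
print(T_sun)                              # ~6e-8 K, i.e. order 10^-8 K

# Landauer cost of erasing one bit at T = 10^-8 K
print(k_B * 1e-8 * math.log(2))           # ~1e-31 J

# Horizon area and radiated power of a 6 solar mass black hole at T = 10 nK
r_s  = 2 * G * (6 * M_sun) / c**2         # Schwarzschild radius, ~18 km
area = 4 * math.pi * r_s**2               # ~4e9 m^2
T    = 1e-8
print(area * sigma * T**4)                # ~2e-30 W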
That power budget would enable you to perform Area*(sigmaSB/kB)*T^3 bit operations per second if each one costs kB*T, i.e. about 16 per second. Whoo whee. Yah, that's some super intelligent & competent lifeform there. Unfortunately, at the present time it would be massively outcompeted by other lifeforms using a different approach.
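A standalone continuation of the same sketch, showing the rate bound and its T^3 scaling (the 3 K line neglects the ln 2 factor and back-radiation from the background; it is only meant to show how steep the scaling is):

k_B, sigma = 1.380649e-23, 5.670374419e-8  # Boltzmann and Stefan-Boltzmann constants
area, T = 4e9, 1e-8                        # horizon area (m^2) and temperature (K) from above

# Bit-operation bound: Power / (kB*T) = Area*(sigmaSB/kB)*T^3
print(area * sigma * T**3 / k_B)           # ~16 operations per second at 10 nK

# The bound scales as T^3, so a same-area radiator near 3 K supports
# roughly (3 / 1e-8)^3 ~ 2.7e25 times as many operations per second
print(area * sigma * 3.0**3 / k_B)         # ~4e26 operations per second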