An H100 is a seven-hundred-watt heater that does maths on the side. A B200 is a twelve-hundred-watt one. A GB200 superchip is closer to twenty-seven hundred. The compute is real — the bits flip, the model runs, the answer arrives — but the energy that performs the work is dissipated almost entirely as heat into the coolant.
Every authoritative cooling design source treats this as fact. One kilowatt of IT power equals one kilowatt of heat. The cooling system has to carry every watt away. The interesting question is not whether the chip is a heater. It is what the building does with the heat once it has it.
For thirty years the industry's answer has been: nothing. We sited buildings where land was cheap, where water was abundant, where the heat could be rejected to the air without anyone noticing.
MicroLink Data Centers was founded in 2024 to put the heat somewhere useful. We deploy high-density liquid-cooled compute behind the meter at industrial host sites — wastewater treatment plants, breweries, food manufacturers, district heating networks — and we capture the heat the chips make for the host's thermal processes.
The interesting engineering number is the capture rate. Direct-to-chip cooling on CPUs and GPUs alone catches 70–80% of the heat before it touches the room. NVIDIA's GB200 NVL72 reference deployments run at 85–90%. Full-coverage cold plates approach 98%. That capture rate is what determines whether the heat is useful to a host.
A new way to think about the data centre follows. It is a node on the city's energy matrix. Power flows in. Compute flows out. Heat flows back to a host that already needs it.
Buildings a city wants more of. Buildings a mayor names. Buildings a school takes a class to. Buildings the brewery next door says made their year.
The model rests on a sentence we treat as a contract: hosts are partners, not customers. The host's operations should not be one dollar worse off because we are there. Every loop in our system has a fallback. The heat is given, not sold.
It is more expensive to build this way. It is the only way that compounds. The data centres the industry is putting up today will be torn down in fifteen years because nothing about them earns the right to stay.
We build for forty years. The compute inside the building will turn over many times. The building will not.
In the highest-demand cities the constraint isn't capital. It's power, approvals, and time. We solve all three with the same architecture.
Industrial hosts already hold the megawatts. We deploy behind their meter, sidestepping the multi-year interconnection queues that have stalled greenfield builds. Time-to-power is measured in months, not years.
A data centre that lowers a wastewater plant's gas bill is a data centre a city defends, not one it permits reluctantly. Permitting timelines compress. The host is our co-applicant.
No greenfield site work. No standalone substation. No campus shell. We deploy into existing industrial geometry. The cost ledger is two-sided — heat off-take adds revenue or offsets electricity.
Pre-release silicon validation, AI training, low-latency inference. Demand is concentrated in the same cities where new builds are stuck. We deploy inside those cities. Adjacency is its own product.
The industry's question is "where can we get power." Ours is "where is the heat needed." The first is a constraint. The second is a strategy.
This is Edition 01. Each tab is paired with one of the four cities — Seattle anchors Philosophy, San José anchors Technology, Chicago anchors Design, New York anchors Product.
Photography is in commissioning. Image slots throughout the edition document the brief. Finished imagery replaces them in subsequent releases.
If a building is going to be welcomed, the welcome has to be reciprocal.
The textile mill. The power station. The telephone exchange. Each one appeared ordinary at first, a brick shell humming with unfamiliar machinery, anchored to a river or a rail line for reasons the neighbours didn't fully understand. Only later, when the world had rearranged itself around what happened inside, did anyone think to call it infrastructure.
The data centre is that building now. And like every building that came before it, it is being forced to evolve, not by the ambitions of technologists, but by the physics of the planet it sits on.
The first generation was simple. A room. A rack. A floor tile with cold air pushing through it. The machines were modest, the heat was manageable, and the electricity bill was someone else's problem. That era is gone. What replaced it — the campus, the sealed warehouse on the edge of a desert or a river — solved the problem of scale but created a new one: separation. The data centre became a world apart.
Consider what these buildings actually hold. Not servers. Not silicon. Us. Our medical records, our memories, our messages to the people we love, the collective knowledge of every institution we have built. Data centres are the libraries of our civilisation. They are the places where we store everything we know about ourselves and each other. But unlike the libraries that came before them, nobody can walk up to the front door. There is no reading room. There is no community that claims them. They were designed, from the first blueprint, to exist without neighbours.
Maybe that was the problem.
For a generation, we convinced ourselves that infrastructure no longer mattered, that the future was software, weightless and everywhere. But software runs on something. It runs on buildings, on power, on cooling, on land. We have come full circle. The era of infrastructure is back, and this time we cannot afford to build it the same way — behind fences, beyond city limits, disconnected from the communities whose lives it contains.
Because the next generation of silicon — the chips already in fabrication, already named, already allocated — will produce thermal densities that make air cooling a memory. A single rack will reject more heat than a city block of apartments consumes. And the question that no campus operator, no colo provider, no government minister has adequately answered is not where do we put the machines.
It is: what do we do with the heat?
That question changes everything. It changes the shape of the building. It changes who the neighbours are. It changes what a data centre is.
This is the story of that change.
If we accept that data centres are the libraries of our civilisation, the architectural question becomes immediate: how do you build a library that nobody is allowed to enter? You cannot. The form falls apart. A library demands a public face, a reading room, a community that claims it. Without those, it is just a warehouse with rules.
The same is true here. A data centre that hides behind a fence, that exports its waste heat to the sky and its decision-making to a head office in another country, is not infrastructure in any honest sense. It is a tax on the place it sits in. The host has to be a partner. The civic interface has to be real. The relationship has to be designed in from the first blueprint.
Three things follow. The host's operations should not be one dollar worse off because we are there. Every loop in our system has a fallback — if the host loses power our compute does not go down, if our compute goes down the host's heat does not stop. And the heat is given, not sold. The metering exists for engineering reasons, not for invoicing.
If you charge your partner for the favour you said you were doing them, they are no longer your partner.
The Carnegie libraries opened between 1883 and 1929. Most are still in use. We design for forty years.
Tall fenestration. A screen showing both ledgers in real time. Open by appointment, and on civic occasions.
Annually, every middle and high school in the host city is invited. Designed by an educator, not a marketing team.
PUE, ERE, gas displaced, hours of community use. No qualifications, no asterisks, no marketing prose.
A person, not an inbox. The community knows who to call. They answer the phone.
Public, modest, real. The host operator and data-centre operator press the start button together. The mayor is asked to speak.
A written commitment to the host city. Reviewed by the city. Published. The architecture is the easy part. The covenant is the durable part.
If a building is going to be welcomed, the welcome has to be reciprocal.
San José is the exemplar for the technology chapter. The same three-loop architecture, the same dry-cooler fallback, the same hydraulic separation run across every site in the portfolio.
The Regional Wastewater Facility sits at the south end of San Francisco Bay, fifteen minutes from NVIDIA's headquarters. The site has the rarest combination in American infrastructure — it is enormous, it is beautiful, and it is quiet.
It also sits inside the densest concentration of pre-release silicon engineering in the world. That is not a coincidence we will let pass.
San José is built around a single intent: be the place NVIDIA validates pre-release silicon. The campus exists to do work that cannot be done anywhere else.
An H100 is a 700-watt heater. A B200 is a 1,200-watt one. A GB200 superchip is closer to 2,700 watts. The cooling system has to carry every watt away. The interesting question is what fraction reaches the loop before it touches the room.
A modern AI accelerator does no mechanical work, stores no significant energy at steady state, and the only physical exits for energy other than heat are radio waves and photons through optical fibre. Both are bounded by physics or by regulation to fractions that round to zero in any cooling calculation. The defensible engineering convention, codified by Schneider Electric's Calculating Total Cooling Requirements for Data Centers and adopted across ASHRAE and OCP, is that essentially all of the electrical power consumed by IT equipment is converted to heat.
This is the thermodynamic case for liquid cooling. Air-cooled racks reject heat to the room, where mechanical air handlers have to move it out, spending a substantial fraction of the IT load's energy on fans alone. Liquid-cooled racks reject heat directly to the coolant, where it can be moved at roughly one-tenth the parasitic power, and where it can be delivered to a host at a temperature high enough to be useful.
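The gap is set by volumetric heat capacity: water holds a few thousand times more heat per unit volume than air for the same temperature rise. A back-of-envelope sketch with standard handbook property values, nothing measured on site:

```python
# Why liquid moves heat cheaply: volumetric heat capacity (rho * c_p).
# Property values are standard handbook figures; the ratio, not the
# precision, is the point.

AIR_RHO, AIR_CP = 1.2, 1005.0        # kg/m^3 and J/(kg*K), air near 20 degC
WATER_RHO, WATER_CP = 997.0, 4186.0  # kg/m^3 and J/(kg*K), water near 25 degC

air_capacity = AIR_RHO * AIR_CP        # ~1.2 kJ per cubic metre per kelvin
water_capacity = WATER_RHO * WATER_CP  # ~4.2 MJ per cubic metre per kelvin

print(f"Water carries ~{water_capacity / air_capacity:,.0f}x more heat per "
      f"unit volume than air for the same temperature rise")
```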
The same physics that makes liquid cooling efficient is what makes heat recovery possible.
A direct-to-chip cold plate on CPUs and GPUs alone catches roughly three quarters. NVIDIA's GB200 NVL72 reference deployment captures around 87%. Full-coverage cold plates approach 98%. The unrecovered residual is heat that still has to leave the room.
MicroLink designs against the GB200 NVL72 deployment range. Eighty-five to ninety percent of the IT load lands in the loop and is available to the host. The remaining ten to fifteen percent is heat that has to leave the room — through CRACs, through rear-door heat exchangers, through engineered air handling. We do not pretend that residual is zero. We engineer the air side as carefully as the water side.
The capture rate matters because it sets the upper bound on what the host can use. If the loop catches 87% of a 10 MW deployment, the host has access to 8.7 MW of continuous low-grade heat. At a wastewater plant, that displaces a large fraction of the gas the host currently burns. The chip is the heater. The loop is the question. The host is the answer.
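The rest of that arithmetic is short enough to write down. A minimal sketch of recoverable heat and gas displacement; the deployment size, capture rate and boiler efficiency here are illustrative assumptions, not site data:

```python
# Recoverable heat and gas displacement for one deployment.
# Every input here is an illustrative assumption, not site data.

IT_LOAD_MW = 10.0         # deployed IT load
CAPTURE_RATE = 0.87       # within the GB200 NVL72 reference range (85-90%)
BOILER_EFFICIENCY = 0.85  # assumed efficiency of the gas boilers displaced
HOURS_PER_YEAR = 8760

heat_to_loop_mw = IT_LOAD_MW * CAPTURE_RATE   # continuous heat to the host
residual_mw = IT_LOAD_MW - heat_to_loop_mw    # still leaves by the air side

gas_displaced_mwh = heat_to_loop_mw * HOURS_PER_YEAR / BOILER_EFFICIENCY

print(f"To the loop: {heat_to_loop_mw:.1f} MW; residual: {residual_mw:.1f} MW")
print(f"Gas displaced: {gas_displaced_mwh:,.0f} MWh per year")
```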
Three loops are the price of partnership. A single-loop system is cheaper, smaller, simpler. It is also a system in which our coolant chemistry is mixed with the host's process water. That is unacceptable for two reasons.
The first is regulatory. A wastewater plant cannot have unknown fluids enter its digester chemistry. The host loses its operating licence the moment our chemistry crosses into theirs.
The second is trust. The hydraulic separation of the loops is the physical expression of the partnership.
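One way to picture the covenant's fallback promise is as control logic: if the host loses power our compute does not go down, and if our compute goes down the host's heat does not stop. A deliberately simplified sketch; real plant controls involve staged valves, deadbands and safety interlocks that this omits:

```python
# Simplified heat-sink selection for the facility loop. Real controls
# involve staged valves, deadbands and safety interlocks; this sketch
# captures only the ordering of obligations.

def select_heat_sink(compute_online: bool, host_accepting_heat: bool) -> str:
    """Choose where the facility loop rejects heat this control cycle."""
    if not compute_online:
        # Our outage must not become the host's outage: the host's own
        # boilers carry the thermal load, exactly as they did before us.
        return "host boilers (host heat does not stop)"
    if host_accepting_heat:
        # Normal operation: heat crosses the exchanger to the host main.
        return "host thermal main"
    # The host cannot take heat right now: reject to dry coolers, so the
    # compute never depends on the host's ability to absorb it.
    return "dry coolers (compute does not go down)"
```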
Two numbers, PUE and ERE, both reported quarterly, both public. The mayor's question, answered in figures.
A site can have an excellent PUE and a terrible ERE, and still be wasting energy. A site can have a moderate PUE and an exceptional ERE, and still be doing better work for the city.
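Both metrics come from The Green Grid: PUE divides total facility energy by IT energy, and ERE subtracts the energy delivered to the host before dividing, which is why reuse can push ERE below 1.0. A sketch with two hypothetical sites, not MicroLink measurements, makes the contrast concrete:

```python
# PUE and ERE as defined by The Green Grid. PUE = total / IT energy;
# ERE = (total - reused) / IT energy, so reuse can push ERE below 1.0.
# The two sites below are hypothetical, not MicroLink measurements.

def pue(total_kwh: float, it_kwh: float) -> float:
    return total_kwh / it_kwh

def ere(total_kwh: float, it_kwh: float, reused_kwh: float) -> float:
    return (total_kwh - reused_kwh) / it_kwh

# Site A: lean cooling, every kilowatt-hour of heat thrown away.
print(f"Site A: PUE {pue(1.10, 1.0):.2f}, ERE {ere(1.10, 1.0, 0.0):.2f}")

# Site B: heavier overhead, but 87% of the IT load delivered to a host.
print(f"Site B: PUE {pue(1.30, 1.0):.2f}, ERE {ere(1.30, 1.0, 0.87):.2f}")
```

Site A reports 1.10 on both ledgers; Site B reports a worse PUE of 1.30 and an ERE of 0.43. The second building spends more on machinery and wastes far less on the city's behalf.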
A new instrument has been pointed at the city.
Stickney processes 1.4 billion gallons a day, serving 2.4 million people across Chicago and surrounding municipalities. The relationship with MWRD is live. The meeting is done. The work is mapped.
Stickney is the keystone of the Great Lakes consortium narrative. Twelve major wastewater plants ring the lakes. Cleveland, Milwaukee, Detroit, Toronto, Buffalo, Rochester. Each one is a candidate.
If it works at the largest plant in the world, it works at the smaller ones. The design discipline is replicability without monotony. The same skid, dropped into different geometries, with the same architectural register at every scale.
The cities that received Carnegie libraries did not know, at the time, that they were getting the most lasting piece of civic infrastructure they would build that century. The buildings outlived their funder, their architects, the technologies they housed, and the generation they were built for.
They are still in use because they were welcomed.
The data centres the industry is building now will, in most cases, not last forty years. They will be ripped down and replaced. They were designed to be invisible, and the easiest thing to do with an invisible building is forget it.
We design for forty years. We design buildings that the city wants to keep. The architecture is durable. The civic interface is real. The host relationship is structured to last.
At Stickney, the architectural register is Midwest industrial — brick, limestone, standing-seam metal, steel-framed industrial windows. Massing that speaks to the rail yards and the canal, not to the suburb.
Tall fenestration onto the plant. A single screen showing the dual ledger — our load and the host's load — in real time. A meeting table. A whiteboard. The room is shared. The MicroLink engineer and the plant operator work in the same space. The architecture insists they are colleagues, not counterparties.
A street-facing or path-facing elevation, intentionally open. A small civic space. A signboard explaining what the building does. The public ledger displayed where anyone can see it. The building is not hidden. It is foregrounded.
The skid is the difficult engineering. A pre-fabricated mechanical and electrical package — heat exchangers, pumps, valves, controls — that takes our 45°C output and hands it cleanly to the host's existing thermal main.
It is factory-built and factory-tested. It arrives by truck. It is lifted onto a poured concrete pad and connected in days. The pod is the easy part. The skid is the careful part.
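Part of what makes it careful is flow. The water side is governed by Q = ṁ · c_p · ΔT. A rough sizing sketch: the 45°C supply is the design figure, while the 35°C return and the one-megawatt pod duty are assumptions for illustration:

```python
# Water-side flow for the skid's heat exchanger, from Q = m_dot * c_p * dT.
# The 45 degC supply is the design figure; the 35 degC return and the
# 1 MW pod duty are assumptions for illustration only.

CP_WATER = 4186.0     # J/(kg*K), specific heat of water
SUPPLY_C = 45.0       # coolant arriving from the pod
RETURN_C = 35.0       # assumed temperature after the host takes its heat
DUTY_W = 1_000_000.0  # assumed thermal duty of one pod

delta_t = SUPPLY_C - RETURN_C
mass_flow_kg_s = DUTY_W / (CP_WATER * delta_t)   # kg/s of water
litres_per_min = mass_flow_kg_s * 60.0           # water is ~1 kg per litre

print(f"Moving {DUTY_W / 1e6:.0f} MW across {delta_t:.0f} K takes "
      f"{mass_flow_kg_s:.1f} kg/s (~{litres_per_min:,.0f} L/min)")
```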
At Stickney the skid is the same skid we drop into Cleveland, Milwaukee, Detroit. The geometry of each plant is unique. The interface is not.
If it works at the largest plant in the world, it works at the smaller ones.
Newtown Creek is the New York City Department of Environmental Protection's flagship plant — 1.3 billion litres (about 340 million gallons) a day, the eggs visible from the BQE, an existing CHP system burning digester biogas, an existing RNG arrangement with National Grid, an existing Con Edison interconnection.
Every piece of infrastructure we need is already there. The product slots in beside it.
The eggs were designed by the Polshek Partnership and finished in 2010. They are listed in architectural guides to New York. A data centre that lands next to them has to be photographed alongside them without embarrassment. The product was designed for that test.
The pod is designed to be moved. It arrives on a flatbed, is lifted onto a poured concrete pad, and is connected to the skid in days.
The 40-foot compute container holds eight cold-plate-cooled racks. The 20-foot mechanical container holds the coolant distribution unit (CDU), the heat exchanger, the pumps, the controls. Two containers, one unit. The deployment atom of MicroLink.
90 days from order to commissioning. That number is what the partnership runs on. A host that says yes in March is in operation by midsummer. The architecture of the deal is the architecture of the timeline.
This is where speed compounds against the rest of the industry. A greenfield campus takes three years. The pod takes ninety days because the heavy lifting was done in the factory, not on the site.
The matte slate exterior, the careful proportions, the lack of corporate decals — all of it exists because of where the building has to stand.
The architectural register at Newtown Creek is Brooklyn waterfront — matte slate, weathering steel, glass at the public edge, opaque at the back of house. Civic in proportion to the digester eggs across the path.
The product is not a single design that gets repeated everywhere. It is a kit of parts that takes its colours, its scale, and its setbacks from the place it lands in.
The building belongs to its place. That is the architectural rule the product is designed to obey.
A building that lands next to the eggs has to be photographed alongside them without embarrassment.