UF’s supercomputer helps scientists crunch the massive amounts of data it takes to understand the impact of climate change

Somewhere between the record-setting warm November and the 80-degree Christmas Day that had people scrambling for the beach, you’ve probably heard an offhand comment or two about global warming. Of course, it’s actually an unusually strong El Niño event we have to thank for the heat, but the weird weather has made climate change a pretty hot topic.

It’s one of the major questions on researchers’ minds: How does the environment — for example, forests or the ocean — respond to “global change drivers” such as climate change, ozone depletion and pollution that are altering the planet and its natural processes? Knowing these relationships helps us better understand the effect humans have on the biosphere and Earth’s ecological future.


Illustration by Elizabeth Rhodes

It turns out, however, that those questions are pretty difficult to answer without some heavy computational help.

Forest ecologists and other researchers who study the effects of climate change have a massive trove of data to draw from, including direct scientific observations that may date back centuries, and “proxy” data (such as ice core samples) that can be used to estimate climate conditions on time scales reaching back millions of years.

The issue is what to do with all that information. The biggest challenge is the intense computational power the modeling requires, said University of Florida assistant professor of biology Jeremy Lichstein, Ph.D.

“Like most ecologists, we’re very much dealing with the challenges of ‘big data,’” he said.

That’s where the new UF mega-cluster supercomputer, HiPerGator 2.0, steps in.

The HiPerGator 2.0 supercomputer is rolling out this spring to well-documented accolades and fanfare. And this boon to Gainesville’s brain trust has the potential to answer important questions – not just on campus, but worldwide.

“Basically, it is a bunch of regular computers put together into racks,” said Matt Gitzendanner, Ph.D., a bioinformatics specialist at the UF High-Performance Computing & Simulation Research Laboratory (UF HCS Lab).

While each computer on its own may be fairly modest, the real beauty of a supercomputer is that it combines “clusters” of processor cores — the workhorses of computing — that communicate seamlessly with one another, reaching speeds that have been compared to 600 PlayStation 4s blazing at maximum capacity.
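To picture that division of labor, here is a minimal Python sketch — illustrative only, not HiPerGator’s actual software: independent pieces of work are handed out to a pool of cores, and the more cores available, the sooner the whole batch finishes. The `simulate_scenario` function is a hypothetical stand-in for a real model run.

```python
# A minimal sketch of cluster-style parallelism (illustrative only, not
# HiPerGator's software): many independent tasks spread across many cores.
from multiprocessing import Pool

def simulate_scenario(scenario_id):
    """Hypothetical stand-in for one model run; a real ecosystem
    simulation would do far more work than this busy loop."""
    total = sum(i * i for i in range(1_000_000))
    return scenario_id, total

if __name__ == "__main__":
    scenarios = range(32)            # 32 independent runs to complete
    with Pool(processes=8) as pool:  # 8 cores working at the same time
        results = pool.map(simulate_scenario, scenarios)
    print(f"Finished {len(results)} runs")
```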

Lichstein and his colleagues at UF and abroad are currently collecting direct measurements of “plant functional traits,” such as the nitrogen and phosphorus content of leaf tissue, two elements important for plant growth. They are coupling those measurements with data on forest growth around the world from the U.S. government and other agencies to project how ecosystems will respond to climate change.

From there, researchers turn to the supercomputer. These massive troves of observations feed computer models that simulate natural processes, helping researchers understand which plants may be most competitive under different climate change scenarios. Models like these are incredibly intensive and can require hundreds of processors running at once, depending on the size of the input data. And the data they spit out are on the scale of terabytes (TB), making them even harder to manage: some of the output files are about 62 times larger than the storage capacity of most smartphones.
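The arithmetic behind that comparison is straightforward; the sketch below assumes a 16 GB smartphone, a figure chosen for illustration rather than stated in the article.

```python
# Back-of-envelope check of the smartphone comparison, assuming a 16 GB
# phone (an illustrative figure; actual capacity varies by device).
output_file_tb = 1                         # a 1-terabyte model output file
output_file_gb = output_file_tb * 1000     # 1 TB = 1,000 GB
phone_storage_gb = 16                      # assumed smartphone capacity
print(output_file_gb / phone_storage_gb)   # -> 62.5, roughly 62 times larger
```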

Lichstein is not alone at UF in this important application of high-performance research computing. In 2014, UF biological engineering researcher Senthold Asseng, Ph.D., co-authored a paper that used computer models to analyze yields of globally important crops such as wheat under different climate projections. Similarly, Andrea Dutton, Ph.D., of the UF Department of Geological Sciences has been using computer models to get an idea of what Earth’s oceans may look like in the future as glaciers shrink.

HiPerGator 2.0 is actually an expansion of HiPerGator 1.0, itself an expansion of older technology at UF. HiPerGator 2.0 adds close to 30,000 new cores to the existing 21,000 in the first model, 1 petabyte (PB) of disk storage (that’s 1,000,000 GB) to the existing 2 PB, and 120 GB of RAM to the previous 64. The entire system is housed cozily in a climate-controlled space on UF’s East Campus.
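A quick tally of those figures gives a sense of the combined scale (cores and disk only; the memory numbers are left as reported above):

```python
# Quick arithmetic on the figures above (cores and disk storage only).
existing_cores, new_cores = 21_000, 30_000
existing_disk_pb, new_disk_pb = 2, 1
print(existing_cores + new_cores)        # -> 51000 cores in total
print(existing_disk_pb + new_disk_pb)    # -> 3 PB (3,000,000 GB) of disk
```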

When HiPerGator 1.0 became operational in 2013, Gitzendanner said, UF researchers in the biological sciences had, for the first time, a dedicated system for working on projects as diverse as drug outcome probability, genomic research on human or pathogen genes and ecological modeling.

All this is not a luxury afforded to every researcher at universities and institutions nationwide. Elsewhere, researchers may have access to shared supercomputers at government labs, such as Oak Ridge National Laboratory in Tennessee, but they must deal with bureaucracy and waiting in line. Gitzendanner and colleagues at the supercomputer facility said they envision HiPerGator 2.0 helping UF researchers do what they do best without all of the stop-and-go.

Already this year, HiPerGator 2.0 was recognized by TOP500, a semiannual ranking of the top supercomputers in the world, as the third most powerful supercomputer site at any U.S. public or private university. HiPerGator was also recognized in October with a Dell World 2015 Impact Award for its data-processing capabilities.

UF HCS Lab specialists such as Gitzendanner are prepared for the influx of even more users, who have already begun signing up for time with the new 2.0 roll-out. After all, HiPerGator 1.0 can already accommodate thousands of “jobs” at once. In the meantime, Gitzendanner is busy traveling from lab to lab, instructing more scientists on how to use the new resource. To scientists like Lichstein monitoring the impact of climate change, however, the value of HiPerGator is clear:

“We couldn’t do what we’re doing without big computer clusters.”