Tag: article

  • HTC computing enables knee research

    High-throughput computing plays pivotal role in knee biomechanics research

  • HIV elite controllers

    High-throughput computing, HIV and the mystery of ‘elite controllers’

  • Microbial communities and rapid evolution

    Microbial communities may inform understanding of rapid evolution

    Lauren Michael, a research computing facilitator at the Center for High Throughput Computing (CHTC), says researchers tend to use the CHTC resources for two reasons: they have massive volumes of data or a need for incredibly complex computations. Bontrager has a bit of both, and uses the…

  • Using the Grid to Understand Crops

    UW botanist harnesses the grid to illuminate crop growth

    Spalding’s Birge Hall lab is pretty far removed from conventional dirt-under-the-nails botany. In one room, a row of stationary cameras bathed in red light trains in on Petri dishes that contain seedlings growing in a gel. These cameras take pictures every two minutes, producing rich time-lapse…

  • Facilitating data transfer

    Genomics research is one of the largest drivers of Big Data in science, with the potential to equal, if not surpass, the data output of the particle physics community. Like physicists, university-based life-science researchers must collaborate with counterparts and access data repositories across the nation and globe. The National Center for Biotechnology Information (NCBI)…

  • Big data and Alzheimer’s detection

    Campus big data project may point the way to Alzheimer’s early detection

    The team uses the Center for High Throughput Computing (CHTC), supported by UW-Madison and the Morgridge Institute, to segment MRI brain images, which in some cases can take 12-24 hours for a single brain. “For many hundreds of brain scans, we need a…

  • Connecting researchers and compute resources

    Advanced Computing Initiative helps UW-Madison researchers sift and winnow data

    The Advanced Computing Initiative (ACI) links researchers and computing resources to maximize productivity.

  • Big data and astronomy

    Automation offers big solution to big data in astronomy

  • Evaluating post-Katrina rebuilding grants

    Using high throughput computing to evaluate post-Katrina rebuilding grants