The Great Flip: How Accelerated Computing Redefined Scientific Computing, and What Comes Next


It used to be that computing power trickled down from hulking supercomputers to the chips in our pockets.

Over the past 15 years, innovation has reversed course: GPUs, born in gaming and scaled through accelerated computing, have surged upstream to remake supercomputing and bring the AI revolution to scientific computing's most rarefied systems.


JUPITER at Forschungszentrum Jülich is the emblem of this new era.

Not only is it among the most efficient supercomputers, delivering 63.3 gigaflops per watt, but it's also a powerhouse for AI, delivering 116 AI exaflops, up from 92 at ISC High Performance 2025.

This is the "flip" in action. In 2019, nearly 70% of the TOP100 high-performance computing systems were CPU-only. Today, that figure has plunged below 15%, with 88 of the TOP100 systems accelerated, and 80% of those powered by NVIDIA GPUs.

Across the broader TOP500, 388 systems (78%) now use NVIDIA technology, including 218 GPU-accelerated systems (up 34 year over year) and 362 systems connected by high-performance NVIDIA networking. The trend is unmistakable: accelerated computing has become the standard.

But the real revolution is in AI performance. With architectures like NVIDIA Hopper and Blackwell and systems like JUPITER, researchers now have access to orders of magnitude more AI compute than ever before.

AI FLOPS have become the new yardstick, enabling breakthroughs in climate modeling, drug discovery and quantum simulation: problems that demand both scale and efficiency.

At SC16, years before today's generative AI wave, NVIDIA founder and CEO Jensen Huang saw what was coming. He predicted that AI would soon reshape the world's most powerful computing systems.

"A few years ago, deep learning came along, like Thor's hammer falling from the sky, and gave us an incredibly powerful tool to solve some of the most difficult problems in the world," Huang said.

At SC16, Huang explained how AI would reshape the world's most powerful scientific computing systems.

The math of power consumption had already made the shift to GPUs inevitable.

But it was the AI revolution, ignited by the NVIDIA CUDA-X computing platform built on these GPUs, that dramatically extended the capabilities of these machines.

Suddenly, supercomputers could deliver meaningful science at double precision (FP64) as well as at mixed precision (FP32, FP16) and even at ultra-efficient formats like INT8 and beyond, the backbone of modern AI.

This flexibility let researchers stretch power budgets further than ever, running larger, more complex simulations and training deeper neural networks, all while maximizing performance per watt.
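A minimal sketch of why format choice stretches a budget: each halving of bits doubles how many values fit in the same memory footprint, and on hardware with dedicated units it raises achievable operations per watt. The byte widths below are the standard IEEE/integer sizes; the 64 GB memory budget is a hypothetical figure for illustration, not a spec of any system named above.

```python
# Bytes per value for common numeric formats (standard widths).
formats = {"FP64": 8, "FP32": 4, "FP16": 2, "INT8": 1}

# Hypothetical accelerator memory budget for illustration only.
budget_bytes = 64 * 2**30  # 64 GB

# Halving the format width doubles how many values fit in the budget.
for name, size in formats.items():
    values = budget_bytes // size
    print(f"{name}: {values / 1e9:.1f} billion values in 64 GB")
```

The same doubling logic applies to arithmetic throughput when the hardware provides native lower-precision units, which is why mixed-precision pipelines keep FP64 only where the science demands it.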

But even before AI took hold, the raw numbers had already forced the issue. Power budgets don't negotiate. Supercomputer researchers, inside NVIDIA and across the community, were coming to grips with the road ahead, and it was paved with GPUs.

To reach exascale without a Hoover Dam-sized electric bill, researchers needed acceleration. GPUs delivered far more operations per watt than CPUs. That was the pre-AI tell of what was to come, and it's why, when the AI boom hit, large-scale GPU systems already had momentum.

The seeds were planted with Titan in 2012 at Oak Ridge National Laboratory, one of the first major U.S. systems to pair CPUs with GPUs at unprecedented scale, showing how hierarchical parallelism could unlock huge application gains.

In Europe in 2013, Piz Daint set a new bar for both performance and efficiency, then proved the point where it matters: real applications like COSMO weather forecasting.

By 2017, the inflection was clear. Summit at Oak Ridge National Laboratory and Sierra at Lawrence Livermore National Laboratory ushered in a new standard for leadership-class systems: acceleration first. They didn't just run faster; they changed the questions science could ask in climate modeling, genomics, materials science and more.

These systems do far more with far less. On the Green500 list of the most efficient systems, the top eight are NVIDIA-accelerated, with NVIDIA Quantum InfiniBand connecting seven of the top 10.

But the story behind these headline numbers is how AI capability has become the yardstick: JUPITER delivers 116 AI exaflops alongside 1 exaflop of FP64, a clear signal of how science now blends simulation and AI.

Power efficiency didn't just make exascale possible; it made AI at exascale practical. And once science had AI at scale, the curve bent sharply upward.
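As a back-of-envelope check, the article's own figures can be combined to estimate what "efficient exascale" means in watts. This is a rough sketch: it assumes the 63.3 gigaflops-per-watt Green500 efficiency applies to the full 1-exaflop FP64 run, which is only approximately true since the two numbers come from different measurement rules.

```python
# Rough power estimate from the figures quoted above.
fp64_flops = 1e18    # ~1 exaflop of FP64
efficiency = 63.3e9  # 63.3 gigaflops per watt (Green500 figure)

# Power (W) = sustained FLOPS / (FLOPS per watt).
power_watts = fp64_flops / efficiency
print(f"Estimated power draw: {power_watts / 1e6:.1f} MW")
```

The result, roughly 16 megawatts for an exaflop, is the scale of budget that made CPU-only exascale designs untenable and acceleration the default.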

What It Means Next

This isn't just about benchmarks. It's about real science:

  • Faster, more accurate weather and climate models
  • Breakthroughs in drug discovery and genomics
  • Simulations of fusion reactors and quantum systems
  • New frontiers in AI-driven research across every discipline

The shift began as a power-efficiency imperative, became an architectural advantage and has matured into a scientific superpower: simulation and AI, together, at unprecedented scale.

It started with scientific computing. Now, the rest of computing will follow.


