Australia's most powerful supercomputer, Raijin, unveiled

Australia’s most powerful supercomputer, Raijin, made its debut today. The launch was timed to coincide with the opening of the National Computational Infrastructure, a high-performance computing centre.

For those familiar with Japanese mythology, Raijin is the god of thunder, lightning and storms. Such a name demands a worthy machine, and for researchers handling huge amounts of data, this one earns it.

According to the Australian press, it can perform the same number of calculations in one hour as seven billion people with calculators could… in twenty years. We’ve come a long way; the first computer to have its performance measured in floating-point operations per second (FLOPS) was the CDC 6600, in 1964.
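
That comparison holds up as back-of-the-envelope arithmetic if each person manages roughly one calculation per second. Here's a minimal sanity check in Python (the one-calculation-per-second rate is our assumption, not a figure from the article):

    # Rough check of the "seven billion people with calculators" claim.
    # Assumption: each person performs one calculation per second.
    PETAFLOPS = 1.2e15              # Raijin's quoted rate, operations per second
    SECONDS_PER_HOUR = 3600
    WORLD_POPULATION = 7e9

    ops_in_one_hour = PETAFLOPS * SECONDS_PER_HOUR        # ~4.3e18 operations
    ops_per_person = ops_in_one_hour / WORLD_POPULATION   # ~6.2e8 operations each

    SECONDS_PER_YEAR = 365.25 * 24 * 3600
    years_needed = ops_per_person / SECONDS_PER_YEAR      # at 1 op/sec per person

    print(f"{years_needed:.1f} years")                    # ~19.6, close to twenty

So at one keypress per second each, seven billion people really would need about two decades to match one hour of Raijin's output.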

It’s capable of 1.2 petaflops – well short of the world’s most powerful supercomputer, Tianhe-2, at 33.86 petaflops – but Raijin is hardly your conventional desktop either.

The full cost of making Raijin a reality isn’t clear yet, though the NCI is aided by a number of partner organizations, which have committed a further $50,000,000 (AUD) over the next four years. So it’s a tiny bit pricier than an i7.

Source: Australian National University | Image via India Times


15 Comments


I see at least two applications: AES cracking of commercial cables and management of a "full take" Internet dump.

Shahrad said,
But the real question is: can it download torrent files whilst running Crysis on battery?

With enough solar panels, anything is possible.

Or maybe the Crysis engine is so powerful it can emulate supercomputers at full speed, so the question all along was: can Crysis run this supercomputer?

68k said,
No, because it has onboard graphics.

I'm sure the CPU power this thing has is more than enough to do a dedicated GPU's job (in case you didn't know, CPUs can do graphics too; GPUs are just far better at it).