Editorial

How gaming technology is being used to save lives

Here’s a riddle: When you think of a computer with dual blazing-fast NVIDIA cards, a Xeon processor and 16GB of RAM, what type of PC comes to mind? Maybe a high-end Alienware system? Believe it or not, that description matches many of the systems being used to power today’s medical diagnostic devices such as CT scanners and ultrasound machines. In an interesting twist, the same GPU technology that renders high-resolution, imaginary destruction has found its way into the doctor’s office and is frequently used to save lives rather than virtually vaporize them.

Ten years ago, GPU manufacturers and their respective technologies were in their infancy. However, they knew the exploding game market held great promise for their products because of the need for complex graphical data processing. 

It turns out that graphics problems are best solved by highly parallelized algorithms, and unfortunately for chip makers Intel and AMD, CPUs are good at solving problems through a serial approach, not a parallel approach. 

Over the past decade, graphics-hardware vendors consolidated and standardized around APIs like DirectX. The game industry put great effort into increasing the video quality of games and quickly made the dedicated GPU a necessity for hardcore gamers.

An intense rivalry began between NVIDIA and ATI, with the battle between the two designers becoming one of the most watched in the PC industry. The competition brought incredible innovation to the GPU market, and all of that new technology did not go unnoticed by researchers in other fields, like medical devices and biotechnology.

As game developers were pushing the limits of commercial GPUs, researchers from other domains began to realize that they could use these special processors for non-gaming tasks that resembled graphics problems. 

By translating the appropriate algorithms into graphics problems, a scientist could use a commercial GPU to decrease required compute time by orders of magnitude. At the same time, researchers in some fields were already creating proprietary processors and hardware to efficiently solve their most valuable parallel problems. The cost of such methods made parallel processing an expensive endeavor, out of the reach of most industries.

NVIDIA and ATI recognized this opportunity, and general-purpose parallel computing was born. They retrofitted their GPUs with instruction sets that were more general purpose in nature, and suddenly the research community could use mass-produced and much-less-expensive hardware to attack parallel problems. Companies that were building proprietary solutions began to take notice, and they quickly moved to the more cost-effective (but also less optimized) general-purpose parallel-computing products.
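To give a flavor of what that shift looks like in practice, here is a minimal, illustrative CUDA sketch (the buffer names, image size and threshold value are made up, not taken from any real device): a kernel that thresholds an image buffer, with each pixel handled by its own GPU thread rather than a single CPU loop.

#include <cuda_runtime.h>
#include <cstdio>
#include <cstdlib>

// Each GPU thread handles exactly one pixel: keep it if it is above
// the threshold, zero it otherwise. The per-pixel work is independent,
// which is what makes it a good fit for a massively parallel GPU.
__global__ void thresholdPixels(const float *in, float *out, int n, float level)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n)
        out[i] = (in[i] > level) ? in[i] : 0.0f;
}

int main()
{
    const int n = 512 * 512;              // one 512x512 image (illustrative)
    const size_t bytes = n * sizeof(float);

    // Host-side buffers filled with fake intensity data.
    float *h_in  = (float *)malloc(bytes);
    float *h_out = (float *)malloc(bytes);
    for (int i = 0; i < n; ++i)
        h_in[i] = (float)(i % 4096);

    // Copy the image to the GPU, run the kernel, copy the result back.
    float *d_in, *d_out;
    cudaMalloc((void **)&d_in,  bytes);
    cudaMalloc((void **)&d_out, bytes);
    cudaMemcpy(d_in, h_in, bytes, cudaMemcpyHostToDevice);

    const int threads = 256;
    const int blocks  = (n + threads - 1) / threads;
    thresholdPixels<<<blocks, threads>>>(d_in, d_out, n, 1000.0f);

    cudaMemcpy(h_out, d_out, bytes, cudaMemcpyDeviceToHost);
    printf("pixel 2048 after thresholding: %f\n", h_out[2048]);

    cudaFree(d_in);
    cudaFree(d_out);
    free(h_in);
    free(h_out);
    return 0;
}

On a CPU the same work would be one loop marching over a quarter-million pixels; on a GPU, thousands of threads process them at once, which is exactly the kind of speedup that made these cards attractive outside of gaming.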

Today, the availability and relatively low cost of GPU technology mean software developers from many industries can start crunching data sets that were too large to handle before parallel processing became mainstream. GPUs are helping to detect cancer cells in your body, create 3D diagnostic models of your anatomy, and even monitor the quality of the industrial processes that produce synthetic insulin. Hopefully you feel better knowing that the money you spent on that expensive gaming PC ten years ago helped fund the technology that is improving the quality of life for millions.

Are there any technologies that are being applied in a way that surprises you?  If so, let’s continue the discussion in the comments.

Josh Neland is a Technology Strategist at Dell, and you can follow him on Twitter at http://twitter.com/joshneland

Comments

This is great information. I work in the IT&T industry and could use this as an example in an upcoming speech.

One of the best articles/editorials I have read on Neowin.

kizzaaa said,
This is great information. I work in the IT&T industry and could use this as an example in an upcoming speech.

One of the best articles/editorials I have read on Neowin.

Glad you found it useful.

If I found out an Alienware was responsible for my life, I'd end it rather than be tortured by it.

Haha, I can just see doctors in the meeting room or whatever discussing build specs and ripping on Joe Bloggs because his computer is worse than all of the others haha

WiCkeD SaM said,
Presently, 8GB of RAM is more than enough for the hardcore gamers out there

Not if you want to run Metro at anything higher than 7FPS it isn't xD

Benjy91 said,

Not if you want to run Metro at anything higher than 7FPS it isn't xD

Metro 2033? Runs fine with 8GB, unless you're talking about something completely different...?

Some Xeon CPUs overclock better than their counterparts; also, who's to say people that use their computers to game don't also do video editing or use 3D modeling programs?

Field Commander A9 said,
Someone mentioning DirectX?
Guess that CT scanner is not running Linux

A lot of medical/research/military devices used for research, assessment or testing are running a scaled-down version of Windows, actually. Reliability is questionable but integration is impossible.

Field Commander A9 said,
Someone mentioning DirectX?
Guess that CT scanner is not running Linux

There's no telling - many of the devices are using Linux. I only included DirectX as a reference to the software interfaces that made it easier for developers to utilize GPU technology without coding/locking into a specific graphics card.

Question: when are we going to start seeing all these CUDA/OpenCL apps and stuff which ATi/nVidia were hyping about?

Tony. said,
Question: when are we going to start seeing all these CUDA/OpenCL apps and stuff which ATi/nVidia were hyping about?

Tony, it's happening right now. Every single medical vendor I've talked with has a product coming out based on CUDA/nVidia. The skill set is still extremely specialized, and there just isn't the same development tool ecosystem available right now compared to x86-based stacks . . . but these things are going to change over the next couple of years.

Josh Neland said,

Tony, it's happening right now. Every single medical vendor I've talked with has a product coming out based on CUDA/nVidia. The skill set is still extremely specialized, and there just isn't the same development tool ecosystem available right now compared to x86-based stacks . . . but these things are going to change over the next couple of years.

I know what you wrote, but I'm talking about consumer-based apps/products!

i.e. having Windows take advantage of it to speed certain tasks up, etc.

carmatic said,
This sounds like the GPU Folding@home client to me?

+1, should be an article on it! And maybe a video of the GPU visualization of a protein being made.