
GIS software - how good is it at using multicore and GPU - really?


Question

Hi! 

I have a couple of questions about GIS software that seem a bit hard to find answers to, so I thought maybe someone here knows, or has even tested this.

 

It's about GIS programs: ESRI's general ArcGIS packages, and QGIS, the open-source alternative that seems to be quite good.

 

Basically what I am wondering is: 

1. How good are these programs at using multicore CPUs?

Can they make use of Hyper-Threading, Xeon-class hardware, or simply 6-8 cores instead of 2?

 

2. Does anyone know of a GIS program that can use GPGPU acceleration for visualisation, or where a bigger and better GPU makes sense?

 

All this is with a view to making the programs responsive when switching layers and visualising, i.e. not necessarily doing lots of statistical operations.

 

I didn't post this in the Hardware hangout because it's about the programs' capabilities more than about what hardware is realistic to use. That's kind of the next question.

 

/Regert. 


5 answers to this question



Used ArcGIS on a quad-core Core 2 Quad Q6600; never saw all four cores maxed out when switching between layers. Can't comment on GPU load, though.
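If you want to reproduce that kind of check yourself, here's a rough sketch (assuming Python with the third-party psutil package, which isn't part of any GIS install) that logs per-core load once a second while you flip layers in the map view:

    import psutil  # third-party: pip install psutil

    # Sample per-core CPU utilisation once a second for 30 seconds
    # while switching layers in the GIS client; one row per sample.
    for _ in range(30):
        per_core = psutil.cpu_percent(interval=1.0, percpu=True)
        print(" ".join("%5.1f%%" % p for p in per_core))

If only one column ever pins near 100% while the others idle, the program is effectively single-threaded for that operation.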

 

 

http://support.esri.com/es/knowledgebase/techarticles/detail/31903

 

http://blogs.esri.com/esri/apl/2010/03/30/computations-on-vector-data-using-a-gpu/

 

I'd read the first article before posting, though it didn't really answer my question since it seemed a bit dated.

The second article is interesting, though. I'm looking at this more from an "advanced map viewer" perspective than from a GIS-analysis one, but one where the rendering needs to be as fast as possible.

Your answer is therefore interesting, because it kind of leads to the question: how fast can a GIS dataset actually be visualised/rendered?

Many compare these programs to Google Earth, even though Google Earth isn't a GIS in the strict sense.

 

/Regert



As a geoarchaeology major I have used ArcGIS quite extensively; unfortunately, I have no experience with the GPU/CPU load on it either. It's just not something I've thought about in a classroom situation, because as long as it works and I make the grade, I don't care.

 

However, on my home Mac mini I have used Global Mapper and DeLorme XMap (in Parallels) to georeference old map files (GeoTIFFs and PDFs), ranging from vintage quadrangles to aerial photos. I can say that with these two programs, changing the amount of GPU RAM and the number of cores dedicated to the Windows VM makes a huge difference when manipulating and rendering these maps.



OK, that sounds quite interesting, even if anecdotal. But how big was the difference? The problem with these questions is that they depend heavily on lots of really detailed technical factors.



Multicore in GIS is getting better, but it's not there yet. ESRI's ArcGIS 10 was the first release that actually bothered addressing multiple cores/CPUs. 10.1 and 10.2 do an *ok* job - they'll use extra cores for render/UI threads, but the real meat of the question is in the geoprocessing tools, and this is where it's hit and miss. Some have been rewritten to efficiently parallelise, but some haven't. It's not a trivial thing to do, given the relative complexity of some of the tools.
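For the tools that have been parallelised, you can at least control how many cores they're allowed to grab through the geoprocessing environment. A minimal sketch, assuming ArcGIS 10.1+ with ArcPy and a Spatial Analyst licence (the tool and paths are just examples):

    import arcpy
    from arcpy.sa import Slope

    # Spatial Analyst is assumed licensed here.
    arcpy.CheckOutExtension("Spatial")

    # Let parallel-aware geoprocessing tools use every available core;
    # "50%" would cap them at half, and "0" disables parallelism.
    arcpy.env.parallelProcessingFactor = "100%"

    # Example run - whether a given tool actually honours the setting
    # is exactly the hit-and-miss part described above.
    out = Slope(r"C:\data\elevation.tif")
    out.save(r"C:\data\slope.tif")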

 

I'm not aware of any tools with GPGPU support built in. ArcGIS does have a fairly good Python scripting environment, though, so if you write your own stuff this might help: a bit of ArcPy with some PyCUDA might get you some good results. ArcGIS Pro is coming in the 10.3 timeframe and is nicely 64-bit, properly hardware accelerated, and truly multithreaded, but it's not intended to replace ArcView, rather to complement it. Like I said, it's getting better, but we're not there yet. Right now, if you want a computer that handles GIS workloads best, you want the CPU with the best single-threaded performance you can find, a fair amount of RAM, and big, fast SSDs.
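To make the ArcPy + PyCUDA idea concrete, here's a rough sketch: pull a raster into NumPy, do a trivial elementwise rescale on the GPU, and write it back. The paths and the unit conversion are made up, and it assumes a working CUDA install alongside ArcGIS:

    import arcpy
    import numpy as np
    import pycuda.autoinit            # sets up a CUDA context on import
    import pycuda.gpuarray as gpuarray

    # Pull a raster into a NumPy array via ArcPy (path is hypothetical).
    src = arcpy.Raster(r"C:\data\elevation_ft.tif")
    arr = arcpy.RasterToNumPyArray(src).astype(np.float32)

    # Ship the array to the GPU, do a trivial elementwise rescale there
    # (feet to metres), and copy the result back to host memory.
    gpu = gpuarray.to_gpu(arr)
    metres = (gpu * np.float32(0.3048)).get()

    # Write it back out, re-attaching the source georeferencing.
    out = arcpy.NumPyArrayToRaster(metres,
                                   arcpy.Point(src.extent.XMin, src.extent.YMin),
                                   src.meanCellWidth,
                                   src.meanCellHeight)
    out.save(r"C:\data\elevation_m.tif")

For maths this trivial, the copies to and from the GPU will eat any speedup; the approach only starts to pay off for heavier per-cell work.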

 

And as for responsiveness, be prepared for ArcView to crash at the drop of a hat. Save often.

 

EDIT - Probably worth mentioning QGIS as well: while ESRI is the Microsoft Office of GIS, QGIS has come on in leaps and bounds recently. QGIS 2.4 added multithreaded rendering, so it divides map rendering across cores quite nicely; when I tried it, it did better, but still choked on some larger datasets. Not sure how well its analysis tools use multiple cores.
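If you want to see that multithreaded renderer in action, something like this in the QGIS Python console (a rough sketch against the 2.4+ API, with a made-up output path) renders the current canvas off-screen and times it:

    import time
    from qgis.core import QgsMapRendererParallelJob
    from qgis.utils import iface

    # Take the render settings straight from the open map canvas.
    settings = iface.mapCanvas().mapSettings()

    # QgsMapRendererParallelJob renders layers on worker threads -
    # this is the multicore renderer that arrived in QGIS 2.4.
    job = QgsMapRendererParallelJob(settings)

    start = time.time()
    job.start()
    job.waitForFinished()
    print("rendered in %.2f seconds" % (time.time() - start))

    # The result is a QImage you can inspect or save.
    job.renderedImage().save("/tmp/render_test.png")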

