
# Thread: Has anyone performance tested a GIS application???

1. ## Has anyone performance tested a GIS application???

Hi,

Has anyone performance tested a GIS application before?
I am on a project to do performance analysis and testing of a GIS application.
I would appreciate some input/advice on what needs to be considered when testing such an application.

Regards,
IGA

2. ## Re: Has anyone performance tested a GIS application???

Yes.

The same models apply for building a performance load model as with any other app. Your test development approach may change depending on whether your GIS application is a two-tier thick client, web based, or web-services based, but that's about it.

'Pulley

3. ## Re: Has anyone performance tested a GIS application???

Most of my work involves mapping and modelling software, although typically not GIS in the strictest sense of the term. Performance on these systems tends to fall into two areas:

1 - Communication, or how the large amounts of data involved are passed through whatever conduit is used. GIS systems often deal with huge amounts of data, and this can be a major consideration. Never, as posted here, underestimate the bandwidth of a truckload of storage devices driving down the road. When moving large geographical datasets, we tend to use removable NAS devices and a courier.
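The "truckload of storage" point is easy to check with back-of-envelope arithmetic. This is a minimal sketch; the dataset size, link speed, and courier time are all assumed figures for illustration, not from the post:

```python
# Rough comparison: ship 10 TB of survey data on removable drives
# vs. push it over a 100 Mbit/s WAN link. All figures are assumptions.

DATASET_BYTES = 10 * 10**12       # 10 TB point-cloud dataset (assumed)
LINK_BITS_PER_SEC = 100 * 10**6   # 100 Mbit/s sustained link (assumed)
COURIER_HOURS = 24                # overnight courier (assumed)

transfer_hours = DATASET_BYTES * 8 / LINK_BITS_PER_SEC / 3600
print(f"Network transfer: {transfer_hours:,.0f} hours")  # ~222 hours (~9 days)
print(f"Courier:          {COURIER_HOURS} hours")
```

Under these assumptions the courier wins by roughly a factor of nine, which is why physical shipment of media is still common for bulk geographical data.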

2 - Computation. It is worth remembering that the default order of complexity for most 1D systems starts at n, for 2D systems it's n-squared, and for 3D systems it's n-cubed. For this reason, when dealing with any system working with 2D or 3D computations, it is vital to throw large amounts of data at it to see how it holds up. A small performance bug in a 3D algorithm can lead to the system simply stopping forever. For example, say I have a routine with an underlying complexity of n^3 and it takes 1 minute to process 1,000 elements: how long will it take for 2,000, 10,000 and 100,000? Let's have a look:

n = 1,000: n^3 = 1,000,000,000 operations in 1 minute ≈ 16,666,666 operations per second

n = 2,000: n^3 = 8,000,000,000 = 8 minutes (slow but bearable)

n = 10,000: n^3 = 1,000,000,000,000 = 1,000 minutes ≈ 16.6 hours (getting a bit painful here)

n = 100,000: n^3 = 1,000,000,000,000,000 = 1,000,000 minutes ≈ 694 days, or just under 2 years (Houston, we have a problem)

Faster computers or more storage won't help you much here, as they are linear factors applied to a non-linear problem. Most modern GIS systems are written with this in mind and perform remarkably well under stress; many 3rd party bolt-ons, however, have major performance issues in that they simply don't scale.
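The back-of-envelope timings above can be reproduced in a few lines. This sketch assumes only what the post states: 1,000 elements take 1 minute at n^3 complexity, which fixes the machine's throughput:

```python
# From the post: 1,000 elements take 1 minute at O(n^3),
# so the machine does 1,000^3 / 60 operations per second.
OPS_PER_SEC = 1_000 ** 3 / 60  # ~16.7 million ops/sec

def minutes_for(n: int, exponent: int = 3) -> float:
    """Minutes to process n elements at the given order of complexity."""
    return n ** exponent / OPS_PER_SEC / 60

for n in (1_000, 2_000, 10_000, 100_000):
    print(f"n = {n:>7,}: {minutes_for(n):>12,.0f} minutes")
# n =   1,000:            1 minutes
# n =   2,000:            8 minutes
# n =  10,000:        1,000 minutes
# n = 100,000:    1,000,000 minutes  (~694 days)
```

Doubling the hardware speed just halves each figure, so the 100,000-element case still takes nearly a year: exactly the "linear factor applied to a non-linear problem" point made above.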

4. ## Re: Has anyone performance tested a GIS application???

Just to add to the above, the size of many geographical data sets is also growing hugely at present, typically due to automated data collection processes such as IFSAR, LIDAR and similar point cloud collection techniques. Ten years ago, 100,000 points was considered a big model. Today 10,000,000 points is commonplace. This often simply breaks older GIS and CAD systems for the reasons given previously.

FWIW, the entire data set used for terrain modelling by Google Earth is available as a free download here. If you fancy a challenge and really want to check the performance of your GIS, try throwing this data at it :)
