*sorry for the repost - I've only had 5 people in a week look at this on the performance testing forum*
I work for a large financial company in the Midwest that has moved to an integrated services environment. We have a variety of *nix servers (AIX, HP-UX, Linux, Solaris) in addition to your standard Windows boxes. Most of our performance-critical architecture runs on *nix, for obvious reasons.
As performance testers, our team faces a problem: when we are tasked with testing application X, we have to account for the other applications running on the shared hardware (apps A, B, and C share the WebLogic servers with X; D, E, and F hit the same DB; B, C, and D share Apache - you get the idea).
It is not feasible (for reasons I will not delve into here) for us to fire up load for all of the applications on the shared infrastructure, so we are looking for other solutions. Our (less than) ideal fix would be to find a *nix-based application that we can configure to artificially consume resources on a target server, simulating the activity of the other applications. For example, we would look at a server in production, check some key metrics (CPU utilization, disk I/O, available memory, etc.), and then configure the application on our test server to consume those same resources at the same rates. Does anyone have any suggestions?
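In case it helps anyone frame an answer: tools like `stress` (and its successor `stress-ng`) do roughly this on most *nix platforms - spin up configurable numbers of CPU, memory, and I/O workers. As a rough illustration of the kind of thing I mean, here is a minimal, hypothetical Python sketch (all function names and parameters are mine, not from any real tool) that pins cores, holds memory resident, and churns disk writes for a fixed window:

```python
import multiprocessing
import os
import tempfile
import time

def burn_cpu(stop_time):
    """Spin in a tight loop to hold one core near 100% until stop_time."""
    while time.time() < stop_time:
        pass

def hold_memory(mbytes, stop_time):
    """Allocate roughly `mbytes` MB, touch it so it is resident, and hold it."""
    block = bytearray(mbytes * 1024 * 1024)
    while time.time() < stop_time:
        time.sleep(0.1)
    del block

def churn_disk(mbytes, stop_time, path):
    """Repeatedly write, fsync, and remove a file to generate disk I/O."""
    chunk = b"x" * (1024 * 1024)  # 1 MB write unit
    while time.time() < stop_time:
        with open(path, "wb") as f:
            for _ in range(mbytes):
                f.write(chunk)
            f.flush()
            os.fsync(f.fileno())
        os.remove(path)

def simulate_load(cpu_workers=2, mem_mb=256, io_mb=64, seconds=10):
    """Run CPU, memory, and I/O workers in parallel for `seconds` seconds."""
    stop_time = time.time() + seconds
    tmp = os.path.join(tempfile.gettempdir(), "loadgen.tmp")
    procs = [multiprocessing.Process(target=burn_cpu, args=(stop_time,))
             for _ in range(cpu_workers)]
    procs.append(multiprocessing.Process(target=hold_memory,
                                         args=(mem_mb, stop_time)))
    procs.append(multiprocessing.Process(target=churn_disk,
                                         args=(io_mb, stop_time, tmp)))
    for p in procs:
        p.start()
    for p in procs:
        p.join()

if __name__ == "__main__":
    # Example: approximate a moderately busy box for ten seconds.
    simulate_load(cpu_workers=2, mem_mb=256, io_mb=64, seconds=10)
```

The numbers would come from whatever you observe in production (vmstat/iostat/sar), dialed in until the test box's metrics match. A real tool would obviously need finer control - target utilization percentages rather than raw worker counts - so this is just a sketch of the concept.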