soup4you2
April 28th, 2004, 20:54
Ok, I've been given a task to take 5 servers and have various information plotted to graphs on a single webpage, so we can see performance-related stuff. These checks include:

-Graphs of disk usage / free space
-Graphs of CPU Load vs User processes load
-Graphs on memory
-Graphs on how long it takes to load various websites we host
-Etc.....

So I've pretty much gotten it all working beautifully, except for how long it takes to load a webpage.

I've written some shell scripts that download the page with wget and use the time command to determine how long it takes, so now I get output like:

0.70
0.70
0
0

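For context, a minimal sketch of the kind of script that produces those four lines (the URL is just an example, and the exact time invocation varies by system):

#!/bin/sh
# Rough sketch of a page-timing check in MRTG's external-script format:
# four lines of output (value, value, uptime, target name).
URL="http://www.example.com/"

# GNU time -p prints "real N.NN" on stderr; pull out the seconds.
SECS=`( /usr/bin/time -p wget -q -O /dev/null "$URL" ) 2>&1 | awk '/^real/ {print $2}'`

echo "$SECS"   # first value
echo "$SECS"   # second value
echo 0         # uptime (not used here)
echo 0         # target name (not used here)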
I've read this is what MRTG wants, however it does not like the decimal at all, and I'm not getting anything plotted on the graph unless it's over a second. This is not good enough for us, so I've started to think MRTG sucks and RRDtool (its successor) is much better, but I haven't had the time to read up on it too much yet. Would anybody know of a simpler way of doing this?

frisco
April 28th, 2004, 23:58
Multiply the results by 10 (or 100) and change the data label to "tenths of a second" (or "hundredths of a second").
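A rough sketch of what that looks like in the check script (the variable names are placeholders, and bc is just one way to do the arithmetic):

SECS=0.70                                           # timing measured with wget as above
HUNDREDTHS=`echo "$SECS * 100" | bc | cut -d. -f1`  # 0.70 -> 70

echo "$HUNDREDTHS"   # first value, in hundredths of a second
echo "$HUNDREDTHS"   # second value
echo 0
echo 0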

Worst case, Perl's GD::Graph isn't that tough to figure out.

soup4you2
April 30th, 2004, 17:57
Thanks... worked great.