How can I make my Nagios server more efficient?

My server runs Red Hat Enterprise Linux AS release 3 (Taroon Update 4), installed on an IBM machine with dual Intel® Xeon™ 3.60 GHz processors.

Monitoring the following:

```
Verifying NAGIOS …

Nagios 1.2
Copyright © 1999-2004 Ethan Galstad ([email protected])
Last Modified: 02-02-2004
License: GPL

Reading configuration data…

Running pre-flight check on configuration data…

Checking services…
	Checked 4018 services.
Checking hosts…
	Checked 4018 hosts.
Checking host groups…
	Checked 128 host groups.
Checking contacts…
	Checked 1 contacts.
Checking contact groups…
	Checked 1 contact groups.
Checking service escalations…
	Checked 0 service escalations.
Checking host group escalations…
	Checked 0 host group escalations.
Checking service dependencies…
	Checked 0 service dependencies.
Checking host escalations…
	Checked 0 host escalations.
Checking host dependencies…
	Checked 0 host dependencies.
Checking commands…
	Checked 47 commands.
Checking time periods…
	Checked 4 time periods.
Checking for circular paths between hosts…
Checking for circular service execution dependencies…
Checking global event handlers…
Checking obsessive compulsive service processor command…
Checking misc settings…

Total Warnings: 0
Total Errors: 0
```

I don't have problems with querying, but my main problem is the display. Loading summary.cgi or tac.cgi drives the machine to 0% idle.

Thanks in advance for the help.

:smiley:

Check the permissions on the sbin dir:
```
-rwxrwxr-x 1 nagios nagios 147324 Jun 27 15:19 avail.cgi
-rwxrwxr-x 1 nagios nagios 156444 Jun 27 15:19 cmd.cgi
-rwxrwxr-x 1 nagios nagios 117884 Jun 27 15:19 config.cgi
-rwxrwxr-x 1 nagios nagios 170844 Jun 27 15:19 extinfo.cgi
-rwxrwxr-x 1 nagios nagios 129660 Jun 27 15:19 histogram.cgi
-rwxrwxr-x 1 nagios nagios 111020 Jun 27 15:19 history.cgi
-rw-r--r-- 1 nagios nagios 110 Jun 5 14:07 .htaccess
-rwxrwxr-x 1 nagios nagios 107084 Jun 27 15:19 notifications.cgi
-rwxrwxr-x 1 nagios nagios 109116 Jun 27 15:19 outages.cgi
-rwxrwxr-x 1 nagios nagios 106572 Jun 27 15:19 showlog.cgi
-rwxrwxr-x 1 nagios nagios 154364 Jun 27 15:19 status.cgi
-rwxrwxr-x 1 nagios nagios 133884 Jun 27 15:19 statusmap.cgi
-rwxrwxr-x 1 nagios nagios 121340 Jun 27 15:19 statuswml.cgi
-rwxrwxr-x 1 nagios nagios 114780 Jun 27 15:19 statuswrl.cgi
-rwxrwxr-x 1 nagios nagios 125436 Jun 27 15:19 summary.cgi
-rwxrwxr-x 1 nagios nagios 133820 Jun 27 15:19 tac.cgi
-rwxrwxr-x 1 nagios nagios 296043 Jun 27 16:05 trends.cgi
```

Mine has the following:

```
-rwxrwxr-x 1 nagios nagiosadmin 139680 May 14 17:19 avail.cgi
-rwxrwxr-x 1 nagios nagiosadmin 140848 May 14 17:19 cmd.cgi
-rwxrwxr-x 1 nagios nagiosadmin 110560 May 14 17:19 config.cgi
-rwxrwxr-x 1 nagios nagiosadmin 153944 May 14 17:19 extinfo.cgi
-rwxrwxr-x 1 nagios nagiosadmin 122272 May 14 17:19 histogram.cgi
-rwxrwxr-x 1 nagios nagiosadmin 103380 May 14 17:19 history.cgi
-rwxrwxr-x 1 nagios nagiosadmin 99412 May 14 17:19 notifications.cgi
-rwxrwxr-x 1 nagios nagiosadmin 98088 May 14 17:19 outages.cgi
-rwxrwxr-x 1 nagios nagiosadmin 98836 May 14 17:19 showlog.cgi
-rwxrwxr-x 1 nagios nagiosadmin 143376 May 14 17:19 status.cgi
-rwxrwxr-x 1 nagios nagiosadmin 124616 May 14 17:19 statusmap.cgi
-rwxrwxr-x 1 nagios nagiosadmin 112160 May 14 17:19 statuswml.cgi
-rwxrwxr-x 1 nagios nagiosadmin 107368 May 14 17:19 statuswrl.cgi
-rwxrwxr-x 1 nagios nagiosadmin 115936 May 14 17:19 summary.cgi
-rwxrwxr-x 1 nagios nagiosadmin 120240 May 14 17:19 tac.cgi
-rwxrwxr-x 1 nagios nagiosadmin 122784 May 14 17:19 trends.cgi
```

Are we working on this in 2 threads again? If so, please don’t do that.

Replace the files with newly compiled versions. Perhaps they are corrupt.
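A minimal rebuild sketch, assuming the 1.2 tarball was unpacked under `/usr/local/src` and installed with the usual prefix; the paths and the `cgis`/`install-cgis` make targets are assumptions you should check against your own tree:

```shell
# Rebuild and reinstall only the CGIs from the Nagios source tree.
# Source path and install prefix are assumptions -- adjust to your layout.
SRC=/usr/local/src/nagios-1.2
if [ -d "$SRC" ]; then
  cd "$SRC"
  ./configure --prefix=/usr/local/nagios
  make cgis          # builds status.cgi, tac.cgi, summary.cgi, etc.
  make install-cgis  # copies them into <prefix>/sbin
else
  echo "Nagios source tree not found at $SRC"
fi
```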

OK, I will try that.

I have recompiled Nagios and have new CGIs in the sbin folder. Speed is still an issue. Any more ideas? When I run top, the highest utilization is status.cgi, at 25% of the CPU. Has anyone seen a similar situation?
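For anyone wanting to reproduce the observation, the same thing can be captured non-interactively with plain procps `ps` (nothing Nagios-specific):

```shell
# Snapshot the five biggest CPU consumers, sorted by %CPU descending.
ps -eo pcpu,pid,comm --sort=-pcpu | head -5
```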

Out of interest, what is the bandwidth usage like?

regards

AndiC

The connection is FE (Fast Ethernet); the bandwidth is 100 Mbit/s.

Sorry, I meant: what is Nagios's usage of the bandwidth like?

I'm just wondering what the limits of my setup will be.
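On Linux a rough figure can be had by sampling the per-interface byte counters twice and diffing them. Note this measures all traffic on the box, not just Nagios's, so it is only an upper bound:

```shell
# Print RX/TX byte counters per interface from /proc/net/dev,
# wait, then print them again; the delta over the window gives throughput.
# The 5-second window is arbitrary.
snapshot() {
  awk -F'[: ]+' 'NR > 2 { printf "%s rx=%s tx=%s\n", $2, $3, $11 }' /proc/net/dev
}
snapshot
sleep 5
snapshot
```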

regards

Andrew

AndiC, I wouldn't want to take the performance discussion away from graeman yet, since he is having trouble with his setup.
There is a person on this forum who has over 5000 service checks being performed, with only a little trouble.

I strongly suspect the trouble is with your Apache setup and not Nagios specifically. Why status.cgi takes up so much CPU must be a compile error or something similar. Once the page is displayed, there should be 0% CPU activity.

Try this:
meulie.net/forum_viewtopic.php?21.1997
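Another way to separate Apache from the CGI itself is to run the binary straight from a shell with a minimal CGI environment; if it still pegs a CPU there, Apache is off the hook. The install prefix, the basic-auth username, and the query string below are assumptions, so match them to your setup:

```shell
# Run status.cgi outside Apache with a hand-built CGI environment.
# Prefix, REMOTE_USER and QUERY_STRING are assumptions -- adjust as needed.
SBIN=/usr/local/nagios/sbin
if [ -x "$SBIN/status.cgi" ]; then
  export REMOTE_USER=nagiosadmin REQUEST_METHOD=GET QUERY_STRING="host=all"
  time "$SBIN/status.cgi" > /dev/null
else
  echo "status.cgi not found under $SBIN"
fi
```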

Show us the output of ldd.

No errors in my ldd. =)

```
ldd *

avail.cgi:
    libc.so.6 => /lib/i686/libc.so.6 (0x4001f000)
    /lib/ld-linux.so.2 => /lib/ld-linux.so.2 (0x40000000)
cmd.cgi:
    libc.so.6 => /lib/i686/libc.so.6 (0x4001f000)
    /lib/ld-linux.so.2 => /lib/ld-linux.so.2 (0x40000000)
config.cgi:
    libc.so.6 => /lib/i686/libc.so.6 (0x4001f000)
    /lib/ld-linux.so.2 => /lib/ld-linux.so.2 (0x40000000)
extinfo.cgi:
    libc.so.6 => /lib/i686/libc.so.6 (0x4001f000)
    /lib/ld-linux.so.2 => /lib/ld-linux.so.2 (0x40000000)
histogram.cgi:
    libgd.so.1.8 => /usr/lib/libgd.so.1.8 (0x4001f000)
    libz.so.1 => /usr/lib/libz.so.1 (0x40051000)
    libm.so.6 => /lib/i686/libm.so.6 (0x4005f000)
    libpng12.so.0 => /usr/lib/libpng12.so.0 (0x40081000)
    libjpeg.so.62 => /usr/lib/libjpeg.so.62 (0x400a4000)
    libc.so.6 => /lib/i686/libc.so.6 (0x400c2000)
    libfreetype.so.6 => /usr/lib/libfreetype.so.6 (0x401fb000)
    /lib/ld-linux.so.2 => /lib/ld-linux.so.2 (0x40000000)
history.cgi:
    libc.so.6 => /lib/i686/libc.so.6 (0x4001f000)
    /lib/ld-linux.so.2 => /lib/ld-linux.so.2 (0x40000000)
notifications.cgi:
    libc.so.6 => /lib/i686/libc.so.6 (0x4001f000)
    /lib/ld-linux.so.2 => /lib/ld-linux.so.2 (0x40000000)
outages.cgi:
    libc.so.6 => /lib/i686/libc.so.6 (0x4001f000)
    /lib/ld-linux.so.2 => /lib/ld-linux.so.2 (0x40000000)
showlog.cgi:
    libc.so.6 => /lib/i686/libc.so.6 (0x4001f000)
    /lib/ld-linux.so.2 => /lib/ld-linux.so.2 (0x40000000)
status.cgi:
    libc.so.6 => /lib/i686/libc.so.6 (0x4001f000)
    /lib/ld-linux.so.2 => /lib/ld-linux.so.2 (0x40000000)
statusmap.cgi:
    libgd.so.1.8 => /usr/lib/libgd.so.1.8 (0x4001f000)
    libz.so.1 => /usr/lib/libz.so.1 (0x40051000)
    libm.so.6 => /lib/i686/libm.so.6 (0x4005f000)
    libpng12.so.0 => /usr/lib/libpng12.so.0 (0x40081000)
    libjpeg.so.62 => /usr/lib/libjpeg.so.62 (0x400a4000)
    libc.so.6 => /lib/i686/libc.so.6 (0x400c2000)
    libfreetype.so.6 => /usr/lib/libfreetype.so.6 (0x401fb000)
    /lib/ld-linux.so.2 => /lib/ld-linux.so.2 (0x40000000)
statuswml.cgi:
    libc.so.6 => /lib/i686/libc.so.6 (0x4001f000)
    /lib/ld-linux.so.2 => /lib/ld-linux.so.2 (0x40000000)
statuswrl.cgi:
    libm.so.6 => /lib/i686/libm.so.6 (0x4001f000)
    libc.so.6 => /lib/i686/libc.so.6 (0x40042000)
    /lib/ld-linux.so.2 => /lib/ld-linux.so.2 (0x40000000)
summary.cgi:
    libc.so.6 => /lib/i686/libc.so.6 (0x4001f000)
    /lib/ld-linux.so.2 => /lib/ld-linux.so.2 (0x40000000)
tac.cgi:
    libc.so.6 => /lib/i686/libc.so.6 (0x4001f000)
    /lib/ld-linux.so.2 => /lib/ld-linux.so.2 (0x40000000)
trends.cgi:
    libgd.so.1.8 => /usr/lib/libgd.so.1.8 (0x4001f000)
    libz.so.1 => /usr/lib/libz.so.1 (0x40051000)
    libm.so.6 => /lib/i686/libm.so.6 (0x4005f000)
    libpng12.so.0 => /usr/lib/libpng12.so.0 (0x40081000)
    libjpeg.so.62 => /usr/lib/libjpeg.so.62 (0x400a4000)
    libc.so.6 => /lib/i686/libc.so.6 (0x400c2000)
    libfreetype.so.6 => /usr/lib/libfreetype.so.6 (0x401fb000)
    /lib/ld-linux.so.2 => /lib/ld-linux.so.2 (0x40000000)
```