Load protection
2006-01-20 17:27:44.832979+00 by Dan Lyke
4 comments
Does anyone know of an existing Linux app that will monitor CPU load and, if it gets too high, shut down some processes for a while? I know that the box isn't dead, because I can ping it, and I think that if I could set a threshold after which I drop Apache for a few minutes, then bring everything back up, I'd have a more general-purpose solution to my problem.
[ related topics: Open Source security ]
comments in ascending chronological order:
#Comment Re: made: 2006-01-20 20:45:09.125258+00 by: Shawn
Well, Hot Babe won't shut down processes, but she'll let you know when your CPU is under load ;-)
[Disclaimer: I haven't installed this yet, so I don't know how useful, or entertaining, it really is. And yes, I agree that we now need Hot Stud too ;-)]
#Comment Re: made: 2006-01-20 21:11:09.491355+00 by: dexev
#!/bin/bash
# just for fun
# (not tested)
CUTOFF=2                # one-minute load average that triggers the shutdown
SERVICE='apache'
SHUTDOWN_PERIOD=300     # 5 minutes
POLL_INTERVAL=10        # seconds between checks, so the loop itself doesn't spin the CPU
RESTART_TIME=0
while true; do
    if [ "$RESTART_TIME" -gt 0 ]; then
        NOW=$(date +%s)
        if [ "$NOW" -ge "$RESTART_TIME" ]; then
            /etc/init.d/$SERVICE start
            RESTART_TIME=0
        fi
    fi
    # integer part of the one-minute load average from /proc/loadavg
    LOAD=$(awk -F'.' '{print $1}' /proc/loadavg)
    if [ "$LOAD" -ge "$CUTOFF" ]; then
        /etc/init.d/$SERVICE stop
        RESTART_TIME=$(( $(date +%s) + SHUTDOWN_PERIOD ))
    fi
    sleep "$POLL_INTERVAL"
done
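To try it, save it somewhere (the filename below is arbitrary) and run it as root in the background, since starting and stopping the init script needs root:

nohup sh /usr/local/sbin/loadwatch.sh &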
#Comment Re: made: 2006-01-21 13:47:14.516917+00 by: DaveP
The most useful thing I've done on my server is to let the OS and Apache do more work, and PHP less.
For example, when I manually check my logs and notice a ref-spammer attack, I DENY FROM that IP
address in my .htaccess file (with a timestamp of when I banned it). I've thought about aging them off
to be nice to people who get a DHCP address that's been banned, but I haven't bothered yet. I just
manually delete the oldest entries when I think of it.
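The .htaccess side of that can be as small as the sketch below; the path, the address, and the comment format are invented for illustration, and it assumes .htaccess overrides are allowed for that directory.

# append a dated ban to the site's .htaccess (hypothetical path and address)
cat >> /var/www/htdocs/.htaccess <<'EOF'
# banned 2006-01-21 -- referrer spam
# (with mod_access's default "Order Deny,Allow", everyone else stays allowed)
Deny from 192.0.2.45
EOF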
If I'm facing other attacks from an IP, I DENY it at the IP level. Since I'm running OpenBSD, my server
has, as far as that IP can determine, disappeared from the net. There are growing netblocks that can't
even see my server because they're the source of repeated attempts to get in via ssh or sources of lots
of email spam or ref-spam. Or more than one of the above. Any IP address that's denied on more than
one protocol gets denied at the IP level when I notice the duplication.
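For reference, the pf side of an IP-level ban can be this small a sketch; the table name, list file, and sample netblock are invented, and it assumes a persistent table declared in /etc/pf.conf as shown in the comments.

# /etc/pf.conf would carry something like:
#   table <abusers> persist file "/etc/abusers"
#   block drop in quick from <abusers> to any
# after that, making the server vanish for a host or netblock is one command:
pfctl -t abusers -T add 192.0.2.0/24
pfctl -t abusers -T show    # review what's currently in the table

The "drop" keyword is what makes the box seem to disappear instead of actively refusing connections.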
I also try to make the pages I serve need less PHP processing: cache the result of rendering a
page, and serve that file. There's a cron task that deletes old cached renderings when they become
stale, so all the decisions about whether or not to serve a static page are made by Apache. Mod_rewrite
does the thinking for me, rather than PHP.
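A sketch of that arrangement, with the cache location and expiry time invented for illustration: a RewriteCond that tests for a pre-rendered file, plus a cron job that expires stale copies.

# Apache side (httpd.conf or .htaccess): hand out the cached rendering if one exists
#   RewriteEngine On
#   RewriteCond %{DOCUMENT_ROOT}/cache%{REQUEST_URI}.html -f
#   RewriteRule ^ /cache%{REQUEST_URI}.html [L]
# cron side: throw away renderings more than an hour old (assumes a find(1) that knows -mmin)
find /var/www/htdocs/cache -type f -name '*.html' -mmin +60 -exec rm -f {} \;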
I've been investigating OpenBSD's packet filter, pf. I've started on some scripts that will identify abuse from an IP
address automatically, but the last time I tested it, I missed one ref-spammer in the script and
false-positived a Google bot, so I obviously need to do more work on that. But my ideal is to have the
IP-level denies automated.
Finally, my server is connected to the net via a connection that is slow enough that I don't think I could
be brought down by a slashdotting (not that I want to try). The last time I ran the numbers, the pipe
wouldn't support enough bandwidth to bring my server down with legitimate requests. This was
actually done by bandwidth-limiting the connection to my machine on the router. As far as I can tell,
it's never caused a problem for a real user, but it did keep one DoS attempt from doing any damage.
#Comment Re: made: 2006-01-21 19:32:46.118579+00 by: Dan Lyke
I've started adding "Deny" lines to my Apache conf, but for the most part these are coming in from enough different IPs (for the same referrer source to different weblog entries) that I don't think that's a solution, unless I can automate it.
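One possible sketch of that automation, with the log path, the combined-log field position, and the referrer pattern all assumed: pull the offending IPs out of the access log and emit dated Deny lines ready to append.

PATTERN='some-spam-domain'              # whatever the current ref-spam run shares
LOG=/var/log/apache/access_log          # adjust to the real log location
awk -v pat="$PATTERN" '$11 ~ pat { print $1 }' "$LOG" | sort -u |
while read ip; do
    echo "Deny from $ip  # added $(date +%Y-%m-%d)"
done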
I did two things on top of my earlier attempt to check for bogus referrers:
- I've throttled Apache to a third of the default max processes (see the sketch after this list).
- I've changed a few of my heaviest database-query pages so that if the referrer is from outside, the user gets a super light-weight "click again to get the page you want" page instead. Since these are all excluded in my robots.txt, this shouldn't have any real effect on user experience.
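On the first of those, the throttle is a one-line httpd.conf change; the numbers below assume the stock prefork default of MaxClients 150.

# hypothetical httpd.conf edit -- a third of the stock 150 workers:
#   MaxClients 50
apachectl configtest && apachectl graceful    # check the config, then reload gracefully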
But I do think I'll also try to quantify what's happening when the machine gets that bogged down and put some "start killing things" processes in place too.
The Flutterby server is on way more pipe than is reasonable or necessary (one of the brilliant decisions made after I left Chattanooga.net was buying a building next to the railroad tracks right by most of the local telco switches; because of various regulatory issues, there are apparently telco companies paying just to run cables in and back out of the building), but one of the things I might do, if the hosting situation remains stable, is put two interfaces in it: provide all of the external access on one card, do my admin stuff via the other, and try to balance load so that there's always some bandwidth available on the admin interface.