
My Host Shut Me Down Citing Very High Server Load Caused By forums/index.php


Guest rakaposhi


Posted

Have a major problem: my forums caused a high server load, leading my host to shut me down. I've been re-enabled since, but it's happened a couple of times before over the past year, and I have no idea why. I'm on a shared server with low traffic, on the latest versions of IP.Board, Gallery and Blog, with only 2 mods installed, and yesterday IPS support checked it out and said they found no issues.

Here is what happened this time:

"Last evening, your web server paged into our Urge nt team for a high load
issue.

Upon logging into the server, it was found to have a load near 20. Most web
servers run under a load of one, with latency starting around a load of 2;
20 is obviously very high.

Upon investigation (using process programs such as top, Apache status
pages, and Apache logs), we determined which processes were taking up the
most CPU. In the first instance, after several Apache restarts and checks,
the below file was found to be using 80% of the CPU resources via Apache
per call:

/.../forums/index.php

What this means is that one HTTPD process (this is how it appears in our
process list) alone was using 80% of the entire CPU available to the
server. The script assigned to that process at that given time was your
index.php page, at which point the script was disabled.

The server then paged again, also for high load. Upon logging into the
server and performing the same exact checks, it was found this time to be
the below file. In this case, the script was using up to 30% CPU per call,
and took server load up to 15.

/.../forums/blog/myblog/index.php

Just because a site is not busy does not mean it cannot have a negative
effect on the server. In this case, your site was not overly busy; traffic
was not the issue. The issue is specifically how much resource is being
used for one attempted load of the above files. Should your site have
actually had a lot of traffic at the same time, the server would not have
been able to survive.

I am sorry for the inconvenience this has caused. But please understand you
are on a shared server, and careful investigation is taken prior to the
disabling of any customer's material. The same action would be taken on
someone else should they have been the ones causing the issue.

I can re-activate the scripts upon your response that you will review the
code or software to reduce its load on the server. But please also
understand that should I do so and it causes trouble on the server again,
it will again be disabled."
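
From what I can tell, the checks they describe boil down to something like this on a typical Linux box (the log path here is an assumption):

    top                # Shift+P sorts by CPU; look for runaway httpd processes
    apachectl status   # requires mod_status; shows which URL each Apache child is serving
    tail -n 5000 /var/log/httpd/access_log | awk '{print $7}' | sort | uniq -c | sort -rn | head
                       # rough count of the most-requested URLs in recent traffic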

So can anyone help? How can the script spike to using so much CPU? Frankly, I'm not entirely convinced by IPS Support's response - maybe someone here has encountered something similar?

Any help/advice will be appreciated.

cheers

Posted

Blog settings: disable Blocked Trackbacks




er.. do you mean set 'Allow Trackbacks' to No?

and you can only set CPU Saving & Optimization for the whole board, not per group... and also, a limit of 200 is illogical, is it not, when my host says 'most web servers run under a load of 1'...?
Posted

So can anyone help? How can the script spike to using so much CPU? Frankly, I'm not entirely convinced by IPS Support's response - maybe someone here has encountered something similar?



I've only experienced this with v2.2 on Apache; is that the same version you're using? If so, it might help to upgrade - cf. the bug report and discussion thread. But if you're using the newest version, then I've got no idea.
Posted

I'm using v2.2.2



I have now switched the spider logs off - but I don't think that's the cause...



I hope that's the cause. A couple of years ago I had 3,000 spider log entries in one day; now I always switch it off.
It's also stored in your database.

The shoutbox is also a danger; it generates a high traffic load if you have a lot of members.
We have set the refresh time to 20 minutes now.
On first install it was 15 seconds - lol, the server complained too.

greetings
Posted

Ironically, one of the problems when troubleshooting something like this is that it happens so infrequently. If it only happens a couple of times a year, then even if it's very serious, it can be hard to track down what's triggering it.

Posted

er.. do you mean set 'Allow Trackbacks' to No?




My webservers run a load of up to 20 with 2-second load times, 1 second of which is devoted to executing PHP due to cache misses :( but that's no big deal.

That's about 40 req/s to index.php :P

Your host really needs to learn how to run a webserver. What's your average board use? If you have fewer than 40 active users at any given time, a standard Apache webserver should be able to handle IP.Board with ease. If they don't know what IP.Board is, then I would stay far, far away from them ;) Also, if you are paying less than 5-10 USD a month, that's too cheap too.

dheyrman, I run the shoutbox with a 1-second refresh time with no problem. It's all in how the company sets up the server; keep in mind I have about 4 other forums with traffic, and they don't feel the shoutbox on the other site.

So can we get a link to your site so I can analyze whether it's different from most IPB installs? I can only guess that your host doesn't use PHP opcode caching such as eAccelerator - a typical setup is sketched at the end of this post - which is their fault; anyway, if I disable it, my server just runs a higher load.

Average load is a mix of things: CPU, memory, hard drive transfer, etc.

MySQL servers can run loads of 100+ if tuned properly, and webservers can handle a load of about 20 with no problems; it also depends on what else runs on the server. 30% CPU for a PHP process isn't much - it should finish within a second, and if it takes any longer, that's the host's problem.
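
For reference, a typical eAccelerator setup in php.ini looks roughly like this - the extension path and sizes below are assumptions and vary per build:

    ; load the opcode cache (path depends on your PHP build)
    zend_extension = /usr/lib/php/modules/eaccelerator.so
    eaccelerator.enable = 1
    eaccelerator.shm_size = 16            ; MB of shared memory for compiled scripts
    eaccelerator.cache_dir = /tmp/eaccelerator
    eaccelerator.check_mtime = 1          ; recompile when a script file changes

Create the cache dir first (mkdir /tmp/eaccelerator; chmod 777 /tmp/eaccelerator) and restart Apache afterwards.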
Posted

Try creating a robots.txt to block all spider bots except Google, Google AdSense, Gigablast and MSN; a sketch is below.

Yahoo Slurp, Voyager and many others tend to suck pages down at high speed, causing Apache to spawn more httpd processes than necessary.

My server is co-located, and I have been pushed to the max, causing no access and forcing me to reboot the server. Now that I have tweaked my robots.txt, it has kept the server from maxing out and the load to a minimum.

Watch access.log and you will see lots of spider bots crawling at high speed.
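
Something along these lines - the user-agent strings are the ones those crawlers advertised at the time, so double-check them against your own logs:

    # robots.txt - allow only the crawlers named above
    User-agent: Googlebot
    Disallow:

    User-agent: Mediapartners-Google
    Disallow:

    User-agent: Gigabot
    Disallow:

    User-agent: msnbot
    Disallow:

    # everyone else stays out
    User-agent: *
    Disallow: /

An empty Disallow: means nothing is off-limits for that bot; well-behaved crawlers match their own section before falling through to the * rule.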

Posted

ecantrell, that isn't the way to fix your server. Give your specs and site traffic in the server optimization post that's pinned, along with your confs, and I'll give you a proper setup, unless your hardware isn't sufficient.

I've never seen a load higher than 2 naturally, and we run high-setting srcds servers and BitTorrent.

I'd rather have my clients' sites spidered by the new, unpopular spiders than be stuck with only Google and such - you know, people use other search engines. Also, why do you want Yahoo blocked?

Posted

ecantrell, that isn't the way to fix your server. Give your specs and site traffic in the server optimization post that's pinned, along with your confs, and I'll give you a proper setup, unless your hardware isn't sufficient.

I've never seen a load higher than 2 naturally, and we run high-setting srcds servers and BitTorrent.

I'd rather have my clients' sites spidered by the new, unpopular spiders than be stuck with only Google and such - you know, people use other search engines. Also, why do you want Yahoo blocked?



Does that mean you can do Mac OS X? :D I don't think you do.

I have been watching Activity Monitor.app and access.log, and I can see it. It's spider season; a few months ago it was relatively quiet... no maxing out till a few days ago.

See the chart below - this is for the month of May. Hmm... the number of spider bots has exploded. I had to tweak the robots.txt (this time I block ALL and leave a few good ones open). Some bots ignore robots.txt, so I had to ban them.
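
If you want to put numbers on it, something like this over the log works (the log path and user-agent pattern are assumptions; adjust to whatever actually shows up for you):

    # hits per client IP from bot-looking user-agents in the recent log
    tail -n 20000 /var/log/apache2/access_log \
      | grep -i -E 'slurp|voyager|crawl|spider|bot' \
      | awk '{print $1}' | sort | uniq -c | sort -rn | head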
Posted

Speaking of bots that ignore robots.txt:

In access.log I can see that 38.113.234.180 is still accessing the site even though I have it banned.

If I let it go through, it fills up "Online Users" and goes up to 5 or more pages of all the same IP address.

Then I look at Activity Monitor.app (my server is a Mac running Apache) and I can see that it has more than 50 httpd processes and growing. When I added 38.113.234.180 to the ban list, I had to restart Apache because it got stuck.

And when 38.113.234.180 tries again, its access is foiled and the httpd count stays low and under control.

So 38.113.234.180 is a spider bot called "voyager", as per http://adminter.net
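
Worth noting: if that ban list is the board's admin-panel one, each hit still costs a full Apache+PHP request, which is how httpd piles up. Denying at the web server level is much cheaper. In httpd.conf or .htaccess (Apache 1.3/2.x syntax), something like:

    # refuse this bot before PHP ever runs
    Order Allow,Deny
    Allow from all
    Deny from 38.113.234.180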

Posted

I could set up a Mac to use a different httpd than Apache and it'd have an 'ok' load, but WHY do you use OS X? Kernel 2.6 has a lot better network and stat performance.

If your server is only running an Invision board and no other high-demand sites, I'd say it should perform better. OS X is pretty much Unix anyway - I've worked in the OS X terminal a few times, and I know people who use it.

I'd never recommend OS X for a webserver.

Kernel 2.6 for the webserver and BSD for MySQL, always.

Posted

Your host really needs to learn how to run a webserver. What's your average board use? If you have fewer than 40 active users at any given time, a standard Apache webserver should be able to handle IP.Board with ease. If they don't know what IP.Board is, then I would stay far, far away from them ;) Also, if you are paying less than 5-10 USD a month, that's too cheap too.

So can we get a link to your site so I can analyze whether it's different from most IPB installs? I can only guess that your host doesn't use PHP opcode caching such as eAccelerator, which is their fault; anyway, if I disable it, my server just runs a higher load.


Well, I host with pair.com, and I've never got the impression that they need to learn how to run a webserver... :)

I would say my site's the same as everyone else's really, but have a look for yourself: www.mambogani.com/forums

My theory is that the blogs are perhaps to blame. I had trackbacks enabled, and perhaps spammers were hitting my blog, although I have 'trackback spamblock' turned on in the Blog options, as well as 'Approve trackbacks' - so there's no way I can just get trackback spammed. Also, I checked my blog and there were no trackbacks needing approval... so that theory looks shaky. However, the fact remains that on the last two occasions this has happened, the forums script was shut down first, followed a little later by my blog - and I know that the blog relies on the forums code. In other words, what I'm saying is that if my blog was being hit, perhaps it's the forums script that would be flagged as using high resources.
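
One way I could test that theory, if pair.com gives me the raw access log, is to count POST requests against the blog - trackback pings are POSTs - with something like:

    # IPs sending the most POSTs to blog URLs
    awk '$6 ~ /POST/ && $7 ~ /blog/ {print $1}' access_log | sort | uniq -c | sort -rn | head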
Posted

It'd be the blog URL being hit with the spam; remove the spam trackbacks. Yes, your host needs to learn how to run a webserver - your site is tiny and it shouldn't ever use 30% CPU on average.

Looks like someone may have also sent a lot of requests to your index page, though. But 80% of the CPU? They should at least have a dual-CPU box...

Archived

This topic is now archived and is closed to further replies.
