
ERROR 500 has been disabling my website recently


Go to solution Solved by annedi,

Recommended Posts

Posted

Oh!! OK. Thank you for getting me through this ERROR 500 problem!!

I simply didn't initially realize that the problem was needing larger hosting. Now it all makes sense. 😀 I will go upgrade my hosting service.

Posted

FUDGE! 

I spoke too soon!!

1) We have unlimited bandwidth. Our hosting service assures me that the problem is not one of needing larger hosting.

2) Here is the reply from Site5, our hosting service, which indicates that "too many concurrent requests" is not a traffic problem.

[attached screenshot: Site5's reply]

 

3) I think we might be getting hit by a bad spider, Bytespider. So I have added a block to ~public_html/content/.htaccess. I don't know if this is the cause of the problem, but it can't hurt to try this block.
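For reference, the block I added looks roughly like this. This is a sketch only; the exact syntax depends on the host's Apache setup, and it assumes mod_rewrite is enabled (as it usually is on cPanel hosts):

```apache
# Deny any request whose User-Agent header contains "Bytespider"
# (case-insensitive); matching requests get 403 Forbidden.
RewriteEngine On
RewriteCond %{HTTP_USER_AGENT} Bytespider [NC]
RewriteRule .* - [F,L]
```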

 

Posted

Too many searches are being requested by "guests", so I have temporarily set the search throttling there to 1 hour. Maybe automated bots are generating too many simultaneous searches?

Posted

Quite simply, if your hosting is not coping with the requests, you would need a better host. If it's coming from a huge number of requests from the same source, that's something your host should really be dealing with. You can of course block searches from guests hitting the search bar, if you want to do so. See module permissions here

 

  • Solution
Posted

I fumbled and needed to edit the preceding post, but got locked out. So here it is again, complete this time.

[hr]

SOLUTION (some of this is "notes to myself" for future reference)

After I created a Trouble Ticket, the Site5 tech discovered a Quality-of-Service error ("too many concurrent requests") for UVP in the server log of our shared server. Site5 included an excerpt from the log, but made no suggestions for dealing with the problem.

I ran a WhoIs on one of the IP addresses associated with the QoS error and learned that the requests originated from Singapore.

That led me to ask: were we being hit on or scraped by a bad crawler?

UVP's robots.txt instructs such bots and crawlers to go slowly. But obeying robots.txt is not mandatory. Bots misbehave all the time. If a bot is creating too many simultaneous requests for different pages and page elements, the server and the software cannot cope with the dense traffic.
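For reference, the relevant robots.txt hints look something like this. Note that Crawl-delay is a non-standard extension, and badly behaved bots ignore robots.txt entirely, which is why it did not help here:

```
# Ask all crawlers to wait between requests (non-standard; advisory only)
User-agent: *
Crawl-delay: 10

# Some admins also tell Bytespider to stay out entirely
User-agent: Bytespider
Disallow: /
```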

Next I looked at our website statistics (cPanel>Metrics>Awstats), where it was clear that there was indeed some unusual Robot/Spider traffic beginning in May and dramatically increasing in the first half of July. In July we all saw constant ERROR 500 pages.

Then I looked in our UVP raw access logs (cPanel>Metrics>Raw Access) in order to attempt to find who was hitting UVP excessively. The requests in our raw access log coming from the Singapore IP address identified themselves as Bytespider in the User-Agent string. (Google it to see that it has a bad bot reputation.)
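For anyone repeating this kind of raw-log check, it can be sketched from the shell. The log excerpt, IP addresses, and filename below are invented for illustration; a real cPanel raw access log uses the same combined log format:

```shell
# Hypothetical excerpt of a cPanel raw access log (combined log format).
# The IPs, paths, and filename are made up for illustration.
cat > sample_access.log <<'EOF'
203.0.113.7 - - [10/Jul/2024:01:02:03 +0000] "GET /search/ HTTP/1.1" 200 5123 "-" "Mozilla/5.0 (compatible; Bytespider; spider-feedback@bytedance.com)"
198.51.100.4 - - [10/Jul/2024:01:02:04 +0000] "GET / HTTP/1.1" 200 2048 "-" "Mozilla/5.0 (Windows NT 10.0; rv:115.0) Firefox/115.0"
203.0.113.7 - - [10/Jul/2024:01:02:05 +0000] "GET /topic/42/ HTTP/1.1" 200 4096 "-" "Mozilla/5.0 (compatible; Bytespider; spider-feedback@bytedance.com)"
EOF

# Which IPs sent the most requests?
awk '{print $1}' sample_access.log | sort | uniq -c | sort -rn

# What does the busiest IP call itself? (User-Agent is the last quoted field)
grep '^203\.0\.113\.7 ' sample_access.log | head -1
```

A WhoIs lookup on the busiest IP, plus a look at its User-Agent string, is what tied the traffic to Bytespider in my case.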

Bytespider was blocked in .htaccess, and UVP's web server error log (cPanel>Metrics>Errors) is now showing several recent denials to Bytespider. Will Bytespider ever just give up trying to hit us? Probably not.
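To confirm the block is working, you can also count the 403 (Forbidden) responses served to the bot in the access log. Again, the log lines and filename here are invented for illustration:

```shell
# Hypothetical access-log excerpt after the .htaccess block took effect:
# Bytespider's requests now receive HTTP 403 (Forbidden).
cat > sample_after_block.log <<'EOF'
203.0.113.7 - - [12/Jul/2024:09:00:01 +0000] "GET / HTTP/1.1" 403 199 "-" "Bytespider"
203.0.113.7 - - [12/Jul/2024:09:00:09 +0000] "GET /search/ HTTP/1.1" 403 199 "-" "Bytespider"
198.51.100.4 - - [12/Jul/2024:09:00:15 +0000] "GET / HTTP/1.1" 200 2048 "-" "Mozilla/5.0"
EOF

# Count denials; the status code is the 9th whitespace-separated field.
grep 'Bytespider' sample_after_block.log | awk '$9 == 403' | wc -l
```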

We are no longer seeing any ERROR 500 pages. While I do not have rigorous proof that Bytespider was causing the problem, I'm going to assume for now that the problem has been fixed. I will continue to monitor more closely both our Unique Visitor traffic and our Robots/Spider traffic.

[hr]

FWIW, our Unique Visitor count never exceeds 8500. This is well within our shared server service limits. I was assured by Site5 that we do not need a larger service plan. We are really a small, specialized forum and typically do not exceed 50 viewers per day. We rarely have more than 20 viewers at any one time.

[hr]

Perhaps my write-up here will be helpful to someone else. I hope so. 

Can you delete the first, incomplete SOLUTION post? Thanks.
