
Search engine spiders


Guest kissybissy

Recommended Posts

Posted

"Interface
There will be a great focus on usability and streamlining functions. The introduction of a new template engine which allows for multiple default template sets, easier skin editing, and a brand new default skin is something we are very excited about. Search engine friendly URLs will not only make your community more interesting to search engine spiders but allow for more human-friendly linking."

Search engine spiders consume a great deal of bandwidth, which costs money. It was said that version 3.0 will make the community more interesting to search engine spiders. Will there be any way of preventing the spiders from visiting your forum, some kind of setting in the CP, without the need to install any kind of program or modification on the forum?

Posted

That's the problem. It's not so simple. Like myself, many users do not have the slightest idea how to install those programs. To be honest, all of that code on that site looks like Greek to me. Invision does not provide technical support for programs from other companies. Our hands are tied. So, considering they are making things easier for the spiders, a feature should be offered in the control panel to prevent them from visiting our site.

Posted

robots.txt is not something made by 'other companies', and it's not something you 'install' either. There are plenty of people who are knowledgeable about how to use a robots.txt file; if you ask, someone can tell you exactly what the file needs to contain. All you need to do then is upload that file to your site's root directory.

Posted

The robots.txt file is simply a .txt file (as the extension implies) that you upload to your site. Robots (well-programmed ones at least) look for it before indexing your site, and honor the commands in it.

If you put the following in a robots.txt file, it will block all spiders from your site completely:

User-agent: *
Disallow: /
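
And if you only wanted to keep one particular crawler out while leaving the rest alone, the file could look like this instead (the bot name 'ExampleBot' below is just a placeholder for illustration; replace it with the actual user-agent string of the spider you want to block):

User-agent: ExampleBot
Disallow: /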

Posted

Huh? No, we can't support a robots.txt file, nor can we control the search engine spiders that are out there. ;)

What you CAN do is force spiders to belong to a specific group and deny that group access to the board. This functionality is already built in.

Posted

I simply do not understand this stuff.


bfarber is saying that no, they will not officially provide support for the robots.txt file, and that they cannot control the spiders' actions (no one can, other than each individual spider's programmer), so they cannot provide a function in the AdminCP that will stop spiders indexing your board. However, you can prevent the spiders from spending a long time on your board (and so using less bandwidth) by putting the bots in a particular user group that cannot view the board. For guidance on how to assign bots to a user group, submit a support ticket via the IPS Client Area or refer to 'Peer-to-Peer Technical Support'.

