When a crawler tries to access a topic it does not have permission to access, it receives a permission-denied response; it cannot do anything else. As for IPB trying to figure out what load your server or site can handle: it can't. The software has no way of knowing whether it is on a shared host or a monster dedicated machine, or whether it is the only thing running on the server or is sharing it with other sites and applications. You need to monitor and control bad bots outside of the software. Good bots will follow the instructions you place in robots.txt. Bad bots that don't should be blocked from reaching the site at all, either at the server by denying their IP addresses or ASN (for example using CSF), or within some sort of WAF that sits in front of the site/server.
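As a minimal sketch of both approaches (the bot name, path, and IP address below are made-up placeholders, not IPB-specific values):

# robots.txt - well-behaved crawlers will honor this
User-agent: SomeAggressiveBot
Disallow: /

# CSF - deny a misbehaving bot's IP at the server firewall
csf -d 203.0.113.45 "aggressive crawler"

Keep in mind robots.txt is purely advisory; the CSF deny (or a WAF rule) is what actually stops a bot that ignores it.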
The best place to do this, I would suggest, is within some sort of WAF such as Cloudflare.
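For example, in Cloudflare a custom firewall rule with an expression along these lines (the user agent string and ASN here are placeholders) can block matching requests before they ever reach your server:

(http.user_agent contains "SomeAggressiveBot") or (ip.geoip.asnum eq 64500)

With the rule's action set to Block, anything matching that user agent or originating from that ASN is turned away at Cloudflare's edge.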
There is no way to do this.
If you go to the Sitemap tab... that value should already be pre-filled with something valid.
Can you actually open that path? It should be whatever address you use to access the ACP, but without "/admin" and with /sitemap.php added on the end.
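For example (example.com here is just a placeholder for your own domain):

ACP URL:     https://example.com/admin
Sitemap URL: https://example.com/sitemap.php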
Out of curiosity, have you tried this in a different browser to rule out a browser auto-fill issue? And have you also tried disabling any 3rd-party resources (applications/plugins)?