Posts posted by iProxy

    • Website A is under construction; it is closed to the public via the firewall, and maintenance mode is deactivated.
    • Website B is in maintenance mode but is accessible to the public (beta testers).

    In both cases the problem is the same. The web link we provided points to a sub-domain that hosts no website; it exists only to protect the data with an htpasswd. Both websites must remain inaccessible to the unauthorized public.
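    For clarity, the protection on that sub-domain is ordinary HTTP Basic authentication. A minimal sketch of the Apache configuration (the path and realm name are placeholders, not our real values):

        # .htaccess at the sub-domain's document root (Apache assumed)
        AuthType Basic
        AuthName "Private area"
        # placeholder path to the password file
        AuthUserFile /home/example/.htpasswd
        Require valid-user

    The password file itself is created with htpasswd -c /home/example/.htpasswd username.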

  1. I have the impression that there is some confusion. The website is online but closed to the public through the functionality IPS provides for this purpose. Here is a demonstration of the bug on an active website:

    PS: Sorry for the previous message; I clicked a button by mistake, could not find any way to delete the post, and there is no cancel option.

  2. Thank you for this additional information.

    Sitemap bug: I have several websites running on Invision; the settings are pretty much the same, yet on one site the sitemap does not work, which is why I asked for help. I will not go into further detail on the subject.

    Bad crawler/member: I have noticed on other products that guests can see photos in reduced dimensions (300x300) with an invitation to sign in to view them at full size. You tell me this is impossible, but IPS already does it, though only in IP.Download. You must have noticed that downloaded files are associated with a temporary key hash that prevents bots from downloading them. Why not extend this feature to the forums and Pages?
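    To illustrate what I mean by a temporary key: I do not know how IPS implements this internally, but the effect can be approximated with expiring signed links. A minimal sketch in Python (every name below is hypothetical, not IPS code):

        import hashlib
        import hmac
        import time

        SECRET = b"server-side secret"  # hypothetical; never sent to clients

        def sign_link(path: str, ttl: int = 300) -> str:
            """Build a download URL that stops working after ttl seconds."""
            expires = int(time.time()) + ttl
            msg = f"{path}:{expires}".encode()
            token = hmac.new(SECRET, msg, hashlib.sha256).hexdigest()
            return f"{path}?expires={expires}&token={token}"

        def verify_link(path: str, expires: int, token: str) -> bool:
            """Reject expired or forged tokens before serving the file."""
            if time.time() > expires:
                return False
            msg = f"{path}:{expires}".encode()
            expected = hmac.new(SECRET, msg, hashlib.sha256).hexdigest()
            return hmac.compare_digest(expected, token)

    A bot that saves the link and replays it later is rejected, because the expiry time is part of the signed message.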

    Consider a group of members discussing in a password-protected section of the forum. If a member opens their web browser and enables offline reading, they can view the page without being logged in and thus access the images; worse, they can share the web links of those images, allowing others to view them without being a member of the community. For me this is a real problem under the new copyright laws. Some companies specialize in finding and demonstrating the accessibility of such pictures; they even create an account on the forum, so they are indexed as a member and not as a guest, and IPS has no function to notify us of this abuse. By the time we notice the problem it is already too late, and all the methods we have tried proved ineffective in the long run. The policy recommends using the hash regeneration function (md5, sha1, sha-256) to solve the problem, which is a very good idea, but I do not know how to do it.

    Hash regeneration should even be automatic in some cases, for example when a topic is hidden by a moderator or when it is moved.
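    To make the request concrete: all that regeneration has to achieve is that the public URL changes, so every cached or shared link dies. A rough sketch of the mechanism in Python (a hypothetical helper, not an IPS function; a real implementation would also have to update every database reference to the file):

        import os
        import secrets

        def regenerate_name(directory: str, filename: str) -> str:
            """Rename a stored upload to a fresh random name so old URLs 404."""
            ext = os.path.splitext(filename)[1]
            # 40 hex characters, the same length as a sha1 digest
            new_name = secrets.token_hex(20) + ext
            os.rename(os.path.join(directory, filename),
                      os.path.join(directory, new_name))
            return new_name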


    1. I have tried it and still get the same error message.
    2. I have tried it and still get the same error message.
    3. I find your answer simplistic. When a crawler tries to access a topic that has been removed from public view and persists, sometimes it is blatantly obvious and sometimes it is not. What do you expect the host to do? It is not their job to monitor developments in "my" community.
    4. Another example: if a topic has been made public and then hidden by a moderator (made invisible) but not deleted, the crawler can still download the images and files. I am asking how to regenerate the hash code for images/files without having to delete them and re-upload them one by one.
  3. Hello,

    1) How do I change a value in yourwebsite.tld/invision/robots.txt?

    • User-agent: *
      Crawl-delay: 10 -> 30
      

    This information is generated by the website, as the robots.txt file does not exist on the FTP server (a possible workaround is sketched after point 4 below).

    2) While trying to deactivate the generation of the sitemap, I get an error. Whether I enter the correct information or not, I always get the same error message.

    [screenshot of the error message]

    3) This results in the website slowing down. Why are there no tools in Invision to identify the bad crawlers that do not respect the rules of the robots.txt file? The site would gain in security by blacklisting them (DDoS, brute force, flooding, ...), for example by redirecting them to the site's robots.txt (a blocking sketch follows point 4 below).

    4) Although the website is closed to the public with the Invision banner (site offline), crawlers continue to access the images, which is problematic. How do I regenerate the hash codes of the images?
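    Regarding point 1: since the file does not exist on disk, my assumption (not an official IPS answer) is that a static robots.txt placed in the document root would be served by the web server instead of the generated one and could carry the value I want:

        User-agent: *
        Crawl-delay: 30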
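    Regarding point 3: in the meantime I block the worst offenders at the web-server level. A minimal .htaccess sketch (Apache assumed; "BadBot" is an example User-Agent, not a real one):

        <IfModule mod_rewrite.c>
            RewriteEngine On
            # refuse requests whose User-Agent matches, case-insensitively
            RewriteCond %{HTTP_USER_AGENT} "BadBot" [NC]
            RewriteRule .* - [F,L]
        </IfModule>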

  4. Hi all,

    For some time I have noticed that when I interact with the AdminCP, each interaction generates three errors (error, index.html, index.cgi, index.pl). According to my host it is a problem with the template. I get "referer" errors only when I am logged into the AdminCP, but I do not know how to correct this recurring error message. Do you have any idea what I need to do to solve this problem?
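    For what it is worth, the trio index.html, index.cgi, index.pl looks like Apache walking through its default DirectoryIndex list after a failed lookup. If that guess is right, pinning the list in .htaccess should stop the extra errors (a sketch, assuming Apache; IPS itself only needs index.php):

        # only probe for the index file the suite actually uses
        DirectoryIndex index.php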


  5. The website was created with a product that does not exist in the list:

    [screenshot]

    So I have to import the data manually. How can I reactivate the "bug" that made it possible to insert data with an earlier date?

    --------------------------------------------

    Update: since I activated the feature, I get an error message:

    [screenshots of the error messages]
