
Full SSL support


Makoto

Recommended Posts

Posted

The problem is that the CSS is not served as files; it is all embedded in the page, which is a terrible thing.




I'm not sure where you are going with this and how it relates to running your site under SSL.
Posted

From an SEO perspective, it's a disaster unless there is a way to 301-redirect all the existing HTTP pages to HTTPS.




Going from HTTP to HTTPS will not harm SEO or links in any manner; older links will simply be routed to HTTPS. Please understand what you are talking about before throwing out words like "disaster".
Posted

Yes, and not setting up that 301 properly is an SEO disaster.
I'd prefer to see the HTTPS option for members only, though... so it's a good feature request, since many seem to need it.

Posted

If you would like to talk SEO, please start another topic in the feedback section; this topic is about SSL. Your fear is misplaced on this issue. If it helps, and if you are even using SSL on your site, you can add a simple redirect as follows to keep all links working 100%. In time they will of course be re-indexed by the search engines, and both during and after that period the old links will still work.


RewriteCond %{HTTP_HOST} ^example.com$ [NC]
RewriteRule ^(.*)$ https://example.com/$1 [R=301,L]
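
One note on that snippet: as written it matches every request, including ones that already arrived over HTTPS, so on some configurations it can produce a redirect loop. A minimal loop-safe sketch, assuming Apache with mod_rewrite and example.com replaced with your own domain:

RewriteEngine On
# Only redirect requests that did not already arrive over SSL
RewriteCond %{HTTPS} off
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ https://example.com/$1 [R=301,L]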

Posted

Yes yes, a 301 redirect, like I said.

There are all kinds of reasons not to use HTTPS for guests/bots... it's a resource overhead, it's slower, it presents some additional SEO risks (e.g. a broken certificate could scare off the bots/search engines)... I only want to do HTTPS for logged in members, or I don't want to do it.

The reason I need this is that many of my members (me included when I'm at home) live in a country where government agencies monitor private connections, and saying the wrong thing in the shoutbox, for example, or in a PM, could land us in very hot water indeed. Further, in my home country, it is the webmaster who is prosecuted for illegal comments made by posters, even private comments... I'd like all of that encrypted as an additional layer of protection.

Posted

I'm not sure where you are going with this and how it relates to running your site under SSL.




Where am I going?

Serving all of the site's CSS embedded in every page instead of having it cached as CSS files?

Surely you are jesting; every page load sends all of the CSS instead of it being downloaded once!
Posted

When I transferred my entire site to SSL, it had no noticeable impact on SEO whatsoever.

I even ended up dropping it due to some Safari issues, and images and such not being passed through the SSL layer anyway. Still no issues.

Posted

Where am I going?

Serving all of the site's CSS embedded in every page instead of having it cached as CSS files?

Surely you are jesting; every page load sends all of the CSS instead of it being downloaded once!



I'm pretty sure you can change that somewhere; my forums (as well as the ones here) are definitely using external CSS files.
Posted

It does not work when using SSL; the setting is ignored by design.

Posted

Yeah... it was working with some skin edits, but they decided to change it... now the answer is that it is too hard and maybe they will address it in 3.4.

  • 10 months later...
Posted

I know this is an old topic, but it's the most relevant one and it pointed me in the right direction to make this work! I now have a forum that works over both HTTP and HTTPS.

What I did was edit conf_global.php to check whether the script was executing via HTTPS or HTTP, and then set $INFO['board_url'] to the https or http URL accordingly.

There are lots of PHP websites out there that show you how to check for HTTPS; the actual code that worked for my particular hosting was:

$INFO['board_url'] = 'http://www.myforum.com';
if ( !empty( $_SERVER['HTTP_X_FORWARDED_PROTO'] ) && $_SERVER['HTTP_X_FORWARDED_PROTO'] == 'https' ) {
    $INFO['board_url'] = 'https://www.myforum.com';
}

But others may have luck with:

$INFO['board_url'] = 'http://www.myforum.com';
if ( ( !empty( $_SERVER['HTTPS'] ) && $_SERVER['HTTPS'] !== 'off' ) || $_SERVER['SERVER_PORT'] == 443 ) {
    $INFO['board_url'] = 'https://www.myforum.com';
}

Don't forget to make a backup of your conf_global.php beforehand in case anything goes wrong!
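
For what it's worth, the two checks can be combined so the same conf_global.php works whether SSL terminates at the web server or at a proxy. This is only a minimal sketch, reusing the www.myforum.com placeholder from above:

$INFO['board_url'] = 'http://www.myforum.com';

// Treat the request as HTTPS if the server negotiated SSL itself,
// if an upstream proxy reports HTTPS, or if the request arrived on port 443.
$isHttps = ( !empty( $_SERVER['HTTPS'] ) && $_SERVER['HTTPS'] !== 'off' )
    || ( !empty( $_SERVER['HTTP_X_FORWARDED_PROTO'] ) && $_SERVER['HTTP_X_FORWARDED_PROTO'] == 'https' )
    || ( isset( $_SERVER['SERVER_PORT'] ) && $_SERVER['SERVER_PORT'] == 443 );

if ( $isHttps ) {
    $INFO['board_url'] = 'https://www.myforum.com';
}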

  • 5 weeks later...
Posted

If you would like to talk SEO, please start another topic in the feedback section; this topic is about SSL. Your fear is misplaced on this issue. If it helps, and if you are even using SSL on your site, you can add a simple redirect as follows to keep all links working 100%. In time they will of course be re-indexed by the search engines, and both during and after that period the old links will still work.

RewriteCond %{HTTP_HOST} ^example.com$ [NC]
RewriteRule ^(.*)$ https://example.com/$1 [R=301,L]

When I add this to my .htaccess file along with the code for the rewrite, I get a redirect loop error. Is there another way to force people to use HTTPS instead of HTTP? Thanks!

Posted

When I add this to my .htaccess file along with the code for the rewrite, I get a redirect loop error. Is there another way to force people to use HTTPS instead of HTTP? Thanks!

Your board URL in conf_global.php needs to point to the https version of your website.
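
In conf_global.php that is the board_url setting; for example, with www.example.com standing in for your own domain:

$INFO['board_url'] = 'https://www.example.com';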

Posted

Your board URL in conf_global.php needs to point to the https version of your website.

I've got the HTTPS URL in conf_global.php currently, and SSL works great. However, someone can type in the name of my site without the https and get the non-SSL version. With the .htaccess mod it would force visitors to use SSL, but when I put that into my .htaccess file along with the mod_rewrite rules it gives me a redirect loop error.

Posted

Make sure you have your board URL set to use https in your conf_global.php file, and also make sure you edit the above, replacing example.com with your own site domain.

Posted

I know this is an old topic, but it's the most relevant one and it pointed me in the right direction to make this work! I now have a forum that works over both HTTP and HTTPS.

What I did was edit conf_global.php to check whether the script was executing via HTTPS or HTTP, and then set $INFO['board_url'] to the https or http URL accordingly.

There are lots of PHP websites out there that show you how to check for HTTPS; the actual code that worked for my particular hosting was:

$INFO['board_url'] = 'http://www.myforum.com';
if ( !empty( $_SERVER['HTTP_X_FORWARDED_PROTO'] ) && $_SERVER['HTTP_X_FORWARDED_PROTO'] == 'https' ) {
    $INFO['board_url'] = 'https://www.myforum.com';
}

But others may have luck with:

$INFO['board_url'] = 'http://www.myforum.com';
if ( ( !empty( $_SERVER['HTTPS'] ) && $_SERVER['HTTPS'] !== 'off' ) || $_SERVER['SERVER_PORT'] == 443 ) {
    $INFO['board_url'] = 'https://www.myforum.com';
}

Don't forget to make a backup of your conf_global.php beforehand in case anything goes wrong!

I implemented this. It works!

However, I did look at the source code on the https version of a page, and the rel=canonical tag points to the https version of the page. This could certainly cause some duplicate content issues with search engines if someone were to post a link to the https version of a page. I am looking for a way to stop bots from indexing the https versions and have them index only the http version. This would be preferable for me because all my backlinks are http, and I wouldn't want any duplicate content issues.

If I can figure out something slick, I will report back.

Posted

OK, I figured out how to get past the duplicate content issue while serving both unsecured and secured versions of the website, without using any subdomains (after much searching on Google and testing other web stores' robots.txt/robots_ssl.txt files on both versions).

1. Create a txt file called robots_ssl.txt

2. Paste this into the file:

User-agent: Googlebot
Disallow: /

User-agent: *
Disallow: /

3. Upload robots_ssl.txt to your IPB root directory

4. For Apache servers, add this rewrite to .htaccess:

<IfModule mod_rewrite.c>
RewriteEngine On
Options +FollowSymLinks
RewriteCond %{SERVER_PORT} ^443$
RewriteRule ^robots\.txt$ robots_ssl.txt
</IfModule>
Doing this will internally rewrite any request for robots.txt that arrives over SSL to robots_ssl.txt (but not on the unsecured version, where your regular robots.txt will still be served). Bam, you can now serve both secure and unsecured versions of your website without fear :D IMO it is better to have your non-secure pages indexed; my website did much better in Google with non-secure pages, and this will avoid duplicate content penalization.
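
A quick way to check the rewrite, assuming curl is available and example.com stands in for your domain, is to request robots.txt over both protocols; the secure request should return the disallow-all rules from robots_ssl.txt while the plain request returns your normal robots.txt:

# Should return the contents of robots_ssl.txt (Disallow: /)
curl https://example.com/robots.txt

# Should return your normal robots.txt
curl http://example.com/robots.txt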
Posted

Make sure you have your board URL set to use https in your conf_global.php file, and also make sure you edit the above, replacing example.com with your own site domain.

I have conf_global.php set up with https, and I've replaced example.com with my domain. I've copied my .htaccess file contents below; maybe I'm doing something incorrectly. In the .htaccess file I've added the code from the Friendly URL features in the Search Engine Optimization section of the ACP. Inside the IfModule I've also added the new code for the redirect to SSL. When I use the .htaccess file as shown below, I get a server error.

Any help would be much appreciated!

<IfModule mod_rewrite.c>
Options -MultiViews
RewriteEngine On
RewriteBase /
RewriteCond %{REQUEST_FILENAME} !-f
RewriteRule .(jpeg|jpg|gif|png)$ /public/404.php [NC,L]


RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
RewriteRule . /index.php [L]


RewriteCond %{HTTP_HOST} ^cryptocointalk.com$ [NC]
RewriteRule ^(.*)$ https://cryptocointalk.com/$1 [R=301,L]
</IfModule>
Posted

OK, I figured out how to get past the duplicate content issue while serving both unsecured and secured versions of the website, without using any subdomains (after much searching on Google and testing other web stores' robots.txt/robots_ssl.txt files on both versions).

1. Create a txt file called robots_ssl.txt

2. Paste this into the file:

User-agent: Googlebot
Disallow: /

User-agent: *
Disallow: /

3. Upload robots_ssl.txt to your IPB root directory

4. For Apache servers, add this rewrite to .htaccess:

<IfModule mod_rewrite.c>
RewriteEngine On
Options +FollowSymLinks
RewriteCond %{SERVER_PORT} ^443$
RewriteRule ^robots\.txt$ robots_ssl.txt
</IfModule>
Doing this will internally rewrite any request for robots.txt that arrives over SSL to robots_ssl.txt (but not on the unsecured version, where your regular robots.txt will still be served). Bam, you can now serve both secure and unsecured versions of your website without fear :D IMO it is better to have your non-secure pages indexed; my website did much better in Google with non-secure pages, and this will avoid duplicate content penalization.

When using the Friendly URL features' .htaccess code, would I add this code to the existing rewrite rules or put it in an additional IfModule? Thanks!

Posted

Be aware that the code I pasted is not for a board running strictly HTTPS. What I posted will prevent robots from indexing the https versions of pages, so if you are trying to run https full time, don't do that. What I posted is for boards that are running both http AND https versions and do not want duplicate content issues.

As for the redirect loop issue when trying to redirect http traffic to https with a 301, it was happening to me as well, and I had everything set up properly, so I just took the 301 redirect out because I couldn't figure out a way to make it work. It started happening when I upgraded to 3.4.5.
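
One possible cause, offered only as a guess: if SSL is terminated by a proxy or load balancer in front of Apache, the HTTPS variable and port never reflect the secure request, so a redirect based on them fires even for HTTPS visitors and loops. A minimal sketch that keys off the forwarded protocol instead, assuming the proxy sets X-Forwarded-Proto and example.com stands in for the real domain:

RewriteEngine On
# Only redirect when the proxy reports the original request was plain HTTP
RewriteCond %{HTTP:X-Forwarded-Proto} =http
RewriteRule ^(.*)$ https://example.com/$1 [R=301,L]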


Posted

Be aware that the code I pasted is not for a board running strictly HTTPS. What I posted will prevent robots from indexing the https versions of pages, so if you are trying to run https full time, don't do that. What I posted is for boards that are running both http AND https versions and do not want duplicate content issues.

As for the redirect loop issue when trying to redirect http traffic to https with a 301, it was happening to me as well, and I had everything set up properly, so I just took the 301 redirect out because I couldn't figure out a way to make it work. It started happening when I upgraded to 3.4.5.


Thanks for the info. Yeah, I'm running all https. I was running mixed http and https when one of my users stated that he was able to use sslstrip to hijack a session. I had everything set up correctly, with the login-only HTTPS option enabled in the ACP, but he posted that he had tested it against himself and was able to do it. Once I enabled full https, he was unable to hijack the session. Maybe I set something up incorrectly, but I've been setting up websites for a long time, and setting up IPB is pretty straightforward; there's not a whole lot I can do wrong.

Question: How would the removal of the 301 look? Using what is below still gives me the error.

RewriteCond %{HTTP_HOST} ^cryptocointalk.com$ [NC]
RewriteRule ^(.*)$ https://cryptocointalk.com/$1 [NC,L]
Posted

I don't know much about session hijacking, but there are settings to prevent that in IPB; not sure if you had them enabled or not.


As for the redirect loop: when I said 301, I just meant the whole redirect. It wouldn't work for me either; I had to remove it. However, if you have the board_url in the conf_global.php file set to https://www.yourdomain.com, it should force https without using a redirect in .htaccess. It does for me, at least. Not sure why it wouldn't automatically redirect all traffic for you.

Archived

This topic is now archived and is closed to further replies.
