
Numbered

Members

  • Posts: 310
  • Joined
  • Last visited
  • Days Won: 1

Reputation Activity

  1. Like
    Numbered got a reaction from SeNioR- in Large community? You have a problems with sitemap!   
Found one more sitemap problem.
The <lastmod> tag reports the generation time of the current sitemap file. That is technically true, but consider what the standard intends <lastmod> to mean.
So, coming back to our case: we now have 5271 sitemap files, and Google is expected to fetch all of them. It is told 'this file was modified - take it!' regardless of whether the content inside actually changed. Moreover, the sub-sitemaps holding the URLs contain no <lastmod> tags at all, so Google follows a stale link to a sub-sitemap file, fetches it, and sees just a list of URLs without any additional metadata.

My proposal:
Add a <lastmod> tag to every URL inside all sub-sitemaps. It will tell Google which URLs have new content and should be scanned, and which are unchanged and need no re-scan => this will optimize crawl performance.
Add a <lastmod> tag to the index sitemap file that does not report the file's generation date - it should carry the newest last-modified date among the URLs inside that file. With that, Google never downloads a sitemap of 500 URLs in which nothing has changed => this will optimize crawl performance.
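For illustration only, a sketch of what the proposal could look like in the index file (the file name and dates here are invented; the point is that <lastmod> mirrors the newest URL inside the sub-sitemap, not the moment the file was generated):

<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap>
    <loc>https://example.com/sitemap.php?file=sitemap_content_forums_Topic_12</loc>
    <!-- newest <lastmod> of any URL in that file, not the generation time -->
    <lastmod>2017-11-01T14:13:49+00:00</lastmod>
  </sitemap>
</sitemapindex>

...and inside each sub-sitemap, every URL entry would carry its own date:

<url>
  <loc>https://example.com/topic/123456-example-topic/</loc>
  <lastmod>2017-10-28T09:30:00+00:00</lastmod>
</url>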
P.S. I'll try to create a patch. If I manage it, I'll share it here (for other devs to check, and to help IPS).
Thanks for your attention and support )
  2. Thanks
    Numbered got a reaction from SeNioR- in Large community? You have a problems with sitemap!   
    Support answered 
  3. Like
    Numbered got a reaction from sadams101 in Large community? You have a problems with sitemap!   
  4. Like
    Numbered got a reaction from crmarks in Large community? You have a problems with sitemap!   
  5. Like
    Numbered got a reaction from supernal in Large community? You have a problems with sitemap!   
The IPS sitemap generator refreshes sitemaps from a dedicated database table: core_sitemap.
The primary sitemap source for search engines is the URL https://example.com/sitemap.php, which is an index of the sub-sitemap files. You can see the list of those files by following that link.
Each of those files contains no more than 1000 URLs to specific pages (profile statuses, topics (without page or comment numbers) and other elements whose sitemaps are supported as core extensions).
One of our cases is a forum with more than 100k topics, more than 4.2 million posts and more than 6 million users. Simple math gives us 5214 sitemap files (you can count them yourself with this command):

select count(*) from core_sitemap; -- 5214

The sitemap generator task runs by default once per 15 minutes and updates only the single oldest file in that long list. With simple math we can try to answer the question 'how long does a full update take?' (users post not only in the newest topics but also in old ones... and a newly created topic is added to a sitemap file only once ALL older files are newer than the file that should contain the new topic). So, how long does a full update take?

5214 * 15 = 78210 minutes = 1303 hours ≈ 54 days! 54! days! A search engine will pick up your newest content 54 days after it is posted. Incredible. Don't believe it? Want to know the lag for your own community? You can find your lag time with this SQL:

select FROM_UNIXTIME(updated,'%a %b %d %H:%i:%s UTC %Y')
from core_sitemap
order by updated asc
limit 1;
-- Wed Nov 01 14:13:49 UTC 2017

Yep... In our case the oldest file was last updated on 1 November.
What can we do to fix it? The very fast solution: create a temporary file, e.g. 'mycustomsitemapupdater.php', with this content:

<?php
require 'init.php';

$generator = new \IPS\Sitemap;
$generator->buildNextSitemap();

$last = \IPS\Db::i()->select('FROM_UNIXTIME(updated, "%a %b %d %H:%i:%s UTC %Y")', 'core_sitemap', null, 'updated asc', 1)->first();
print_r('Oldest time now: ' . $last . PHP_EOL);

Run it via the web or the CLI as many times as you need (until the oldest time is no longer so old).
The longer-term solution: add this script to cron and run it every minute, or better, change the 'sitemap generator' task interval from 15 minutes to one minute (this may still not solve your particular problem; if you need even faster updates, tune it sensibly).
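If you go the cron route, a minimal sketch of the crontab entry (the installation path is an assumption here - adjust it to wherever init.php and the script actually live):

# run the sitemap updater once per minute (hypothetical path)
* * * * * php /var/www/forum/mycustomsitemapupdater.php >/dev/null 2>&1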
The better solution: wait for IPS to improve this system.
Thanks for your attention!
P.S. If my text comes across as negative, that's not the intent. I love IPS; I just want to draw attention to this problem and help others with their large communities. 
  6. Like
    Numbered got a reaction from DSystem in Large community? You have a problems with sitemap!   
  7. Like
    Numbered got a reaction from SeNioR- in Large community? You have a problems with sitemap!   
  8. Like
    Numbered got a reaction from BomAle in Large community? You have a problems with sitemap!   
  9. Like
    Numbered got a reaction from Silnei L Andrade in Large community? You have a problems with sitemap!   
  10. Like
    Numbered got a reaction from Fast Lane! in Large community? You have a problems with sitemap!   
  11. Like
    Numbered got a reaction from David.. in Large community? You have a problems with sitemap!   
A little improvement: even at one run per minute, 5214 files still take more than 3 days to refresh, so you can speed this up further. First measure how long one run takes:

time php mycustomsitemapupdater.php
# returns something like 4 sec

With that number you can add a loop inside the script that calls $generator->buildNextSitemap() X times per run; see the sketch below. In my case 10 iterations fit into one minute, so all 5214 files need 521 minutes for a full update (≈ 8 hours - not bad). 
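A minimal sketch of that batched variant, assuming the same init.php bootstrap as the script above (the $iterations value is a made-up knob, not a recommendation):

<?php
// mycustomsitemapupdater.php - batched variant (sketch)
require 'init.php';

$iterations = 10; // tune so one full run fits inside your cron interval

$generator = new \IPS\Sitemap;
for ($i = 0; $i < $iterations; $i++)
{
    // each call rebuilds the single oldest sitemap file
    $generator->buildNextSitemap();
}

// report the new oldest timestamp, as in the original script
$last = \IPS\Db::i()->select('FROM_UNIXTIME(updated, "%a %b %d %H:%i:%s UTC %Y")', 'core_sitemap', null, 'updated asc', 1)->first();
print_r('Oldest time now: ' . $last . PHP_EOL);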
  12. Like
    Numbered got a reaction from mark007 in Large community? You have a problems with sitemap!   
  13. Like
    Numbered got a reaction from Ilya Hoilik in Large community? You have a problems with sitemap!   
  14. Like
    Numbered got a reaction from DSystem in Large community? You have a problems with sitemap!   
  15. Like
    Numbered got a reaction from Cyboman in Large community? You have a problems with sitemap!   
  16. Like
    Numbered got a reaction from media in Large community? You have a problems with sitemap!   
  17. Like
    Numbered reacted to forumdev99 in IPS 4.1 : ADVERTISEMENT inside the first post   
This is working well - thanks for all the help @Upgradeovec 
I ended up struggling to get the ad block to float to the right of the post content like I wanted, so that it isn't above or below the post text (unless on a small screen).
float:right; didn't seem to work like it should, so I ended up with the code below, which is the closest I could get to what I want.
Please share if anyone has a better way of doing this.
My code:
<!-- adsense ad code in first post start -->
{{if (($comment->position - 1) % \IPS\Settings::i()->forums_posts_per_page === 0)}}
  <div style="display:inline-block; float:right; max-width:336px;">
    <ins class="adsbygoogle" style="display:inline-block;width:336px;height:280px"
         data-ad-client="ca-pub-00000000000" data-ad-slot="00000000000"></ins>
    <script>
      (adsbygoogle = window.adsbygoogle || []).push({});
    </script>
  </div>
  <div style="display:inline-block; max-width:50%;">
{{endif}}
<!-- this is the post content -->
{$comment->content()|raw}
{{if $comment->editLine()}}
  {$comment->editLine()|raw}
{{endif}}
{{if (($comment->position - 1) % \IPS\Settings::i()->forums_posts_per_page === 0)}}
  </div>
{{endif}}
<!-- adsense ad code in first post end -->
    Thanks
     
  18. Sad
    Numbered got a reaction from Sonya* in ¿Tienen sección en español?   
Clubs could be a good choice for creating language-specific communities here ) But I think it will never be implemented, because it would add a huge amount of moderation work.
  19. Like
    Numbered reacted to Lindy in PAGES - Confused by IPB & the staff   
Thank you for taking the time to share your feedback. Pre-made templates are something we'd like to do. We'd also like to release, as examples to use, some of the work we've done here ourselves. 
You may find the improvements being made to Pages in a soon-ish release to be of more help to you. Finally, we have a number of tutorials, finished and on the way, that are being reviewed internally. Hopefully all of these combined will maximize the benefit of the product to you. 
    Thank you again.
  20. Like
    Numbered reacted to Cata in PAGES - Confused by IPB & the staff   
    That will be awesome.
    +1
  21. Like
    Numbered reacted to xert77 in PAGES - Confused by IPB & the staff   
    Hi,
Before I complain, I want to start off on a nice note. Since joining IPB my users have loved the website and commented on how nicely it works compared to VB, and I have loved your support. Whenever I have a bug or problem you guys usually reply pretty fast, sometimes during the night!
    But here is where I am confused, angry & very, seriously sad.
    I love IPB. I think it is one of the most solid products in the market, I think it is also the most professional both company wise (how you guys release things when it's ready & not before) and also product wise, it's very professional.
For over a year now, I have STRUGGLED to get to grips with Pages. I look at the documentation & it's longer than the bible & not very user-friendly at all!
For over a year now I have messaged support asking how to do things in Pages & they say I have to come to the forums as it requires custom code! For simple things like adding the article image to the listing page!
For over a year now I have tried Pages, given up... tried again, given up. There is so much potential in Pages - I have seen it on professional websites (websites with coders & budget!) but people like me CAN'T use it.
     
Why.... WHY can't IPB make Pages easier? Why can't there be a wizard or pre-made templates!
Front page templates... Do you wanna display articles like a blog (photo to the side with a description beside it), like a polaroid (image with description below), or like a grid: 1 main article & then smaller articles below, etc. Joomla asks you things like... display title, date, time, author; main articles: 0-10, secondary articles: 0-10. It's so easy!
Listing pages... Do you want to display the first image in the article, do you want to display the article rating, do you want to display the article sorting bar.
Article pages... Have a page builder, similar to what you do with blocks, where you can create fields on a preview page & move them around/edit them. Currently, if I want an image uploaded in an article and displayed to the right of the content, I have to custom-format it, look up HTML guides on Google, etc.
     
I just wish IPB could make Pages & the front page more like Joomla. The Joomla front page is so easy. Then also keep the difficult custom options for people who are more pro & who know what they are doing.
I am so annoyed by the fact that I can't easily create a beautiful-looking front page & article area that I am considering going back to other software. Your forum software is so easy to use, but Pages is rocket science.
     
Support won't help me, I have seen users here struggle to create things even when working together and helping each other, and the only other option is to pay someone tons of money... Please consider making Pages easier for noobs. PLEASE
    Danny
  22. Like
    Numbered got a reaction from Mack_au in IPS 4.1 : ADVERTISEMENT inside the first post   
    My pleasure 
  23. Like
    Numbered got a reaction from media in IPS 4.1 : ADVERTISEMENT inside the first post   
    My pleasure 
  24. Like
    Numbered reacted to media in IPS 4.1 : ADVERTISEMENT inside the first post   
OMG, you've become my hero, man....
    Thank you so much for the insight....
    Thank you thank you thank you.....
  25. Like
    Numbered got a reaction from media in IPS 4.1 : ADVERTISEMENT inside the first post   
Glad to help 
As I see below, it all worked well, didn't it?
I don't know the Google ads platform well, but I don't think it needs the current post number (or does it?).
Anyway, this code just checks your first condition: show something (the ad) when the current post is the first post on the page. You can put anything inside it without showing the current post number. Or you can call {$comment->position} anywhere inside the post template to output that number (into HTML or a JS call, it doesn't matter).
I may not have understood you fully (sorry for my bad English).
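For instance, a hypothetical one-liner in the post template (just a sketch; the variable name postNumber is made up):

<!-- expose the post's position on the page to JavaScript -->
<script>var postNumber = {$comment->position};</script>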