Everything posted by Numbered

  1. You can place it anywhere, including outside the web server's root folder. In fact, putting this file outside the web root is better practice (as with all tools of this kind). For it to work you just need to write the correct path to init.php. Then you can run it manually from the CLI, from cron, or by any other means.
  2. The result of the first patch is mostly visible already. Google Search Console now shows 4 Jan for my oldest sitemap file (actually all of them were updated today, but Google fetches them more often). The second patch, with <lastmod>, should improve the ordering (last column). I can't see any improvement in the statistics yet; this kind of processing takes Google a lot of time and resources (and every update adds to it), so it needs more time. Here is the graph of downloaded size per day: I see the average going up, and that's good. Numbers are in KiB (as the legend says). I'll post here if I get more convincing proof and results. Thanks for your interest :)
  3. It's my code, which needs to be inserted after the $data variable. No, neither of them affects the other. You can run

     select FROM_UNIXTIME(updated,'%a %b %d %H:%i:%s UTC %Y') from core_sitemap order by updated asc limit 1;

     and see the lag between the last updated file and the current date. If they are similar, the script has already caught up. Before my script was running, this command showed the lag without the additional runs. As I described at the top, on my server this lag grew to more than a month.
  4. That may be the source of your problem. Check whether it applies to your situation and, if it does, apply a temporary fix.
  5. You can easily create a plugin which hooks the apiOutput method of the \IPS\Member class. Just get the array result from the parent first (because it does a lot of permission checks), then in your hook fetch any additional data, put it into that array and return the whole thing; it will appear in the result.
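     A rough, untested sketch of what that hook could look like, assuming the standard IPS4 plugin hook scaffolding (the class name and the 'extraField' key are made up, and the apiOutput signature may differ between versions):

     ```php
     class hook_memberApiOutput extends _HOOK_CLASS_
     {
         public function apiOutput()
         {
             /* Let the parent build the response first; it performs the permission checks */
             $result = parent::apiOutput();

             /* 'extraField' is a hypothetical example; fetch whatever extra data you need here */
             $result['extraField'] = 'example value';

             return $result;
         }
     }
     ```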
  6. Very, very good improvement! Thanks for that. Any chance of seeing an OpenID library done the same way (or at least an abstract class)?
  7. We didn't set storage limits because file sharing is an important social feature. Of course, it depends on your resources. To handle the storage size problem we started using Ceph; it's similar to Amazon S3, and IPS works well with Ceph (with small fixes to the signature checks). About your primary question: there is no good solution for that. You can do it in two ways: 1) your way, editing every post and removing the attachment; 2) doing it from the /attachments/ area (the member group needs permission to delete attachments there). The first way is impractical. The second way can break attachments inside posts.
  8. We have limits on the number of URLs and on image dimensions. For example, here I can put 5 URLs in a signature, so if I put 5 links to YouTube they display as an extremely large area. Moreover, the limits don't even work ^) (I pasted the video 10 times and it all rendered). And one more bug: after that demonstration I removed all content from the signature area. After saving, the editor showed me the auto-saved content again. I cleared the editor with backspace and saved, but it was restored. I clicked 'Clear editor', but it was never cleared; it keeps restoring the old content. Sometimes users report the same problem with very old content being restored (after several new posts have already been made in the same topic). Just try to put a lot of YouTube links in a signature and you'll see the problem :) Sorry for my English and the occasionally poor explanations.
  9. Any chance of changing the release model from these big updates to more frequent ones with fewer features each? I can already see that I will have to take 4.3 and adapt our apps and plugins completely (some of them rebuilt from scratch, as I did when updating from 4.1.6 to 4.2). With smaller updates I could easily see the changes and adapt the current apps/plugins to support each new little feature (such as the member log, switching OAuth from our implementation to your internal one, and many other things). With your big updates we can't keep track of all that. It's easier to rebuild an app/plugin from scratch than to check it and keep wondering 'is this OK? and this? and this?'
  10. My way of communicating with IPS: write a detailed topic on the forums first, discuss it with other devs (maybe I'm wrong about something?), and after some time follow up via the ticket system. The ticket is now created (id 996743, for IPS support). No issues with that. 8k topics will create 8 sub-sitemap files, and they will be linked from the index sitemap file. And there is no issue with 5000+ links to sub-sitemaps from the index file; Google read it successfully. Confirmed. The sitemap only updates if you move a post from one topic to another. The sitemap contains 'item' elements, where an 'item' is a topic, a profile status, a calendar category, and so on. Sub-items like posts, calendar events and profile status replies are just content grouped inside those 'items'. Google doesn't build anything like a 'tree'; that's just a convenient, human-understandable way to picture how it works. In fact Google stores URLs from the given domain together with their content; with a sitemap we just help it do that more intelligently. And not only with sitemaps, but with URL parameter definitions too. For example, I declared that the 'page' URL query parameter means page navigation, so Google knows it should scan every page with ?page=2, ?page=3 and so on, because each contains different content, while comment= just links to a specific location without changing the content, etc. Right now IPS updates only 1 sitemap file every 15 minutes, not a full refresh. Yep, that's not good, and I agree with the proposal for the ability to run a full update manually.
  11. Did it (before/after screenshots). No issues detected by several online sitemap checking tools. I did it very crudely, just to try it out and check; you can improve it yourself (and please share the result with us). In /applications/core/extensions/core/Sitemap/Content.php, at line 209, add this after the $data line:

      if (get_class($node) === 'IPS\forums\Forum' && isset($node->last_post)) { $data['lastmod'] = $node->last_post; }

      and at line 259 (line 262 after adding the previous one), add this after the $data line:

      if (get_class($item) === 'IPS\forums\Topic' && isset($item->last_post)) { $data['lastmod'] = $item->last_post; }

      After that, the sitemap script has to regenerate all sub-sitemaps to write the new data to the database. I haven't yet made the index sitemap report the correct lastmod, which should be derived from the newest date inside each sub-sitemap. Thanks.
  12. Found one more sitemap problem. The <lastmod> tag reports the generation time of the current sitemap file. That seems right, but... what does the standard actually say? Coming back to our case: we now have 5271 sitemap files, and Google is told to fetch all of them! It gets the signal 'this was modified, take it', regardless of whether the content inside actually changed. Moreover, inside the sub-sitemaps the URLs have no <lastmod> tags at all. So Google fetches a very old sub-sitemap file and sees just a list of URLs without any additional meta information. My proposal: add a <lastmod> tag to every URL inside all sub-sitemaps. It will tell Google which URLs contain new elements and should be scanned, and which ones haven't changed and don't need a re-scan => it will optimize scan performance. Also, the <lastmod> in the index sitemap file should never report the file's generation date; it should carry the newest last-modified date of any URL inside that file. With that, Google never downloads a sitemap of 500 URLs in which nothing changed => again optimizing scan performance. P.S. I'll try to create a patch. If I do, I'll share it here (for other devs to check, and to help IPS). Thanks for your attention and support :)
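      For reference, this is roughly what an index entry with the proposed <lastmod> would look like under the sitemaps.org protocol (the URL and date below are made-up examples):

      ```xml
      <?xml version="1.0" encoding="UTF-8"?>
      <sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
        <sitemap>
          <loc>https://example.com/sitemap_content_forums_Topic_1.xml</loc>
          <!-- the newest last-modified date of any URL inside this sub-sitemap,
               not the time the file itself was generated -->
          <lastmod>2018-01-04T12:00:00+00:00</lastmod>
        </sitemap>
      </sitemapindex>
      ```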
  13. A little improvement (5214 elements would otherwise take more than 3 days to update), so you can speed this up further. Just measure the time needed for one run:

      time php mycustomsitemapupdater.php // returns something like 4 sec

      With that, you can create a loop inside the script that runs $generator->buildNextSitemap(); X times. For example, in my case, 10 times per minute; so for 5214 elements a full update needs about 522 minutes (~8.7 hours, not bad).
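      The back-of-the-envelope estimate above can be sketched like this (the numbers are the hypothetical ones from this thread, not universal values):

      ```php
      <?php
      // Estimate a full sitemap rebuild time: $files files total, $batch files
      // regenerated per one-minute cron invocation.
      function fullRebuildMinutes(int $files, int $batch): int
      {
          return (int) ceil($files / $batch);
      }

      // 5214 files, 10 rebuilds per minute => about 522 minutes (~8.7 hours)
      echo fullRebuildMinutes(5214, 10) . " minutes\n";
      ```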
  14. The IPS sitemap generator uses a special database table, core_sitemap, as its refresh source. The search engine's primary entry point is https://example.com/sitemap.php, which lists the sub-sitemap files; you can see that list by following the link. Each of those files contains no more than 1000 URLs to specific pages (profile statuses, topics (without page numbers or comment anchors) and other elements supported by the sitemap core extension). One of our cases is a forum with more than 100k topics, more than 4.2 million posts and more than 6 million users. So with simple math we have 5214 sitemap files (you can count them yourself):

      select count(*) from core_sitemap; // 5214

      The sitemap generator task runs by default once per 15 minutes and updates only the single oldest element of that big list. With simple math we can answer the question 'how long does a full refresh take?' (users can post not only in the newest topics but also in old ones... and a newly created topic is only added to a sitemap file once ALL older files are newer than the file that should contain it). So, how much time do we need? 5214 * 15 = 78210 minutes = 1303 hours = 54 days! 54 days! A search engine may pick up your newest content 54 days after it was posted. Incredible. Don't believe it? Want to know the lag for your community? You can find it with this SQL:

      select FROM_UNIXTIME(updated,'%a %b %d %H:%i:%s UTC %Y') from core_sitemap order by updated asc limit 1; // Wed Nov 01 14:13:49 UTC 2017

      Yep... in our case the oldest file was last updated on 1 November. What can we do about it? A very quick solution: create a temporary file, e.g. 'mycustomsitemapupdater.php', with this content:

      <?php
      require 'init.php';
      $generator = new \IPS\Sitemap;
      $generator->buildNextSitemap();
      $last = \IPS\Db::i()->select('FROM_UNIXTIME(updated, "%a %b %d %H:%i:%s UTC %Y")', 'core_sitemap', null, 'updated asc', 1)->first();
      print_r('Oldest time now: ' . $last . PHP_EOL);

      Then run it via the web or the CLI as many times as you need (until the oldest time is no longer so old). A longer-term solution: add this script to cron and run it every minute, or, better, change the 'sitemap generator' task interval from 15 minutes to one minute (though that may not solve your particular problem; if you need faster updates, be smart about it). The best solution: wait for IPS to fix this system. Thanks for your attention! P.S. If my text reads as negative, that's wrong; I love IPS and just want to draw attention to this problem and help others with large communities.
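      If you go the cron route, the entry could look like this (the script path is a placeholder; use wherever you put the file):

      ```shell
      # run the sitemap updater once per minute
      * * * * * php /path/to/mycustomsitemapupdater.php >/dev/null 2>&1
      ```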
  15. Let me explain a situation: BadUser makes some bad posts. Moderator1 sees an extremely bad post and gives a warning with a 30-day posting restriction. Next, Moderator2 sees another, older bad post and gives a second warning with a lesser posting restriction: 1 hour. What do we see after that? We see the 1-hour posting restriction, and the 30 days is gone. In the restriction list we see the first warning, the date it was given, and the time left until deactivation: 29 days. But in the profile we don't see any restrictions, because it was overwritten by the new one. I don't think this logic is right. Any new posting restriction should check the current restriction and, if the current one is longer than the new one, NOT redefine it. Or (which would also be fine) add the new time to the current time. Another case: give a warning with just 1 point, which will expire the next day, and no restrictions. The same day, give a second warning with 0 (or any) points and a 30-day posting restriction. The next day the 1 point goes inactive and... removes the posting restriction from the second warning! Extremely strange. In both cases no automatic actions/sanctions are triggered by points (nothing). Checked on a clean IPS test installation in versions 4.2.6 and 4.2.7 beta 3; same behaviour on all.
  16. Could IPS add some resolution statuses for topics in this area, such as 'planned', 'under discussion', 'will never be implemented', 'implemented', 'fixed' or similar? My proposal sounds like a bug tracker, but look: there are 800+ pages of feedback and ideas here. Currently this section feels like 'please send your ideas to no-reply@example.com'. I think I and others want to get some kind of answer to our feedback and ideas. Some want to know whether to wait for a realization in IPS; others need to solve their own situations. These status changes would provide that. Many of the ideas are interesting; some are for very special cases (which will never be implemented in the 'base'); just answer us :) We are not toxic, we love your product, we recommend it to colleagues and business partners. Please be more communicative :) A very good way to build good communication is to study the experience of online game communities; there is a lot of good and bad experience there, and the good ones deliver a very polished product with a lot of passion. Take this topic as an example: right now I don't know what I should do to resolve my situation. Ask the CMs to 'recommend only their own posts'? Build some blocking mechanism myself? Or wait for some future version of IPS and not use the feature until then? I have no answer.
  17. Reactions are a very positive feature! Thanks for them. Here are our reactions with our emotions (all grant zero reputation, except plus and minus).
  18. Recommended messages are a very cool feature; very nice and useful to have. But in some situations they can produce a negative result (for various reasons, such as a toxic user). Say someone posts a good message in a topic, our moderator marks it as 'recommended', and the question topic is closed. The author of the recommended message can then edit his post into something negative, with additional negative content. It would be very nice if the recommended message didn't change its content when the source post is edited. I don't know the right way to implement this (maybe use the post history and take the post content from there if it exists?), but I think you can do it better than me. Thanks for your attention.
  19. Hi friends! I'm now building an application which should work on IPS 4.1.19. The app will be deployed before production is upgraded to 4.2, and I want it to work on both versions. All the code already works well on both, but I need to log member info (IP address and user agent). In 4.2 it must come from Devices, which don't exist in 4.1. So my question is: how can I read the current version from code, to choose the right logic for each version? Something like \IPS\Settings::i()->current_ips_version... or anything; I couldn't find a good source for it myself. upd. Found it: \IPS\Application::load('core')->version. Thanks, me :)
  20. Have you already tried temporarily disabling third-party apps and plugins and re-checking?
  21. In 4.1, users reported problems searching for members whose nicknames start with underscores (any number of them). To test the search I registered a new user, @___Upgradeovec___, here, and the member search returned this (and mentions showed the same). It's not so bad, because the older search found nothing at all. But are those results correct? And one more question: will we have to keep waiting for the ability to search archived topics? That point is the one blocker stopping us from using it. Thanks! upd.: and one more question: is there any chance of a 1-character minimum for search (or no minimum at all)? Our Asian users write hieroglyphs, each of which is parsed as a single character, so they can't search for what they actually want to find.
  22. Thanks for your work, IPS! You've created a very powerful, smart and modern system! Our spam-blocking experience taught us a simple rule: the payoff from spam must be cheaper than the work needed to bypass the defences and start spamming. We have a separate user database and registration, outside the framework, so we can't use most of its systems. But our solution is simple too: users can only start posting in the newbie support and offtopic forums after playing more than 10 battles in the game. The more popular and serious forum categories open after 100+ battles. This simple rule makes it very expensive for spammers to post spam that will be deleted within minutes anyway (we have moderators working on a 24/7 schedule). So I hope some framework systems (such as promotion) will become easier to extend with custom logic. Custom logic with simple development is a very strong way to clean such platforms of bots entirely, and of bad actors too (it's not so easy to register many accounts across different systems, and many services offering human captcha solving are left out of the boat). And thanks again!