
jair101

Clients

  • Posts: 1,228
  • Joined
  • Last visited
  • Days Won: 1

jair101 last won the day on August 26 2018

jair101 had the most liked content!

2 Followers

Recent Profile Visitors: 10,563 profile views
  1. Radical Tags is also abandoned though...
  2. Three questions on my mind:
     - Will this be integrated with the achievement system? Currently you can award badges and points based on the location of the item; since the new tags aim to replace the current taxonomy, will it be possible to give points and badges based on which tag is used in the content item?
     - Related to the above, are the tags available to everyone, or is it possible to restrict certain tags to staff? Maybe it was mentioned, but I didn't see it specifically.
     - I also want to echo the option to mass-tag content. I would gladly get rid of some subforums and tag all the topics inside instead. I guess it could be a good project for a 3rd party dev, but even better if something like this can be supported natively.
  3. Inline notifications, not emails; online status does not matter - there is no notification even after the user returns after a while. I do have an example from our largest, most active topic: the user says she is subscribed to it, but no inline notifications arrive. Shall I share the specifics, together with ACP access, if that would help?
  4. I have recently upgraded from 4.6 to 4.17, and some of my members are reporting that they no longer receive notifications for topics and users they follow. Unfortunately, the problem is not easily reproducible: for some topics it works, for others it doesn't; some users do not appear to have the issue, while others are more affected. They mentioned it was sporadic even before the upgrade, but now it is much more noticeable. I know this is pretty vague, but any idea what might be wrong - is it server side or client side?
  5. It was the PHP version in my case. I downgraded from 8.3 and everything works.
  6. @Mister Java, did you manage to resolve this? I think I am hitting a similar issue, but with a 3rd party app. I believe it is caused by my server environment, but I am unable to identify exactly which component is responsible. A bit more detail:
     - The app is a video storage app that can pull screenshots for the videos either through the YouTube API or through upload.
     - Everything works fine with the upload method; where it fails is the API method.
     - The app passes the correct screenshot URL, which is fairly simple: https://img.youtube.com/vi/YEXr3c_gKk0/hqdefault.jpg
     - When I dump the response from the app, I see that the first few bytes of the file look like: ????��JFIF����������??�?������ �� ������(�����1#%�(:3=<9387@H\N@DWE78PmQW_bghg>Mqypdx\eg
     This leads me to believe that everything is fine with obtaining the file; where it fails is reading it. In particular, the first 4 bytes are needed to identify the file type: they are ???? in the file downloaded by the app and yOya (simplified) when I manually open the file on my desktop with a text editor. The latter is also the signature of a JPG file, as explained here: https://en.wikipedia.org/wiki/List_of_file_signatures
     So right now I am stuck at identifying which server component actually opens and reads this file; I think that is where the source of the problem is. Switching between GD and ImageMagick does not make a difference; moreover, when I check the code in Image.php, it seems the file is passed to GD/ImageMagick after the file type is determined, i.e. after it has already been broken. I am using PHP 8.3.7. The issue occurs only with this particular app and only when uploading from a URL; all uploads, both URL and regular, work in the core, and regular upload also works in the app. I know I am breaking many of the "sorry, no support" rules here with the unsupported PHP version and the fact that it happens only in a 3rd party app.
I am not asking for someone to resolve this for me; I am asking for directions so I can dig on my own. In a nutshell: which server component is responsible for opening and reading the downloaded file? Any tips will be appreciated.
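For context, a minimal Python sketch of the kind of "magic bytes" check being described (this mirrors the general technique, not IPS's actual Image.php code): an image is identified by comparing its first bytes against a known file signature, so a download corrupted at the start fails type detection even though the rest of the data may be intact.

```python
# Sketch of file-type detection via magic bytes. The constants and the
# broken-download example are assumptions for illustration only.

JPEG_SIGNATURE = b"\xff\xd8\xff"  # every JPEG file begins with FF D8 FF

def looks_like_jpeg(data: bytes) -> bool:
    """Return True if the first bytes match the JPEG signature."""
    return data[:3] == JPEG_SIGNATURE

# A healthy hqdefault.jpg starts with the signature followed by "JFIF";
# the broken download described above starts with garbage bytes instead.
good = bytes.fromhex("ffd8ffe000104a464946")  # FF D8 FF E0 ... "JFIF"
bad = b"????" + b"\x00" * 6                   # what the app received

print(looks_like_jpeg(good))  # True
print(looks_like_jpeg(bad))   # False
```

If a check like this fails, the corruption happened upstream of image processing, i.e. in whatever layer fetched or buffered the URL response, which matches the symptom that GD vs. ImageMagick makes no difference.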
  7. I am setting up a new server and wanted to start from scratch, which is different from my previous test environment (I haven't used it in a while). In hindsight, it shouldn't have been an issue to keep the old test URL, but now I already have the Let's Encrypt certificates, web server, etc. configured, all with the new URL. I guess I can redo that if it is too much effort for you; I thought the strict rule was only for the main website URL, which won't change.
  8. @Marc Stridgen, may I tag along here with the same request? My licensed URL is community.com, my current test url is devtest.community.com. I need to change it to devtest2.community.com and I am unable to do so. Appreciate the help.
  9. Fair enough, that is the case today. However, I already see that v5 is introducing major changes to development; I vaguely remember that all plugins will need to be made as apps. I've also been burned with custom plugins and apps before. Furthermore, I will need my database on a regular basis regardless, simply to have a copy outside of the IPS environment - as much as I trust their backups, it is good practice to keep a recent copy in an independent place. Once a month is, I hope, reasonable. Anyway, thanks for the feedback all; my main question was whether this could be done with Pages or the API. I guess it won't be easy, so I'll think about it a bit more.
  10. No, I meant that I can execute my queries on the database copy that you provide; I won't need you to import it back. Once I have it, I can simply hardcode the values in the Pages block that visualizes the chart. Now, if providing me a copy of the database once a month will also be expensive, I would have to reconsider my entire strategy from scratch.
  11. Thanks. This is my last resort and possibly unacceptable. Doing this on my own will be a bit of a challenge, and it depends on a 3rd party as well. If there is no creative solution, I would probably request my database monthly, run the queries offline, and update the data manually. Or maybe stick with self-hosting for the foreseeable future.
  12. It is about time that we migrate our community to the cloud. I think it will mostly be fine, but there is one key piece of functionality that we currently implement by maintaining an auxiliary database table and updating it through a daily event. This is all done directly in the database, and once we move to the cloud I won't have that access... So I am wondering what the best way to do this is.
OK, a bit more detail. We are a travel-oriented community, and I have created 200+ Yes/No profile fields that users can toggle for each country they have visited. It looks like this in the profile: https://imgur.com/cCeNNCm
Once a day, I execute a database request that imports the data from the custom profile fields into an extra table with the following columns:
     id, field_name, iso_code, count
     1, field_27, AT, 135
     2, field_36, DE, 1047
     etc.
This extra table maintains the mapping between field name, ISO code, and the number of users who have visited the particular country. The query that generates it looks like this:
     TRUNCATE countries_map;
     INSERT INTO countries_map (ISO, cntvis) SELECT 'AU', SUM(field_27) FROM core_pfields_content;
     INSERT INTO countries_map (ISO, cntvis) SELECT 'AT', SUM(field_28) FROM core_pfields_content;
     INSERT INTO countries_map (ISO, cntvis) SELECT 'AZ', SUM(field_29) FROM core_pfields_content;
     INSERT INTO countries_map (ISO, cntvis) SELECT 'AL', SUM(field_30) FROM core_pfields_content;
     INSERT INTO countries_map (ISO, cntvis) SELECT 'DZ', SUM(field_31) FROM core_pfields_content;
     INSERT INTO countries_map (ISO, cntvis) SELECT 'AS', SUM(field_32) FROM core_pfields_content;
     INSERT INTO countries_map (ISO, cntvis) SELECT 'VI', SUM(field_33) FROM core_pfields_content;
     .....
I need this extra query and extra table (countries_map) because we tried joining the results of 240 queries together, and the speed was not pretty.
This query runs once a day, truncating and repopulating the table; we are not concerned with live updates for visited countries, so once a day is fine. The result is then fed into this page, built with Pages and Google GeoCharts, which shows the countries most visited by our members: https://magelanci.com/atlas/ This is core functionality for our community, and I definitely need to find a way to do the same in the cloud without direct database access. So what is the best way to do that? Some magic through the REST API? Storing the auxiliary table in a Pages database and somehow updating/obtaining data from it? I cannot think of how to do that... Maybe I could commission a developer to build an app for this functionality; however, I have been burned by disappearing developers before, so I'd rather use core IPS functionality. All ideas and suggestions will be appreciated. This is kind of a blocker for our migration to Cloud, which I definitely need to do, as I don't have the time to admin my dedicated server responsibly.
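As an aside, the 240 hand-written INSERT lines could be generated from a single field-to-ISO mapping, which at least makes the offline/monthly variant less error-prone. A minimal Python sketch, where the mapping entries and the table/column names are taken from the post above and the rest of the mapping is an illustrative assumption:

```python
# Hypothetical sketch: generate the daily refresh SQL for countries_map
# from a field -> ISO mapping instead of maintaining 240 INSERT lines by hand.

FIELD_TO_ISO = {
    "field_27": "AU",
    "field_28": "AT",
    "field_29": "AZ",
    # ... one entry per country field
}

def build_refresh_sql(mapping: dict) -> str:
    """Build the TRUNCATE + INSERT statements for the countries_map table."""
    lines = ["TRUNCATE countries_map;"]
    for field, iso in mapping.items():
        lines.append(
            f"INSERT INTO countries_map (ISO, cntvis) "
            f"SELECT '{iso}', SUM({field}) FROM core_pfields_content;"
        )
    return "\n".join(lines)

print(build_refresh_sql(FIELD_TO_ISO))
```

The generated script can then be run against whatever database copy is available, whether live, local, or a monthly export.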
  13. OK, fair point. In that case, I think I would take the hit of having the content indexed with a two-week delay.
  14. I think there is already a method for Google to bypass the restrictions, based on user agent. At least this mod has that functionality:
  15. I am considering some actions to entice people to register/subscribe etc. based on the age of forum content. In the simplest terms, imagine that guests can see only topics and replies from up to 2 weeks ago. A large banner will inform them that if they want to access the latest content and engage with the community, they need to register, become a premium member, etc. Does this sound like something that is easy to achieve through a third-party app/plugin? Tagging @DawPi, as it seems you have dived into the guest-limits area, though not exactly in this respect 🙂
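The gating rule being asked for can be stated in a few lines. A minimal Python sketch of the logic only (the function name, the 14-day window, and the parameters are assumptions for illustration, not an IPS or plugin API):

```python
# Hypothetical visibility rule: guests see only content older than a
# cutoff; logged-in members see everything.

from datetime import datetime, timedelta, timezone
from typing import Optional

CUTOFF = timedelta(days=14)  # assumed 2-week window from the post above

def can_view(posted_at: datetime, is_guest: bool,
             now: Optional[datetime] = None) -> bool:
    """Guests may only view content posted more than CUTOFF ago."""
    now = now or datetime.now(timezone.utc)
    if not is_guest:
        return True  # members are never restricted
    return (now - posted_at) > CUTOFF

now = datetime(2024, 1, 30, tzinfo=timezone.utc)
old_post = datetime(2024, 1, 1, tzinfo=timezone.utc)
new_post = datetime(2024, 1, 25, tzinfo=timezone.utc)

print(can_view(old_post, is_guest=True, now=now))   # True
print(can_view(new_post, is_guest=True, now=now))   # False
print(can_view(new_post, is_guest=False, now=now))  # True
```

In an actual plugin, this check would sit in whatever hook filters content for the guest group, with the "register to see the latest content" banner shown whenever it returns False.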