
Clover13

Clients
  • Posts: 1,389
  • Joined
  • Last visited
  • Days Won: 1
Reputation Activity

  1. Thanks
    Clover13 reacted to Daniel F in REST API: POST /cms/records/{database_id} failing to create INSERT query properly   
    Sure, send me a PM and we'll take a look.
  2. Thanks
    Clover13 reacted to teraßyte in REST API /cms/records/{databaseId} POST fails with 500 in 4.7.16   
    @Clover13 To reproduce the bug, the database must have the Store revisions option enabled. Most likely the test was made on a database with it disabled.
    I too initially thought the error was coming from saving the record to the database; only after re-checking your last screenshot did I notice it was a revision instead. 😅
  3. Thanks
    Clover13 reacted to Marc Stridgen in REST API /cms/records/{databaseId} POST fails with 500 in 4.7.16   
    Thank you for bringing this issue to our attention! I can confirm this should be further reviewed, and I have logged an internal bug report for our development team to investigate and address as necessary in a future maintenance release.
     
  4. Agree
    Clover13 reacted to teraßyte in REST API /cms/records/{databaseId} POST fails with 500 in 4.7.16   
    Looking again at the screenshot, the error is being thrown when a revision for the record is added to the database, not when the record itself is added. If you're adding a new record, it shouldn't store a revision. A revision should be saved only when you edit a record.
     
    The problem is in /applications/cms/api/records.php in the _createOrUpdate() function (lines 424-443):
    /* Store a revision before we change any values */
    if ( $item::database()->revisions )
    {
        $revision = new \IPS\cms\Records\Revisions;
        $revision->database_id = $item::$customDatabaseId;
        $revision->record_id = $item->_id;
        $revision->data = $item->fieldValues( TRUE );
        if ( $this->member )
        {
            $memberId = $this->member->member_id;
        }
        else
        {
            $memberId = $item->author()->member_id;
        }
        $revision->member_id = $memberId;
        $revision->save();
    }
    The if check should also check whether you're editing a record, because when adding a new one there is no record ID available yet (hence the NULL column error):
    if ( $type == 'edit' AND $item::database()->revisions )
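    For context, a minimal sketch of how the corrected guard could sit around the revision code, assuming $type is the 'add'/'edit' flag already available inside _createOrUpdate() (as the suggested fix implies):
    /* Only store a revision when editing an existing record; on 'add' there is
       no $item->_id yet, which is what produced the NULL column error. */
    if ( $type == 'edit' AND $item::database()->revisions )
    {
        $revision              = new \IPS\cms\Records\Revisions;
        $revision->database_id = $item::$customDatabaseId;
        $revision->record_id   = $item->_id; /* guaranteed to exist when editing */
        $revision->data        = $item->fieldValues( TRUE );
        $revision->member_id   = $this->member ? $this->member->member_id : $item->author()->member_id;
        $revision->save();
    }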
  5. Like
    Clover13 reacted to teraßyte in Mixin for UI?   
    From the js file (line 61):
    ips.createModule('ips.ui.uploader', function(){  
    To create a new one you need to copy/paste the file into your app, change the name, update the names/references, edit the code you need, and call the new module's name in the HTML.
     
    This is the code that registers the module (I'll use the dialog one, which is more widely used):
    ips.ui.registerWidget('dialog', ips.ui.dialog,
        [ 'url', 'modal', 'draggable', 'size', 'title', 'close', 'fixed', 'destructOnClose', 'extraClass',
          'callback', 'content', 'forceReload', 'flashMessage', 'flashMessageTimeout', 'flashMessageEscape',
          'showFrom', 'remoteVerify', 'remoteSubmit' ],
        { lazyLoad: true, lazyEvents: 'click' }
    );
    You need to change the dialog name (maybe dialogbis), which registers the widget. All the other values in the array are the options you can use inline.
     
    This is how the code would be with the original dialog module:
    <div data-ipsDialog data-ipsDialog-title="TITLE" data-ipsDialog-remoteSubmit data-ipsDialog-flashMessage="ITEM SUBMITTED">
    This is how it would look with your updated dialogbis module:
    <div data-ipsDialogbis data-ipsDialogbis-title="TITLE" data-ipsDialogbis-remoteSubmit data-ipsDialogbis-flashMessage="ITEM SUBMITTED">  
  6. Like
    Clover13 reacted to Marc Stridgen in POST /cms/records/{database_id}: record_image thumbnail sizing doesn't match Pages DB settings   
    This has now been resolved in the 4.7.16 release, so please update if you are seeing this issue. If you are still seeing the issue after updating, please let us know.
  7. Haha
    Clover13 reacted to Marc Stridgen in Invision Community 5: The story so far...   
    🤔-Thinking about the bug
    👀-Looking for the solution to the bug
    📰-Writing the fix for the bug
    ➡️-Exiting when he realised the bug fix didn't work
    👤-Hiding in the shadow to see if anyone notices
     
  8. Like
    Clover13 reacted to Matt in Invision Community 5: The story so far...   
    New feature blog early next week. 🤔👀📰➡️👤
  9. Thanks
    Clover13 reacted to Matt in Invision Community 5: The story so far...   
    Just six short weeks ago, Ehren hit record on a video that changed everything for Invision Community.
    The blog was called "Introducing a fresh new vision for Invision Community 5," and it ripped up the rule book on what forums should look like and revealed a slick new look featuring a new forum home feed view and sidebar navigation.
    A lot has been discussed, but we're not even close to done!
    Before we bring you news of more features after Thanksgiving, I wanted to take a mid-season break to recap what we've seen so far.
    First up was the introduction video, which gave a broad overview of the new UI Invision Community 5 would be sporting. Ehren takes us through many new elements, including the sidebar navigation, forum feed view, simplified post view and more.
     
     
    Up next was a focus on dark mode, accessibility and mobile views. Invision Community 5 features the ability to have native dark mode without additional themes or complex variables to set up. Our aim with Invision Community 5 is to hide the complexities and technology and just let you focus on creating a great community experience for your audience.
     
     
    Bringing complex theming to everyone was the message in the blog talking about the new theme editor. Now, you can make wide-ranging changes to your theme without the need to edit CSS or manage HTML templates, all driven by a smart and simple interface.
     
     
    Next, it was my turn to talk about a new feature. I introduced two new features designed to help those who run support-based communities. Finding the most helpful answers and identifying community experts help your members do more with less time and frustration.
     
     
    Last week, Ehren demonstrated our new icon and badge builder, which is an amazingly powerful tool to produce slick and professional badges along with the ability to customize your community further with emojis and icons for menus, reactions and more. Building ways to reduce the barrier to customization has been a strong theme for Invision Community 5.
     
     
    Phew!
    We can all agree that we've showcased a lot of impressive functionality coming with Invision Community 5 already.
    But what does the future hold?
    Lots! We have a lot of new functionality that we're putting the finishing touches on, and we can't wait to show you more. These new features further help to reduce noise in topics, make the community feel alive and bring long-needed updates to core components such as the editor. Not to mention, there is a significant update to Pages underway.
    We also have a lot of less flashy updates, such as the consolidated Feature/Our Picks functionality, which is now a single feature.
     
    Feature-window.mp4
     
    An improved Moderators Control Panel brings a more uniform experience across deleted content, hidden content, and content awaiting approval.
     

    We're still on course for a release of Invision Community in early 2024 and can't wait for you to experience the future of forums.
    What has been your favourite feature so far? I'd love to know; drop a comment below!

  10. Like
    Clover13 reacted to Matt in Uses deprecated APIs ...   
    We are working on having this fixed for the next release. We expect a beta fairly soon and would appreciate a lot of testing as the old plugin touched on a lot of areas.
  11. Like
    Clover13 got a reaction from Viace in Marketplace Closure   
    The problem is, without the current IPS scan/approval process of apps/plugins, any new development is a risk.  We also don't know how many iterations of scan/approval a given version of a given app had to go through to get final IPS approval, nor what those rulesets/barriers were for good practice per IPS standards.  We just know the end product from the Marketplace dev.  Now clients are subject to the intermediate iterations and any issues they expose.  This is particularly concerning when we get into PII and any level of security risk to our sites (which we had a confidence level IPS was protecting us from with their scan/approval process).
  12. Like
    Clover13 got a reaction from Viace in Marketplace Closure   
    Right, so this bodes well for well-known and established devs, as they have already created a foundational trust model with clients. For new devs, that's a barrier they'd have to build over time. Meanwhile, clients have no way to validate a new dev's work the way IPS previously did to guarantee the safety of the app/plugin.
    I think this greatly elevates the risk for clients and consequently harms the potential for developers to grow the product. Hobbyist sites will suffer the most as they simply don't have the resources to invest in robust security evaluation.
    Perhaps another opportunity for a dev to provide some level of AppSec and InfoSec scanning of applications to lower the risk.  
  13. Thanks
    Clover13 reacted to BN_IT_Support in Third Party API integration with Pages   
    As a general opinion on "Pages DB"...
    They are not really "databases" in SQL terms -- but SQL "tables" in the Invision database.
    When you are considering which sort of SQL table to use (within Invision) you have two choices:
    "Pages DB" (as you have mentioned) SQL tables as defined under the 'Database Schema' tab for an Application Pages DB works well for:
    Where you only need a single table for all your data - for example, in the classic database design if you only need a "products" table then it will work well. If you actually need a "products" table and a "suppliers" table with cross references then it does not work so well - and in any case you would need two Pages DB (one for products and one for suppliers)
      Where the solution involves a human adding and editing records it works well - for example, where you have a human adding products (descriptions, part numbers, prices, etc.) to a product table. We probably use between 5 and 10 Pages DB for various unrelated things such as "news articles", "places to go" and so on.
      Using Pages DB automatically gives you various widgets such as "Database filters" which is good.
      I would strongly recommend that you use the standard display templates wherever possible. If you create your own display templates based on the standard display templates then that effectively bases your template on a snapshot of the standard template - you don't benefit from future improvements and bug fixes to the standard. We have some of our own templates from 6 or 8 years ago and regret not putting more effort into display formatting for individual fields. With some clever display formats you can combine fields to produce complex output (for example, latitude and longitude fields to display a map) and if you need to do conditional display of some fields dependent upon the contents of others then you can add a dummy field (with a non-blank default value so it triggers display) and you can write custom code to display the contents of several other fields according to the rules that you want to apply. Do the fancy stuff in a field display rather than a template wherever possible.
      When you write a scheduled task (for example) to update the data in a Pages DB there are a couple of (minor) extra steps that you need to take in order to write the data. Firstly, you need to know the classname that will be used to write to the correct table, and this will be something like \IPS\cms\Records23 where 23 is the database number. So, you have to look up the database number from your database key (I recommend against hard coding the 23 😉 ). Secondly, if you have fields for name, address, region, country, phone, etc. then the names in the table will be something like field_91, field_92, field_93, field_94, etc., so you have to 'map' from the field key to the real field name in the database. Not difficult, but that's what you have to do (see the sketch at the end of this post).
      Pages DB will work OK where you have a scheduled task to update the data once per day (for example - it could obviously be much more frequently if you wanted). A scheduled task will work fine where nearly all the data items/records are accessed every day - but what about a scenario where only 5% of records are accessed each day? Doing a daily scheduled update of all records means that 95% of what you retrieve is not going to be used in the next 24 hours. I rather liked Nathan's suggestion of retrieving data on demand (i.e. every record has a lifetime field that you add, so if you want the record but the lifetime has expired then you retrieve the data through the API). That would mean that you would not waste resources retrieving data that is not being used. Unfortunately, "retrieval on demand" would be extremely difficult or impossible to implement using Pages DB - unless you write your own very fancy templates, and in that case you would do better to write an Application to do the entire job. Updating Pages DB from a scheduled job would work easily.
    Application tables will work well for:
    You need several tables with relationships between them (e.g. products and suppliers)
      You need an application that does a lot of processing of records (rather than simple human entry of records as indicated previously)
      Access to records is more intuitive - field names are what you want to call them rather than being 'mapped'. Also, access to the correct table is very simple as you need to create a sub-class of \IPS\Patterns\ActiveRecord that will get you to the correct table plus a whole load more stuff.
      If you have hierarchical data (for example, categories and records) then you should consider using the Node/Model/Item/Content model although that gets a lot more complicated it does give you access to many of the features that you see time and again in Invision.
      You have complete control, so retrieving data on demand is much simpler than it would be using Pages DB.
    Regards,
    John
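    To illustrate the two extra steps above, a rough sketch only: the 'products' database key, the field numbers and the load-by-key column are assumptions, so check them against your own install rather than hard-coding anything.
    /* Look up the database by its key instead of hard-coding the number 23
       (assumes the usual ActiveRecord::load( $value, $column ) signature and
       a 'database_key' column; verify both on your install). */
    $database    = \IPS\cms\Databases::load( 'products', 'database_key' ); /* 'products' is a hypothetical key */
    $recordClass = '\IPS\cms\Records' . $database->id;                     /* e.g. \IPS\cms\Records23 */

    /* The real column names are field_NN, so map your own field keys to them. */
    $record = new $recordClass;
    $record->field_91 = 'ACME Widget';        /* e.g. your 'name' field    */
    $record->field_92 = '123 Example Street'; /* e.g. your 'address' field */
    $record->save();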
     
  14. Like
    Clover13 got a reaction from SoloInter in Uses deprecated APIs ...   
    This one?
    https://developer.chrome.com/docs/web-platform/deprecating-unload
    If so: "Ending with 100% of users by the end of Q3 2024".
    Assuming that means anyone on v4 won't be able to use Chrome after the deprecation?
  15. Like
    Clover13 reacted to Nathan Explosion in Third Party API integration with Pages   
    First mention of filtering there, so I wasn't aware it was a requirement. If it is, then crack on with a Pages DB - you then have to figure out how to keep that data up to date, and when to do it. Alternatively, look at IPS\Helpers\Table\Custom to allow you to create a table based on an array datasource instead of a DB table.
    With all this in mind, I'd now advise you to look into developing an application to do all this instead...
    You can design your own table to store the data
      You can create a module/controller that will display that data, and add filtering on there easily.
      You can create a task that runs on a schedule to retrieve/store/update the data.
    Not much more to add really - if I knew what this mysterious API was, and where it was getting the data from, and what the data looked like then I might even get bored and throw together a POC of it.
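    A rough sketch of the Table\Custom idea, assuming a controller in your own application and the usual Table helper behaviour; the rows and the app/module/controller names are placeholders:
    protected function manage()
    {
        /* Array datasource instead of a DB table */
        $rows = array(
            array( 'name' => 'Item A', 'status' => 'Active',  'updated' => '2024-01-02' ),
            array( 'name' => 'Item B', 'status' => 'Expired', 'updated' => '2024-01-01' ),
        );

        $table         = new \IPS\Helpers\Table\Custom( $rows, \IPS\Http\Url::internal( 'app=myapp&module=items&controller=list' ) );
        $table->sortBy = 'updated';

        \IPS\Output::i()->output = (string) $table;
    }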
  16. Like
    Clover13 reacted to BN_IT_Support in Third Party API integration with Pages   
    Would I be correct in deducing that you're referring to this (on a custom block)?

    If so, no, it would be the data that would be cached so you would lose control of how and when it would be refreshed.
    Certainly, you could use a custom block, but just make sure that you leave 'Cache this block' set to 'No' and then do your own caching.
    The way that Nathan Explosion suggests is by far the best way...
    Firstly, there is "how/when you retrieve the data":
    Scheduled polling would collect all the items even if some have not been updated. Daily polling would work OK if all items are updated every day, but if only half the items are updated each day (for example) then you would be retrieving twice as many items as required.
      If you need the most up to date data for each item then you really need webhooks to notify you of changes - in the absence of webhooks you either set a very short scheduling interval or else use Nathan's solution with a short cache lifetime.
      You have now told us that you don't mind data which is not completely up to date (say - one day out of date), so webhooks are not essential.
      Nathan's solution is where you load the cache on demand (i.e. only retrieve an item if it is not in your cache OR it is in the cache and already expired) - set the cache timeout to 12 hours or 24 hours or whatever, depending on how far out of date is OK.
      If the rate limiting still causes problems then you might want to (need to) randomise the cache lifetime a bit. For example, with a fixed cache lifetime of 1 day, if you start with an empty cache and the users demand all data in a very short time then you might exceed the rate limit. (Not likely, but possible?) With a fixed cache timeout the same thing will happen the next day as all items will expire at the same time. Randomise the cache time with +/-60 minutes and lookups will be distributed over time very quickly.
    Secondly, there is "how/where to cache the data":
    If you only need a single lookup key per item (e.g. a name OR an id) then Nathan's suggestion of using the data store is the best way to go. (I've used it where I only need one key per item and it works very well.)
      If you need multiple lookup keys per item (e.g. name AND id, so you sometimes retrieve from cache by name and sometimes by id) then the data store does not have that flexibility, so you would probably need an SQL table with multiple keys - unless you double your storage usage by saving each item under both the name key and the id key. (I have multiple tables with multiple keys, which is why I think that way 😉 )
    John
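    A tiny sketch of the randomised-lifetime idea in plain PHP; the names are illustrative and $apiData stands in for whatever the API returned:
    /* ~1 day lifetime with +/- 60 minutes of jitter so cached items
       written at the same time do not all expire together */
    function cacheLifetimeWithJitter( int $base = 86400, int $jitter = 3600 ): int
    {
        return $base + random_int( -$jitter, $jitter );
    }

    $entry = array( 'data' => $apiData, 'expires' => time() + cacheLifetimeWithJitter() );
    $stale = ( empty( $entry['expires'] ) || $entry['expires'] < time() );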
  17. Thanks
    Clover13 reacted to Nathan Explosion in Third Party API integration with Pages   
    Zero knowledge of what API it is that you are referring to wanting to use so can't look at the data structure, and have no idea about how you want it to be displayed but here is what I would be doing on the page/block...
    1. Check if the data exists in \IPS\Data\Store.
    2. If not, retrieve the data via the API and then store it in \IPS\Data\Store::i()->whateveryourkeyisgoingtobe, adding something to the data to provide a timestamp for when it was added.
    3. Display the data however you wish.
    4. On subsequent loads, step 1 should also check whether that timestamp should be considered 'out of date' - if it is, get the data again.
    I don't think you should think about storing the data in a Pages DB - I can envision that becoming overly complex for what it appears you are trying to do. And if the data store gets cleared, your code would just pull the data into the store again anyway.
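    A minimal sketch of that flow, assuming the usual \IPS\Data\Store property access and \IPS\Http request chain; 'whateveryourkeyisgoingtobe', the endpoint and the one-day max age are placeholders:
    $maxAge = 86400; /* treat anything older than a day as stale */
    $store  = \IPS\Data\Store::i();

    if ( !isset( $store->whateveryourkeyisgoingtobe ) OR ( time() - $store->whateveryourkeyisgoingtobe['fetched'] ) > $maxAge )
    {
        /* Hypothetical endpoint - replace with the real API call */
        $data = \IPS\Http\Url::external( 'https://api.example.com/items' )->request()->get()->decodeJson();
        $store->whateveryourkeyisgoingtobe = array( 'fetched' => time(), 'data' => $data );
    }

    $data = $store->whateveryourkeyisgoingtobe['data'];
    /* ...display $data however you wish... */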
     
     
     
  18. Like
    Clover13 reacted to BN_IT_Support in Third Party API integration with Pages   
    If you're going to go down this path then I would be inclined to use a 'task' within your Invision Application - that way you can use the API to pull data directly into your application rather than pull the data with Zapier or a separate cron script and then post it into your application.
    An important question - how far out of date can you allow the data to be? If the data must be up to date within 5 minutes then your script will have to run every 5 minutes and if your application does not display all the data frequently then you may well generate more activity pulling data on a schedule than you would pulling it on demand.
    Is there a way (in the API) to detect data that has been updated - so that you don't have to pull all the data periodically?
    The most useful thing would probably be if the API includes webhooks that will fire and notify your application when a data item/record is created and when one is updated -- i.e. you then use your webhook to pull data that has changed and don't bother to pull data that has not changed.
    If there are no webhooks in the API, will the API, as an alternative, let you 'list all records that have been updated in the last 10 minutes' (for example)? That way you would use the list of recent updates to select which records to pull (i.e. only those that have been updated since the last pull).
    Finally, I would be inclined to store the data in your own database (i.e. table defined by your application). Effectively, the database table acts as your cache and if your cache is only updated on data change (webhook or whatever) then it would probably be far more efficient than trying to use the Invision cache (which will have its own timeouts that are not necessarily in sync with updates to the data...)
    John
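    A rough sketch of what such a task could look like, assuming the usual \IPS\Task layout; the endpoint, the 'updated_since' parameter and the myapp_records table/columns are all hypothetical:
    /* Would typically live in applications/myapp/tasks/pullApiData.php */
    class _pullApiData extends \IPS\Task
    {
        public function execute()
        {
            $lastRun = isset( \IPS\Data\Store::i()->myapp_last_api_pull ) ? \IPS\Data\Store::i()->myapp_last_api_pull : 0;

            /* Only ask the API for records changed since the last pull */
            $updated = \IPS\Http\Url::external( 'https://api.example.com/records' )
                ->setQueryString( array( 'updated_since' => $lastRun ) )
                ->request()
                ->get()
                ->decodeJson();

            foreach ( $updated as $row )
            {
                /* Upsert into the application's own table, which acts as the cache */
                \IPS\Db::i()->replace( 'myapp_records', array(
                    'external_id' => $row['id'],
                    'data'        => json_encode( $row ),
                    'updated'     => time(),
                ) );
            }

            \IPS\Data\Store::i()->myapp_last_api_pull = time();
            return NULL;
        }
    }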
  19. Like
    Clover13 reacted to Matt in Uses deprecated APIs ...   
    We are not going to simply allow all our customers to never use Google Chrome again. 😅 That would be a terrible decision.
    We will ensure it's resolved ahead of the removal later this year. We use an older plugin for jQuery to manage browser history state. We already plan to move that to native JS APIs in v5 and v4.
  20. Like
    Clover13 got a reaction from sadams101 in Uses deprecated APIs ...   
    This one?
    https://developer.chrome.com/docs/web-platform/deprecating-unload
    If so: "Ending with 100% of users by the end of Q3 2024".
    Assuming that means anyone on v4 won't be able to use Chrome after the deprecation?
  21. Haha
    Clover13 reacted to Matt in Uses deprecated APIs ...   
    2030 at the earliest, I'd say.
  22. Like
    Clover13 reacted to Jim M in Paginas de blocos   
    Let's take a step back here. We are trying to help, but being provided with "post content" is extremely vague in our software. We understand it may be confusing to a new user and we want to help, but in order to do so, we need you to help us and describe what you want to do 🙂.
    Pointing us to that website is rather vague still as there is a ton of content there. Are you wanting to do the "blocks" they have like "Top Downloads" on the right-hand side? Are you wanting to create the forums they have? What exactly are you wanting to do here? If you want to point to something particular on their website either by title or screenshot, we can certainly instruct you on how to do it.
     
  23. Haha
    Clover13 reacted to Charles in Passkeys instead of passwords   
    There are platforms other than Apple?
  24. Like
    Clover13 got a reaction from q p in Google GTM + GA4 + InvisionCommunity Member registration form Event   
    See here:  
     
  25. Like
    Clover13 reacted to Dreadknux in Introducing a fresh new vision for Invision Community 5   
    I have a general question relating to the new V5 design philosophy/restructure if that's okay @Ehren.
    I've been using sticky nav headers in my current V4 setup, where the navigation strip underneath the header area follows the user as they scroll down. That's working very well, with one caveat: I've had to try and shoehorn in a bunch of additional CSS for as many anchored areas on-page as I can think of (i.e. pagination rows, anchored H2s, etc.) to ensure those anchored areas do not appear hidden behind the sticky navbar.
    Unfortunately it's a little bit of a janky/inelegant solution - my code for each anchored part in my custom CSS goes something like this:
    [data-resort="listResort"][data-tableid="topics"]::before, [data-resort="listResort"]::before { display: block; content: " "; height: 60px; margin-top: -60px; visibility: hidden; pointer-events: none; background: none; } The above code tends to help make sure that anchors are set underneath the sticky navbar (which is 60px in height), but it has the side-effect of pulling the ipsBox design up and above its container, which makes sense logically but is a little annoying (and I've had to find alternative way to fix this by modifying the container CSS further to resolve, on a per-section/per-anchor area basis)

    So basically, what I'm interested in knowing is: do you think there is something in V5's design approach that accounts for this kind of customisation that a community admin might want to do, without requiring custom CSS for a hundred different containers to make it work?
    I don't necessarily mean that there needs to be an option for sticky headers in the theme editor or anything (because there are too many variables at play - different heights of sticky containers, whereabouts that sticky area might be, etc), but perhaps there is a custom CSS variable baked into V5 for advanced CSS editors, where they can input a value for a sticky anchor height, and then the admin can add one additional line of custom CSS to set their desired navbar/area as 'sticky'... and then the set variable would ensure that all anchored on-page areas would be automatically adjusted to account for the sticky header (so nothing appears hidden behind the sticky container).
    It's probably quite a complex thing to ask about, but I'm interested to hear your thoughts on feasibility, etc.?