
Featured Replies

Posted
  • Management

Whether you have hours to browse a community or are short on time, scrolling through a very long topic can be more than a little frustrating when you want to follow the topic's core journey.

Recently, we discussed another feature designed to help support-based communities find helpful answers quickly, but what about social topics that do not have a simple question-and-answer format?

We've all arrived at a lengthy topic for the first time and found it a little intimidating to locate the most relevant content among hundreds of posts, not all of which move the topic forward. These off-topic posts are important: they build social cohesion and relationships between members at the time of posting. Still, those visiting later often just want the essence of the topic.

Invision Community 5 brings a topic summary feature designed to make the most of your time.


The topic summary is generated by an algorithm that weighs many touchpoints, such as average read times, reactions, number of shares, external links and more, to give each post a numeric usefulness ranking.
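
The exact factors and weighting are internal, but as a rough sketch of the idea (the signal names and weights below are illustrative assumptions, not the real values):

```python
from dataclasses import dataclass

@dataclass
class PostSignals:
    avg_read_seconds: float   # how long readers dwell on the post
    reactions: int            # reaction count
    shares: int               # number of times the post is shared
    external_links: int       # times the post is linked from outside
    replies_quoting: int      # replies that quote this post

# Hypothetical weights for illustration only.
WEIGHTS = {
    "avg_read_seconds": 0.05,
    "reactions": 2.0,
    "shares": 3.0,
    "external_links": 4.0,
    "replies_quoting": 1.5,
}

def usefulness_score(s: PostSignals) -> float:
    """Combine engagement signals into a single numeric ranking."""
    return (
        WEIGHTS["avg_read_seconds"] * s.avg_read_seconds
        + WEIGHTS["reactions"] * s.reactions
        + WEIGHTS["shares"] * s.shares
        + WEIGHTS["external_links"] * s.external_links
        + WEIGHTS["replies_quoting"] * s.replies_quoting
    )

def auto_summary(posts: dict[int, PostSignals], top_n: int = 5) -> list[int]:
    """Return the IDs of the top_n highest-scoring posts, in topic order."""
    ranked = sorted(posts, key=lambda pid: usefulness_score(posts[pid]), reverse=True)
    return sorted(ranked[:top_n])
```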

The summary shows an estimated read time for the entire topic alongside an estimated read time for the summary alone, giving your members a good idea of the time they'll save.

A shorter read time will make longer topics more accessible to a greater audience.
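
As a back-of-the-envelope illustration, both estimates amount to a word count divided by an assumed reading speed (the 230 words-per-minute figure and the toy data below are assumptions, not documented values):

```python
WORDS_PER_MINUTE = 230  # assumed average reading speed, not an official figure

def read_minutes(post_bodies: list[str]) -> int:
    """Estimate reading time in whole minutes for a list of post bodies."""
    words = sum(len(body.split()) for body in post_bodies)
    return max(1, round(words / WORDS_PER_MINUTE))

# Toy data standing in for real post content:
all_post_bodies = ["lorem ipsum " * 200 for _ in range(31)]  # the full topic
summary_bodies = all_post_bodies[:5]                         # the curated subset

print(read_minutes(all_post_bodies), "min full,",
      read_minutes(summary_bodies), "min summarised")
```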


Adjusting the summary
We believe that algorithms should support human decisions, not override them. Those with permission can add posts to the summary if they feel they are more relevant; likewise, posts can be removed if you think they are irrelevant.
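
As a rough sketch of how those overrides could sit on top of the algorithm's picks (the pinned and excluded sets below are illustrative, not the actual implementation):

```python
def curated_summary(auto_selected: set[int],
                    pinned: set[int],
                    excluded: set[int]) -> list[int]:
    """Merge the algorithm's picks with moderator decisions.

    Moderator choices win: pinned posts are always included and
    excluded posts are always dropped, whatever the algorithm scored.
    """
    return sorted((auto_selected | pinned) - excluded)

# Example: the algorithm picked posts 3, 7 and 12; a moderator pins 5
# and removes 7 as off-topic.
print(curated_summary({3, 7, 12}, pinned={5}, excluded={7}))  # [3, 5, 12]
```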


Interactions with Helpful Posts
Invision Community supports a broad range of communities, including support-based and social communities. We are improving our toolset to help both.

You can have helpful post voting enabled alongside topic summaries. When both are active, the topic summary is shown until helpful post votes reach a threshold; once that threshold is met, the helpful post information replaces the summary.

Of course, not all communities and not every forum will have the support features enabled, meaning the topic summary will be the only way to reduce the topic complexity.
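
As a rough sketch of that decision (the threshold value and names below are illustrative assumptions, not actual Invision Community settings):

```python
HELPFUL_VOTE_THRESHOLD = 3  # assumed value, for illustration only

def topic_header_mode(helpful_voting_enabled: bool,
                      summaries_enabled: bool,
                      helpful_votes: int) -> str:
    """Decide which condensed view, if any, to show at the top of a topic."""
    if helpful_voting_enabled and helpful_votes >= HELPFUL_VOTE_THRESHOLD:
        return "helpful_posts"   # enough votes: helpful answers take over
    if summaries_enabled:
        return "topic_summary"   # otherwise fall back to the summary
    return "none"                # neither feature enabled on this forum

print(topic_header_mode(True, True, 1))   # topic_summary
print(topic_header_mode(True, True, 4))   # helpful_posts
print(topic_header_mode(False, True, 0))  # topic_summary
```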

Less is more
Browsing the summary gives you a concise, distraction-free view of the topic's journey, and making long topics approachable in this way is a vital strategy for growth.

By allowing members to focus on the core journey, you reward the time they spend on your community and make it more accessible for those short on time.

We hope you've enjoyed this feature introduction and would love to hear your thoughts!



To be honest, I have distanced myself from your version 5. Why? There are many factors - the arrival of AI, especially in social networks and probably in forum systems too, which is essentially what yours is... There is something elusive but irritating about its entry into the IT sphere... and unpredictable... I personally find it puzzling, and it awakens my conservative instincts... Sorry, I can't be of any help with your version 5 for now...

Me too, actually, I don't expect anything in terms of speed...ha ha...🥰

I clicked into this very excited, thinking you'd be showing off an AI-generated 'executive summary' of topics. Genuinely curious: why not feed those posts with high-impact signals into an LLM to summarize? Not long ago your approach would have been great, but now I think it will be in danger of feeling pretty clunky, manual and old-fashioned. This seemed like it'd be the perfect opportunity to use a modern tool to solve the problem.

  • Author
  • Management
 

To be honest, I have distanced myself from your version 5. Why? There are many factors - the arrival of AI, especially in social networks and probably in forum systems too, which is essentially what yours is... There is something elusive but irritating about its entry into the IT sphere... and unpredictable... I personally find it puzzling, and it awakens my conservative instincts... Sorry, I can't be of any help with your version 5 for now...

Me too, actually, I don't expect anything in terms of speed...ha ha...🥰

We are not using AI models at this time.

Client browser speed will be greatly improved in v5; we're already hitting high-90s PageSpeed scores for mobile out of the box.

  • Author
  • Management
 

I clicked into this very excited, thinking you'd be showing off an AI-generated 'executive summary' of topics. Genuinely curious: why not feed those posts with high-impact signals into an LLM to summarize? Not long ago your approach would have been great, but now I think it will be in danger of feeling pretty clunky, manual and old-fashioned. This seemed like it'd be the perfect opportunity to use a modern tool to solve the problem.

The problem is that too many people want to throw AI at everything. I've seen loads of AI based 'executive summaries' on things like Helpscout, Amazon Reviews, etc that honestly do not offer much value.

We are happy with using signals and a mathematical model for this feature. @Matt Finger has been discussing building an LLM for many things, so it's an area we are interested in.

I'm really excited about everything, but I think I'm getting tired of waiting for at least a beta. I was certain, absolutely certain, that by this time they would have already launched an alpha, but I was wrong. My hope is simply to try it, like many here who are desperately waiting; it's simple, we need to try it.

They have said they hope to have a sneak peek by the end of the year, so they still have a month. You might need to wait a little longer. 🙂

  • Author
  • Management

I think it depends on what you're doing. For example, if you have chosen a book you'll want to read it page by page, but if you're searching for the right book, you might read the blurb or use the chapter index to see if it's right for you.

 

We are happy with using signals and a mathematical model for this feature.

Just to be clear, I think this part is a good idea - my suggestion was to go a step further and summarize the contents of those posts pulled out by your own model using AI, rather than just outputting the highlighted posts as-is one after the other.

I agree that just feeding an entire topic consisting of 50% junk into an AI summary probably wouldn't give great results. But if you use your signals to pull out the noteworthy posts and then summarize those, I think the result would likely be pretty good. AI is really good at summarizing text, after all.
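
To make the suggestion concrete, a rough sketch might look like this, where the scores come from the existing signal model and call_llm is only a placeholder for whatever completion API you'd actually wire in:

```python
def call_llm(prompt: str) -> str:
    """Placeholder for a real completion API call; returning a stub keeps
    the sketch runnable without any credentials or vendor choice."""
    return "[LLM-written summary of the excerpted posts would go here]"

def llm_topic_summary(posts: dict[int, str],
                      scores: dict[int, float],
                      top_n: int = 5) -> str:
    """Summarise only the posts the signal model already rated highly."""
    top_ids = sorted(sorted(scores, key=scores.get, reverse=True)[:top_n])
    excerpt = "\n\n".join(posts[pid] for pid in top_ids)
    prompt = ("Summarise these forum posts into a short, neutral overview, "
              "keeping any concrete answers or decisions:\n\n" + excerpt)
    return call_llm(prompt)
```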

 

I still think that a summary will rob the community of its character and voices. 

It’s still important to see the author next to their content IMO. 

My personal take is that we can wait for Google to scrape all of our content to start offering AI-generated summaries on Google (which is already happening - hello Bard!), or we can start doing it ourselves.

I'm sure there's a worthy existential debate about the importance of sourcing content from reputable sources or trusted people on forums, but at the end of the day, if I'm a user searching for the fastest and most accurate solution, do I want a condensed topic (because honestly that's all the topic summary really is) of 30 discrete replies that I still need to read through and interpret, or do I want an actual well-written summary that's short, sweet and to the point?

There's a race to convenience that LLMs are unleashing on our public content. We can either own those summaries ourselves, with links and references to other parts of our community, or we can let Google do it for us without ever giving us any traffic.  


  • 4 weeks later...
 

The problem is that too many people want to throw AI at everything. I've seen loads of AI based 'executive summaries' on things like Helpscout, Amazon Reviews, etc that honestly do not offer much value.

We are happy with using signals and a mathematical model for this feature. @Matt Finger has been discussing building an LLM for many things, so it's an area we are interested in.

This is so exciting. Would love to have an AI instance that has read every one of our posts and is available for questions. Tell me what x user said about this event in 2021.

Big dreams, I know.  🙂
