Ofra Yechiel
Posted March 31, 2016

Hi all,

I'm in the process of testing an upgrade from 3.4.6 to 4.1.9. My board contains ~2M posts. I've set up a copy of the board locally (on my laptop) to run the test, and will also use it to run the final upgrade (take board offline > export > import locally > upgrade locally > export locally > upload to server > import > put board back online).

As many others have posted before, the lengthiest part of the upgrade for me is the background processes that rebuild different types of data. While it is not mandatory that these processes complete before going back online, it is definitely very welcome: until they finish, one can expect all kinds of side effects leading to a bad user experience, such as badly formatted posts, an inability to search, empty activity streams, and even long response times.

So I was wondering: could there be a way to capture the results of this lengthy process after a test upgrade, and reuse those results to save processing time on subsequent upgrades? Theoretically everything is possible, of course, but doing this correctly requires some deep knowledge and understanding of the IPS system, which I still do not have. I'm hoping some of the more experienced users here (and maybe even staff members) can help me figure this out, to the benefit of all future upgraders.

To do this correctly, one should know the following:

- Which processes would we want to save results for? Which of them are the most time-consuming?
- Where are the results of these processes saved? Are they all simply a matter of updating pieces of data "in place"? (I assume the post rebuild simply replaces the content of each post in the DB with its processed result. Is this true for every process?)
- How are items queued for processing? Is there an indication on the item itself of whether it is waiting for processing or has already been processed?
- Can a "last update time" be determined for all of these items? Since we will want to copy data onto a newer version of the DB, in which some older (already processed) items might have been changed by users, we'd want to associate each result with the timestamp of the item's last update before processing, and only copy a result over for items that still have the same timestamp as they did in the test environment.
- After determining that an item's result is valid and can be copied over, how would one prevent that item from being rebuilt again?
- Are there any other considerations to take into account?

I would love to get your input on this idea.

Thanks,
Ofra
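The timestamp-guarded copy-over described in the questions above can be sketched roughly like this. This is only an illustration of the logic, not a real implementation: IPS stores its data in MySQL, so an actual tool would work in SQL or PHP against the real tables, and the record layout here (an `updated` timestamp plus a `content` field per post ID) is a hypothetical simplification.

```python
def merge_rebuilt(live, rebuilt):
    """Copy processed content from a test-upgrade snapshot into the live DB,
    but only for items whose last-update timestamp has not changed since the
    snapshot was taken (i.e. the user has not edited them in the meantime).

    live:    {post_id: {"updated": ts, "content": raw_text}}
    rebuilt: {post_id: {"updated": ts_at_snapshot, "content": processed_text}}
    """
    for post_id, snap in rebuilt.items():
        row = live.get(post_id)
        if row is not None and row["updated"] == snap["updated"]:
            # Timestamps match: the live item is unchanged, so the
            # pre-computed rebuild result is still valid.
            row["content"] = snap["content"]
            # Mark it so a rebuild task could skip it (hypothetical flag;
            # how IPS actually tracks "already processed" is one of the
            # open questions in this post).
            row["processed"] = True
    return live


# Tiny demo: post 1 is untouched since the test upgrade, post 2 was edited.
live = {
    1: {"updated": 100, "content": "raw post 1"},
    2: {"updated": 250, "content": "raw post 2 (edited after snapshot)"},
}
rebuilt = {
    1: {"updated": 100, "content": "<p>processed post 1</p>"},
    2: {"updated": 200, "content": "<p>stale processed post 2</p>"},
}
merged = merge_rebuilt(live, rebuilt)
```

In this sketch, post 1 receives the pre-computed result and is flagged as processed, while post 2 keeps its raw content and would still go through the normal rebuild queue, which is exactly the safe-by-default behaviour the idea needs.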
This topic is now archived and is closed to further replies.