MaNiAc LRSC Posted February 17, 2022

Hello, since activating Elasticsearch the logs are piling up... with empty log entries.
Jim M Posted February 17, 2022

Looking at these, they are connection errors to your Elasticsearch server. You will want to contact your hosting provider or server administrator for assistance.
Skillshot Posted February 18, 2022

There are no errors in the Elasticsearch log, the Elasticsearch instance has status "green", and querying /content/_search from the command line works flawlessly. I don't see any reason why the IPB instance should not be able to talk to the local Elasticsearch instance. If I run a tcpdump I can see tons of these queries:

{"query":{"bool":{"must":[],"must_not":[],"filter":[{"bool":{"should":[{"terms":{"index_class":["IPS\\core\\Statuses\\Status","IPS\\core\\Statuses\\Reply"]}},{"terms":{"index_class":["IPS\\forums\\Topic\\Post"]}},{"terms":{"index_class":["IPS\\calendar\\Event","IPS\\calendar\\Event\\Comment","IPS\\calendar\\Event\\Review"]}},{"terms":{"index_class":["IPS\\nexus\\Package\\Item","IPS\\nexus\\Package\\Review"]}},{"terms":{"index_class":["IPS\\cms\\Pages\\PageItem"]}},{"terms":{"index_class":["IPS\\cms\\Records1","IPS\\cms\\Records\\Comment1","IPS\\cms\\Records\\Review1"]}},{"terms":{"index_class":["IPS\\communitymap\\Markers","IPS\\communitymap\\Markers\\Comment","IPS\\communitymap\\Markers\\Review"]}}]}},{"match_none":{}},{"range":{"index_date_created":{"gt":0}}},{"terms":{"index_permissions":[3,"m96298","*"]}},{"term":{"index_hidden":0}}]}},"sort":[{"index_date_created":"desc"}],"from":0,"size":11}

with a response of HTTP/1.1 200 OK and a body of

{"took":0,"timed_out":false,"_shards":{"total":1,"successful":1,"skipped":0,"failed":0},"hits":{"total":{"value":0,"relation":"eq"},"max_score":null,"hits":[]}}

so the Elasticsearch instance is clearly working.
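To reproduce the kind of check described above, a minimal sketch along these lines can be used. It assumes the local endpoint 127.0.0.1:9200 and the "content" index mentioned in this thread; the esRequest() helper is invented for illustration and is not part of the IPS codebase, and the query body is a trimmed-down version of the one captured via tcpdump.

<?php
// Minimal sketch, assuming the local endpoint 127.0.0.1:9200 and the
// "content" index from this thread. The esRequest() helper is invented
// for illustration and is not part of the IPS codebase.

function esRequest(string $method, string $url, ?string $body = null): array
{
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_CUSTOMREQUEST, $method);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    if ($body !== null) {
        curl_setopt($ch, CURLOPT_POSTFIELDS, $body);
        curl_setopt($ch, CURLOPT_HTTPHEADER, ['Content-Type: application/json']);
    }
    $response = curl_exec($ch);
    $status   = curl_getinfo($ch, CURLINFO_HTTP_CODE);
    curl_close($ch);

    return [$status, json_decode((string) $response, true)];
}

// Cluster health should report "green", as observed above.
[$status, $health] = esRequest('GET', 'http://127.0.0.1:9200/_cluster/health');
echo "health: {$status} {$health['status']}\n";

// A trimmed-down version of the query captured via tcpdump.
$query = json_encode([
    'query' => ['bool' => ['filter' => [
        ['term'  => ['index_hidden' => 0]],
        ['range' => ['index_date_created' => ['gt' => 0]]],
    ]]],
    'sort' => [['index_date_created' => 'desc']],
    'size' => 11,
]);
[$status, $result] = esRequest('POST', 'http://127.0.0.1:9200/content/_search', $query);
echo "search: {$status}, total hits: {$result['hits']['total']['value']}\n";

If both requests return 200 with sensible bodies, the backend itself is reachable and healthy, which matches what the tcpdump above shows.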
Skillshot Posted February 18, 2022

As far as I can see, the errors are mostly from the stream subscription task(s):

#0 /srv/www/sites/www.germanscooterforum.de/htdocs/system/Http/Request/Curl.php(422): IPS\Http\Request\_Curl->_execute()
#1 /srv/www/sites/www.germanscooterforum.de/htdocs/system/Http/Request/Curl.php(298): IPS\Http\Request\_Curl->_executeAndFollowRedirects()
#2 /srv/www/sites/www.germanscooterforum.de/htdocs/system/Content/Search/Elastic/Query.php(1235): IPS\Http\Request\_Curl->get()
#3 /srv/www/sites/www.germanscooterforum.de/htdocs/applications/core/sources/Stream/Subscription.php(145): IPS\Content\Search\Elastic\_Query->search()
#4 /srv/www/sites/www.germanscooterforum.de/htdocs/applications/core/sources/Stream/Subscription.php(90): IPS\core\Stream\_Subscription->getContentForStream()
#5 /srv/www/sites/www.germanscooterforum.de/htdocs/applications/core/tasks/weeklyStreamSubscriptions.php(40): IPS\core\Stream\_Subscription::sendBatch()
#6 /srv/www/sites/www.germanscooterforum.de/htdocs/system/Task/Task.php(367): IPS\core\tasks\_weeklyStreamSubscriptions->IPS\core\tasks\{closure}()
#7 /srv/www/sites/www.germanscooterforum.de/htdocs/applications/core/tasks/weeklyStreamSubscriptions.php(41): IPS\_Task->runUntilTimeout()
#8 /srv/www/sites/www.germanscooterforum.de/htdocs/system/Task/Task.php(266): IPS\core\tasks\_weeklyStreamSubscriptions->execute()
#9 /srv/www/sites/www.germanscooterforum.de/htdocs/system/Task/Task.php(229): IPS\_Task->run()
#10 /srv/www/sites/www.germanscooterforum.de/htdocs/applications/core/interface/task/task.php(58): IPS\_Task->runAndLog()
#11 {main}
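For context, the trace runs Subscription::sendBatch() → getContentForStream() → Elastic\Query->search(), i.e. one search request per stream subscription being processed. The following is a hypothetical, simplified sketch of that pattern, not the actual IPS implementation (the function and field names are invented; only the call order mirrors the trace). It illustrates how a large subscriber base can turn a single digest task run into a burst of Elasticsearch requests.

<?php
// Hypothetical, simplified sketch of the pattern the stack trace suggests:
// one Elasticsearch query per stream subscription per digest batch. The
// function and field names here are invented; only the call order
// (sendBatch -> getContentForStream -> search) mirrors the trace above.

function sendStreamDigestBatch(array $subscriptions, callable $elasticSearch): void
{
    foreach ($subscriptions as $subscription) {
        // Each iteration issues its own POST /content/_search, so a batch of
        // N subscriptions means N HTTP requests to Elasticsearch in a row.
        $hits = $elasticSearch([
            'query' => ['bool' => ['filter' => [
                // Scope results to what this member may see...
                ['terms' => ['index_permissions' => [3, 'm' . $subscription['member_id'], '*']]],
                ['term'  => ['index_hidden' => 0]],
                // ...and to content created since the last digest.
                ['range' => ['index_date_created' => ['gt' => $subscription['last_sent']]]],
            ]]],
            'sort' => [['index_date_created' => 'desc']],
            'size' => 11,
        ]);

        // ... build and queue the digest e-mail from $hits ...
    }
}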
Marc Posted February 18, 2022

There wouldn't be any logs on the Elasticsearch side, as the software is unable to connect to it at those points.
Skillshot Posted February 21, 2022

It seems like something is opening nearly 30,000 connections to the Elasticsearch backend without reusing existing connections (leaving a lot of connections in TIME_WAIT), peaking at around 500 req/s. The queries all look like:

POST /content/_search HTTP/1.1
Host: 127.0.0.1:9200
User-Agent: Invision Community 4
Accept: */*
Content-Type: application/json
Content-Length: 910

{"query":{"bool":{"must":[],"must_not":[],"filter":[{"bool":{"should":[{"terms":{"index_class":["IPS\\core\\Statuses\\Status","IPS\\core\\Statuses\\Reply"]}},{"terms":{"index_class":["IPS\\forums\\Topic\\Post"]}},{"terms":{"index_class":["IPS\\calendar\\Event","IPS\\calendar\\Event\\Comment","IPS\\calendar\\Event\\Review"]}},{"terms":{"index_class":["IPS\\nexus\\Package\\Item","IPS\\nexus\\Package\\Review"]}},{"terms":{"index_class":["IPS\\cms\\Pages\\PageItem"]}},{"terms":{"index_class":["IPS\\cms\\Records1","IPS\\cms\\Records\\Comment1","IPS\\cms\\Records\\Review1"]}},{"terms":{"index_class":["IPS\\communitymap\\Markers","IPS\\communitymap\\Markers\\Comment","IPS\\communitymap\\Markers\\Review"]}}]}},{"match_none":{}},{"range":{"index_date_created":{"gt":0}}},{"terms":{"index_permissions":[3,"m96298","*"]}},{"term":{"index_hidden":0}}]}},"sort":[{"index_date_created":"desc"}],"from":0,"size":11}

In the same interval, MySQL is peaking at 1,400 qps.
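The TIME_WAIT pile-up described above is consistent with each request opening and then closing its own TCP connection. The sketch below is not IPS code; it contrasts creating a fresh cURL handle per request with reusing a single handle so cURL can keep the HTTP connection alive. The endpoint is taken from the capture above, while the trivial match_none query and the loop counts are arbitrary choices for illustration.

<?php
// Minimal sketch of the connection-reuse point above, not IPS code. With a
// fresh cURL handle per request, every POST opens (and then closes) its own
// TCP connection, which is what leaves thousands of sockets in TIME_WAIT.
// Reusing one handle lets cURL keep the connection alive across requests.

$url  = 'http://127.0.0.1:9200/content/_search';
$body = json_encode(['query' => ['match_none' => (object) []], 'size' => 11]);

// Anti-pattern: a new handle (and a new TCP connection) for every request.
for ($i = 0; $i < 100; $i++) {
    $ch = curl_init($url);
    curl_setopt_array($ch, [
        CURLOPT_POST           => true,
        CURLOPT_POSTFIELDS     => $body,
        CURLOPT_HTTPHEADER     => ['Content-Type: application/json'],
        CURLOPT_RETURNTRANSFER => true,
    ]);
    curl_exec($ch);
    curl_close($ch); // the closed socket lingers in TIME_WAIT on the client side
}

// Reuse: one handle, one keep-alive connection for all 100 requests.
$ch = curl_init($url);
curl_setopt_array($ch, [
    CURLOPT_POST           => true,
    CURLOPT_HTTPHEADER     => ['Content-Type: application/json'],
    CURLOPT_RETURNTRANSFER => true,
]);
for ($i = 0; $i < 100; $i++) {
    curl_setopt($ch, CURLOPT_POSTFIELDS, $body);
    curl_exec($ch);
}
curl_close($ch);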
Marc Posted February 21, 2022

Could it be that someone is targeting this? Do you have flood control in place on search?
Skillshot Posted February 21, 2022

The connections are purely local ones. There are no peaks in webserver access. If I follow the traffic via tcpdump, everything originates from localhost, so there is no direct external access to Elasticsearch.
Matt (Management) Posted March 1, 2022

I've moved this into a ticket so our developers can investigate.