Hello everyone,
I just noticed that the engine4_core_session table is about 1.6 GiB and holds roughly 3 million rows.
A Google search suggests the table can be emptied because it only stores temporary data. @SocialEngine, can I?
Thanks!
I suggest backing up the database and trying it on a development site first; I've never tried it myself. Also be careful which sites you take advice from, as there are unsafe ones out there. Sites with nulled products are pirate sites and contain hackers, malware, and other bad things.
Hmm, actually it's not possible for us to move the data to a dev environment;
contract and privacy regulations apply to us.
It would be great to get official info on whether this is recommended or not.
The DB table is quite large. Is this intended product behavior or not?
Sebastian Freytag said:
Hmm, actually it's not possible for us to move the data to a dev environment;
contract and privacy regulations apply to us.
It would be great to get official info on whether this is recommended or not.
The DB table is quite large. Is this intended product behavior or not?
Then you could just create a test environment and generate some sessions. Delete the sessions and see if it works. I'll see if I can find an old thread about this as well.
I thought we had an old forum thread for this but can't find it. It may have been in GitHub.
*Update: I found it in our GitHub list of improvements. We will be making an improvement in an upcoming release to delete the entry in the engine4_core_session table when a user logs out, to keep this table from becoming massive. I suggest backing up the database and then trying to truncate this table. The only issue it may cause is that users who are currently logged in will be logged out.
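A hedged sketch of that backup-then-truncate flow, demonstrated against a throwaway SQLite database. The table name matches SocialEngine's, but the columns are simplified stand-ins, not the real schema, and the production database would of course be MySQL:

```python
import sqlite3

# In-memory stand-in for the production database (assumption: the real
# table lives in MySQL and has more columns than this).
db = sqlite3.connect(":memory:")
db.execute(
    "CREATE TABLE engine4_core_session (id TEXT PRIMARY KEY, data TEXT, modified INTEGER)"
)
db.executemany(
    "INSERT INTO engine4_core_session VALUES (?, ?, ?)",
    [(f"sess{i}", "payload", 0) for i in range(1000)],
)

# Always dump a backup first. With MySQL this would be something like:
#   mysqldump mydb engine4_core_session > session_backup.sql
backup = "\n".join(db.iterdump())

# SQLite has no TRUNCATE; DELETE without a WHERE clause is the equivalent.
# On MySQL you would run: TRUNCATE TABLE engine4_core_session;
db.execute("DELETE FROM engine4_core_session")

count = db.execute("SELECT COUNT(*) FROM engine4_core_session").fetchone()[0]
print(count)  # 0 -- every stored session is gone, so logged-in users are logged out
```

The side effect mentioned above falls out directly: once the rows are gone, any user whose session lived in the table has to log in again.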
improvement sounds nice to me
Hi. Any update here? I have the same problem. When I truncate the table, within 30 seconds there are 250,000 new records, but I only have 144 registered users. Within 24 hours the table is back to 1.8 GB and 1.8 million records.
That isn't normal. We use the current SE PHP release on this site with no issues. It could be a DDoS attack (bots). Try anti-spam measures, perhaps Cloudflare for now. Or, if you are using AWS or CloudFront, please check our requirements notes: https://www.socialengine.com/support/article/81788966/se-php-socialengine-requirements
No DDoS attack, Cloudflare is active, and in 90% of cases the IPs generating this traffic belong to Google's bots. What info is stored in this table? Only the login and logout process, or is every connection to content on the site recorded?
You can clear the session junk data in the task manager and see if that helps. The session table holds all of the sessions of users and visitors, for logging. Perhaps just clearing the junk data will help. I know there was a thread, which I wasn't able to find, where a member had deleted their sessions, but I wouldn't do that without trying it first on a development site.
I suggest updating your robots.txt file so that bots don't visit areas they don't need to, such as the login/signup pages.
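For illustration, a robots.txt along these lines would keep well-behaved crawlers away from session-creating pages. The paths here are assumptions; the actual login/signup URLs depend on your SocialEngine install:

```
# Hypothetical robots.txt fragment -- adjust paths to your install.
User-agent: *
Disallow: /login
Disallow: /signup
```

Note that robots.txt only restrains crawlers that choose to honor it; Googlebot does, but abusive bots won't.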
Ribelli.TV said:
No DDoS attack, Cloudflare is active, and in 90% of cases the IPs generating this traffic belong to Google's bots. What info is stored in this table? Only the login and logout process, or is every connection to content on the site recorded?
If it's Google's bots, you might want to (temporarily) block them via robots.txt and see whether they stop hitting your site.
https://developers.google.com/search/docs/advanced/robots/create-robots-txt?visit_id=637527820766451496-2522724197&rd=1
I have had a look at my site which currently has about 200 users and is blocked to all robots, and I have 4000 entries in the engine4_core_session table.
abuk said:
Ribelli.TV said:
No DDoS attack, Cloudflare is active, and in 90% of cases the IPs generating this traffic belong to Google's bots. What info is stored in this table? Only the login and logout process, or is every connection to content on the site recorded?
If it's Google's bots, you might want to (temporarily) block them via robots.txt and see whether they stop hitting your site.
https://developers.google.com/search/docs/advanced/robots/create-robots-txt?visit_id=637527820766451496-2522724197&rd=1
I have had a look at my site which currently has about 200 users and is blocked to all robots, and I have 4000 entries in the engine4_core_session table.
All robots are blocked, but the situation is this: https://sendvid.com/no0y0slf
Your logs show a third-party plugin for PWA. You'll want to contact that developer for help with this, as it shouldn't be doing that.
After seeing this post, and for my own knowledge, I had a look at how session management in SocialEngine takes place and how the table is populated. On my development site it works and is populated as expected: one row for each browser session that I create, which is stored for 24 hours and then removed if the session hasn't been accessed in that time. On my production site I find that many more rows are created than there are actual sessions, but I think this is down to some custom caching I am doing, since the 'data' column is blank.
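The lifecycle described above (one row per browser session, removed once it has gone 24 hours without being accessed) can be sketched as follows. The schema and the 86,400-second lifetime are assumptions inferred from the observed behavior, not SocialEngine's actual code:

```python
import sqlite3
import time

LIFETIME = 24 * 60 * 60  # assumed 24-hour session lifetime

db = sqlite3.connect(":memory:")
db.execute(
    "CREATE TABLE engine4_core_session (id TEXT PRIMARY KEY, data TEXT, modified INTEGER)"
)

now = int(time.time())
# One fresh session and one that hasn't been touched for over a day.
db.execute("INSERT INTO engine4_core_session VALUES ('fresh', 'payload', ?)", (now,))
db.execute(
    "INSERT INTO engine4_core_session VALUES ('stale', 'payload', ?)",
    (now - LIFETIME - 60,),
)

# Garbage-collect: drop any session not accessed within the lifetime.
db.execute("DELETE FROM engine4_core_session WHERE modified < ?", (now - LIFETIME,))

remaining = [row[0] for row in db.execute("SELECT id FROM engine4_core_session")]
print(remaining)  # ['fresh']
```

If the garbage collection runs on schedule, only sessions created faster than they expire (e.g. one per bot request) would make the table balloon the way described earlier in the thread.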
I tried to look at your video, but unfortunately I could not get it to load.
So is it safe to empty it? Will the site keep working as expected, with no errors? I don't want to record users' sessions, and I'd like to clear the table even just once a year to save space and prevent corruption due to its size. Has anyone done this? Many of us don't have a development or test server; we just need official confirmation that we can safely clear this session table and the site will still work without any issues.
I suggest doing a backup before trying it.