This post was originally written three months ago, when the incident occurred. A backlog in my personal time kept it from being published until now. Regardless, I hope it still teaches some important lessons.
Never do server administration work on six hours of sleep. You tend to violate many safety practices.
This includes checking which packages you're about to install or remove with apt.
If you scroll back far enough in the status page logs, you'll notice a minor outage on my ownCloud instance a few days ago. Well, "minor" is an understatement. It was a major outage that threatened the data I had on that service: about 19 gigabytes of Key Club images, 71 gigabytes of Speech and Debate images, and 41 gigabytes of my personal data.
The incident started when a tired me, running on six hours of sleep, decided to add a Pterodactyl instance to my server to host Minecraft servers. While checking the dependencies, I copied and pasted a line of apt install instructions from the Pterodactyl website, which wasn't the best idea. My tired eyes skipped over the install confirmation, which included the replacement of MySQL with MariaDB. Now, don't get me wrong: MariaDB is perfectly competent database software. However, I wasn't ready to migrate my existing MySQL data over to it quite yet. So when I confirmed the installation, things went horribly wrong.
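For what it's worth, apt can tell you exactly what a command would do before it touches anything. A simulated run (the package name below is illustrative) prints the would-be removals without changing the system:

```shell
# -s (--simulate) makes apt-get print what would happen without
# actually installing or removing anything; no root needed.
apt-get -s install mariadb-server

# In a real run, stop and read this part of the confirmation prompt
# before pressing Y:
#   The following packages will be REMOVED:
#     mysql-server mysql-server-5.7 ...
```

Thirty seconds of reading that output would have saved the hours described below.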
It took about two hours of tinkering and an hour of searching Stack Overflow to finally figure out what was wrong and how to solve it:
1. Back up the data, so that if I had to do a fresh install, the chances of losing all the database data would be dramatically lower.
A lot of issues came up in this one basic step. Because I couldn't log in with any of the users I had created for my services, and because the unix_socket extension failed to load, I had no way into the database, whether through another service or through the root user. Navigating the filesystem as root, however, I found that MySQL's files lived under /var/lib/mysql. None of the database folders were there, though, because the MariaDB install script had moved them into the mysql-5.7 directory. I ended up copying that directory to my home folder as a backup.
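In shell terms, the backup amounted to something like the following. This is a sketch: the exact path of the relocated data directory is my assumption based on the mysql-5.7 name above, and may differ on other systems.

```shell
# Stop any leftover database processes so the files on disk are consistent
sudo systemctl stop mysql mariadb 2>/dev/null || true

# The MariaDB install script had relocated the old databases into a
# mysql-5.7 directory; copy it home, preserving ownership and modes (-a)
sudo cp -a /var/lib/mysql-5.7 ~/mysql-backup
```

The -a flag matters here: without it, file ownership and permissions are lost, which would cause trouble when restoring later.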
2. Reset the database configuration to the state it was in before the MariaDB install script made its changes.
This proved even more difficult, as the database configuration files were scattered everywhere, from the /etc/mysql directory to the /var/lib/mysql directory. Since restoring the configuration proved unviable, I decided to wipe the installation altogether and restore from the backup. This took several commands:
sudo apt purge mysql-client-5.7 mysql-client-core-5.7 mysql-common mysql-server-5.7 mysql-server-core-5.7 mysql-server dbconfig-mysql # Removes everything remotely related to MySQL, including configs.
sudo apt purge mariadb* # Removes everything remotely related to MariaDB, including the messed-up database configuration.
sudo apt update && sudo apt dist-upgrade && sudo apt autoremove && sudo apt -f install # Does everything possible to clean up the broken package state left by the two databases.
sudo apt install dbconfig-mysql # Reinstalls the MySQL dbconfig for Debian/Ubuntu to manage; this also pulls in the client as a dependency.
sudo apt install mysql-server # Finally installs a fresh new instance of MySQL.
I did run into an issue when reinstalling MySQL, but it turned out to be a stray mysqld instance still hanging around; a quick "sudo killall mysqld" resolved it. From there, I set up the fresh MySQL install through mysql_secure_installation. Finally, I copied the backed-up folder over the new /var/lib/mysql directory and chown-ed it to mysql:mysql to give it the proper permissions. A quick check in phpMyAdmin confirmed the restore was successful, and I was able to start up my ownCloud instance again without any problems.
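The restore boiled down to a few commands, roughly like this (assuming the backup location from the earlier step):

```shell
sudo systemctl stop mysql                    # never restore under a running server
sudo cp -a ~/mysql-backup/. /var/lib/mysql/  # overwrite the fresh data directory
sudo chown -R mysql:mysql /var/lib/mysql     # give MySQL ownership of its files
sudo systemctl start mysql
```

The chown is the step that is easiest to forget: files copied around as root end up owned by root, and mysqld will refuse to touch them.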
That brings us to the conclusion. While I was able to preserve all the data this time, it's clear I might not be so lucky next time. I've since moved the ownCloud services to a separate Docker instance and increased the frequency of database backups to prevent this type of incident from happening again.
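For the curious, "increased database backups" boils down to a scheduled dump along these lines. The schedule, output path, and cron filename are illustrative, not my exact setup:

```shell
# /etc/cron.d/mysql-backup -- nightly logical backup at 03:00
# --single-transaction takes a consistent InnoDB snapshot without locking;
# note that % must be escaped as \% inside cron files.
0 3 * * * root mysqldump --all-databases --single-transaction | gzip > /var/backups/mysql-$(date +\%F).sql.gz
```

A logical dump like this restores cleanly across versions, which is exactly what a raw copy of /var/lib/mysql does not guarantee.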