louis's Blog

Software developer #Go #CommonLisp #JS #SQL. #LispWorks user. Soft spots for #Emacs #SmallWeb. Recently becoming #OpenBSD enthusiast. #LinuxMint as a daily driver. Recovering Apple addict.

Author of the Tuner app for Linux.

Other hobbies: #Running #FireFighter #StarTrek

I’m now almost done migrating PG to MySQL, using Stored Procedures only. I ended up with 140 Stored Procedures. The insights I gained into the business domain are incredible.

Now there are some bigger challenges:

  1. How to test an API that literally has hundreds of different endpoint + parameter combinations against the new version?

  2. How to transfer the data of a 100GB+ PG database to MySQL fast enough that downtime stays under 15 minutes?

  3. Or, even more challenging: how to transfer 60 PG tables to MySQL with a “slightly” optimised schema, given a buggy pg_dump exporter that wrongly decodes JSON values into unreadable data (bug filed in 2015, maintainers not interested), and a buggy PG-to-MySQL Foreign Data Wrapper that fails on Boolean and JSON columns (bug filed in 2020, maintainers not interested)?
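For challenge 1, one approach (my assumption, not something the post describes) is differential testing: replay the same recorded requests against the old PG-backed API and the new MySQL-backed one, and diff the responses semantically rather than byte-for-byte. A minimal sketch of the comparison step, assuming JSON endpoints:

```go
package main

import (
	"encoding/json"
	"fmt"
	"reflect"
)

// sameJSON reports whether two JSON payloads are semantically equal,
// ignoring key order and insignificant whitespace. Both sides decode into
// generic values (maps, slices, float64), so harmless formatting
// differences between the two backends don't raise false alarms.
func sameJSON(a, b []byte) (bool, error) {
	var va, vb interface{}
	if err := json.Unmarshal(a, &va); err != nil {
		return false, err
	}
	if err := json.Unmarshal(b, &vb); err != nil {
		return false, err
	}
	return reflect.DeepEqual(va, vb), nil
}

func main() {
	// Same object, different key order and whitespace: should compare equal.
	eq, _ := sameJSON(
		[]byte(`{"id": 1, "tags": ["a", "b"]}`),
		[]byte(`{"tags":["a","b"],"id":1}`))
	fmt.Println(eq)
}
```

The harness around this would loop over a recorded list of endpoint + parameter combinations, fetch each from both base URLs with net/http, and report any pair where sameJSON returns false.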
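For challenge 2, a common pattern (again my assumption, not the post's plan) is to bulk-copy the vast majority of the data while the old database is still live, remember a watermark (the highest primary key or timestamp copied), and then during the cutover window copy only the delta past that watermark, so downtime covers just the final sync. Keyset pagination keeps each batch an index range scan instead of an ever-slower OFFSET. A sketch with a hypothetical table and key column:

```go
package main

import "fmt"

// nextChunkQuery builds a keyset-paginated SELECT for one copy batch.
// Filtering on the primary key (rather than using OFFSET) means every
// batch costs the same, even deep into a 100GB+ table.
func nextChunkQuery(table, pkCol string, lastPK int64, limit int) string {
	return fmt.Sprintf(
		"SELECT * FROM %s WHERE %s > %d ORDER BY %s LIMIT %d",
		table, pkCol, lastPK, pkCol, limit)
}

func main() {
	// First batch starts below the smallest key; each subsequent batch
	// passes in the last primary key it saw.
	fmt.Println(nextChunkQuery("orders", "id", 0, 10000))
}
```

The copy loop would run this per batch, insert the rows into MySQL, and advance lastPK until a batch comes back empty.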

I’ve tried 10 different tools that advertise themselves as solutions to this, and not a single one was able to overcome these challenges (issues with JSON, timestamp, and Boolean columns). Any hints?
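One workaround worth sketching (an assumption on my part, not a tool the post tested): bypass pg_dump and the FDW entirely and export each table with a plain COPY, casting the columns the tools choke on. JSON columns cast to text survive as-is, and Postgres boolean::int yields 0/1, which MySQL TINYINT accepts directly. A generator for such statements, with a hypothetical table and columns:

```go
package main

import (
	"fmt"
	"strings"
)

// copyStmt builds a Postgres COPY statement that casts the problem columns
// before export, so the resulting CSV contains plain-text JSON and 0/1
// integers instead of values a buggy exporter would mangle.
func copyStmt(table string, cols []string, casts map[string]string) string {
	exprs := make([]string, len(cols))
	for i, c := range cols {
		if t, ok := casts[c]; ok {
			exprs[i] = c + "::" + t
		} else {
			exprs[i] = c
		}
	}
	return fmt.Sprintf("COPY (SELECT %s FROM %s) TO STDOUT WITH (FORMAT csv)",
		strings.Join(exprs, ", "), table)
}

func main() {
	// Hypothetical table: cast the json column to text and the bool to int.
	fmt.Println(copyStmt("users", []string{"id", "settings", "active"},
		map[string]string{"settings": "text", "active": "int"}))
}
```

The CSV produced this way could then be loaded on the MySQL side with LOAD DATA INFILE, one statement per table.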

So if “interoperability” is a goal of the SQL standard, it has clearly failed. If “interoperability” is a benchmark for open-source databases, Postgres doesn’t shine at all. All the features that make Postgres “so good” (like ARRAYs, which no other SQL database understands, BOOLs, and Custom Types) in fact lock your project in, like, forever.

However, I’m not one to give up easily. I’ll likely end up with a hand-rolled migration tool, and then sell it to make a fortune off all those non-existent devs who want to migrate away from Postgres. :neofox_evil:​

#sql #mysql #postgresql

To like or reply, open original post on Emacs.ch