Daily Shaarli
October 7, 2025
- Find out what is seen as valuable
- Deliver value as often as possible to get feedback
- Write and maintain integration tests that survive refactoring
- Avoid Object-Oriented Programming, or at least be extra careful with it
- Remember you can still add in that complication tomorrow
- Be conscious of what makes you over-engineer
- Get yourself a better definition of perfection
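The point about integration tests that survive refactoring can be sketched with a tiny example: the test below exercises only the public behavior of a hypothetical `Cart` class (names and API are illustrative assumptions, not from the linked article), never its internal representation, so the internals can be rewritten freely without breaking the test.

```python
class Cart:
    """Hypothetical class under test; its internals may change at will."""

    def __init__(self):
        self._items = []  # internal detail the test never touches

    def add(self, name, price_cents):
        self._items.append((name, price_cents))

    def total_cents(self):
        return sum(price for _, price in self._items)


def test_cart_totals_items():
    # Asserts observable behavior only: add items, check the total.
    # Swapping the list for a dict or a database row keeps this green.
    cart = Cart()
    cart.add("coffee", 350)
    cart.add("bagel", 250)
    assert cart.total_cents() == 600
```

A test written against `cart._items` instead would lock in the data structure and die on the first refactor.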
After a few years of using both (see Optimizing SQLite for servers for example), I've found that SQLite particularly shines when used for internal services or public services where a small amount of downtime is tolerable.
So, I choose PostgreSQL (preferably with a managed provider) if the service needs (close to) 100% uptime, if the service needs more than 5 Gbps of bandwidth, or if the database is expected to grow larger than 200 GB. [...] Basically, all the situations where running on a single server is not possible.
It's important to note that with the advent of DuckDB, Parquet and Apache Iceberg, there are fewer and fewer reasons to stuff your main database with ~~useless junk~~ timeseries data; instead, keep only some kind of materialized view and send the raw data to S3. Thus, there are fewer and fewer reasons for your main database to be over 200 GB.