Enshittification of Postman (no offline mode), the performance crisis
I already know Hurl, which can be useful here. The author lists the criteria for an ideal API tool:
- local-first
- file-system centric to be stored in the VCS
- zero login wall
- git native collaboration
- native performance
- extensible design
- universal imports (OpenAPI, GraphQL, ...)
- proxy agnostic: designed to route traffic through any interception tool; a proxy-aware or browser-based architecture is a must-have
- scripting & auth flows: pre-request and post-response hooks
- straightforward testing: built-in support for writing and running tests against API responses in code
For every Postman or Insomnia, there's a Bruno, Hurl, or HTTPie.
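Hurl checks several of these criteria at once: requests and assertions live in plain-text files that diff cleanly in the VCS and run from the CLI, with no login wall. A minimal sketch (the URL and assertion are illustrative, not from the article):

```hurl
# users.hurl — plain text, git-friendly, run with: hurl --test users.hurl
GET https://api.example.com/users
HTTP 200
[Asserts]
jsonpath "$.users" count >= 1
```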
A backend as a service? It may be useful someday.
Fast, flexible mock API server powered by JSON configuration and a lightweight scripting language (rjscript).
How to serve a TypeScript frontend?
Using a proxy mounted on a route that did a passthrough to a Vite front-end app. In production, we switched out that proxy for a StaticDir.
I use Vite. In development, the Vite dev server proxies requests to the (axum) backend; for production, Vite compiles the frontend into a bundle that can be served by axum on a specific route. I'm sure ServeDir would work with this setup, but I actually embed the bundle in my executable with a small macro, which makes deployments stupid simple.
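The dev-time half of that setup can be sketched in `vite.config.ts` (the `/api` prefix and backend port are assumptions, not from the quote):

```typescript
// vite.config.ts — during `vite dev`, forward API calls to the axum backend
import { defineConfig } from "vite";

export default defineConfig({
  server: {
    proxy: {
      // assumes the axum backend listens on localhost:3000
      "/api": "http://localhost:3000",
    },
  },
});
```

In production this proxy disappears: the built bundle is served by the backend itself, so frontend and API share one origin in both modes.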
With containers, virtualized processes run natively in the host kernel, like any other process, except that their I/O is carefully kept segregated from the rest of the host system.
Thought: containers are often too heavy for the job.
The root cause behind the heavy weight of containers is that they have been built for too many use cases.
WASI is a standard API to give WASM code the ability to do system-level I/O.
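Concretely, a program that only needs standard I/O can target WASI with no source changes; a sketch, assuming a Rust toolchain with the `wasm32-wasi` target and a runtime such as Wasmtime (build: `rustc --target wasm32-wasi main.rs`, run: `wasmtime main.wasm`):

```rust
// Compiles unchanged to wasm32-wasi; println! is routed through
// WASI's fd_write instead of a direct host syscall.
fn greeting() -> &'static str {
    "hello from WASI"
}

fn main() {
    println!("{}", greeting());
}
```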
Solution?
To try to address this, we wanted to move all these heavy dependencies into a common runtime shared across services. So your tokio, hyper, sqlx and co (in the case of Rust) now all belong to a long-lived containerized process running persistently in the cloud, whereas all your service logic, database and endpoint code is built into lightweight WASM modules that are dynamically loaded in place by this global persistent process.