
Huge fan of PGlite.

It's the perfect way to run Postgres without needing Docker. With just `npm install`, you get a Postgres instance on your machine, which makes onboarding a new developer onto your team extremely easy.

And the good news: PGlite works perfectly with Next.js.

I'm using PGlite for local and development environments in Next.js Boilerplate: https://github.com/ixartz/Next-js-Boilerplate

With a single `npm install`, you get a full-stack application that includes the database (a working Postgres), with no external tools to install.
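
A minimal sketch of what that looks like in code (the table and data are invented for illustration):

    import { PGlite } from '@electric-sql/pglite'

    // In-memory Postgres; pass a path like new PGlite('./pgdata')
    // to persist data between runs instead.
    const db = new PGlite()

    await db.exec(`
      CREATE TABLE IF NOT EXISTS users (
        id serial PRIMARY KEY,
        name text NOT NULL
      );
    `)

    await db.query('INSERT INTO users (name) VALUES ($1)', ['Ada'])
    const result = await db.query('SELECT * FROM users')
    console.log(result.rows) // [{ id: 1, name: 'Ada' }]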



Definitely interesting. I usually just use docker compose for dev dependencies, but this is a cool alternative.


Yes, it's a very cool alternative, and it removes the need for docker compose in the dev environment.

On top of that, PGlite is also perfect for CI and testing.
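
For example, each test can spin up a throwaway in-memory instance; a sketch with Vitest (any runner works, and the schema is invented for illustration):

    import { describe, it, expect } from 'vitest'
    import { PGlite } from '@electric-sql/pglite'

    describe('user queries', () => {
      it('inserts and reads back a row', async () => {
        // Fresh in-memory database per test: no shared state,
        // no containers, nothing to clean up afterwards.
        const db = new PGlite()
        await db.exec('CREATE TABLE users (id serial PRIMARY KEY, name text)')

        await db.query('INSERT INTO users (name) VALUES ($1)', ['Ada'])
        const { rows } = await db.query('SELECT name FROM users')

        expect(rows).toEqual([{ name: 'Ada' }])
      })
    })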


Wouldn't it be easier to have a shared dev database? That way each developer doesn't have to apply the migrations, import the dumps into their local db, and figure out why dev A's local db is different from dev B's.


IME, creating scaffolding that provides a known, reproducible, and clean state (of both DDL and DML) is a huge systematic productivity and stability boost (i.e. across the team and the estate). Not having to be connected to a shared dev DB frees engineers from latency (every ms counts) and also from the multiple ways that database gets polluted (e.g. Sally was writing Protractor/Cypress tests and now the foo table has 3m rows; George was doing data exploration and manually entered records that now cause a runtime exception for the whole team; etc.).

If a shared dev DB is really what everyone wants, then at least have the scaffolding mentioned above to fix the DB state when pollution happens (it will); it helps heal the inevitable foot-shots. In industrial practice, what you're describing (a shared dev environment) is really the "QA"/pre-prod environment. Ideologically and personally (putting on my downvote helmet): if you can't run the whole stack locally, you're in for a world of trouble down the road. Local first, test extensively there, then promote changes.
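
To make the scaffolding concrete, even a small reset script goes a long way; a rough sketch (the db/migrations.sql and db/seed.sql layout here is hypothetical):

    import { readFile } from 'node:fs/promises'
    import { PGlite } from '@electric-sql/pglite'

    // SQL files checked into the repo, so every dev can rebuild
    // the exact same known state on demand.
    const db = new PGlite('./pgdata')

    // Drop everything, then replay DDL and seed data from scratch.
    await db.exec('DROP SCHEMA public CASCADE; CREATE SCHEMA public;')
    await db.exec(await readFile('db/migrations.sql', 'utf8'))
    await db.exec(await readFile('db/seed.sql', 'utf8'))

    console.log('database reset to a known state')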


I get frustrated when I join a project that doesn't allow running the full stack locally but forces you to share parts of it, which always comes with limitations (not being able to work offline, for starters).

It's already quite easy to spin up a local PG instance with Docker, but this probably makes it even simpler. With a properly set up codebase, importing mock data and running migrations should be a single `npm run` command.
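
Something like this in package.json (script and file names invented):

    {
      "scripts": {
        "db:reset": "tsx scripts/reset-db.ts"
      }
    }

Then a fresh checkout is just `npm install && npm run db:reset`.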


That sounds cumbersome when devs are working in their respective branches and changing the schema (adding migrations).


A shared dev database can be okay for application dev, but a locally deployable database is better imo. Because the migration scripts run locally for every dev, their stability gets exercised constantly. This can/should be combined with snapshot runs that represent your production environments.

This is even more critical if you have more than one "production" environment, wherein different customers/clients may be at different application and db versions from one another. This is often the case for large business and govt work. With shared environments, it's too easy to end up with a poorly adapted set of migration scripts.



