pbrumm's comments

There is wal-g, which moves the WAL files to S3, and you can spin up any number of instances off of that. It works great for catching up secondary servers.
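For reference, a minimal sketch of the wiring: a couple of lines of Postgres config plus the bucket location in the environment (the bucket name here is a placeholder, and exact settings depend on your wal-g and Postgres versions):

```ini
# environment for the postgres user (bucket name is a placeholder)
WALG_S3_PREFIX=s3://my-wal-bucket/pg

# postgresql.conf on the primary: ship each finished WAL segment to S3
archive_mode = on
archive_command = 'wal-g wal-push %p'

# on a replica or restored instance: pull WAL segments back down
restore_command = 'wal-g wal-fetch %f %p'
```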


From a quick look at it, is that more like pg_dump or Barman?


Have you tried switching it to a job queue where the GPU instances try to keep themselves busy? That way you can autoscale the GPUs based on utilization. I find it easier to tune, and you can monitor latency and backlogs more easily. It does require some async mechanism on the client side, but I have found it easier to maintain.
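A pull-based worker loop can be sketched roughly like this (the in-process queue, the doubled-payload stand-in for the GPU call, and all names are hypothetical; in production the queue would be an external service):

```python
import queue
import threading
import time

def gpu_worker(jobs: "queue.Queue", results: list) -> None:
    """Pull-based worker: keeps itself busy instead of waiting to be pushed work."""
    while True:
        job = jobs.get()
        if job is None:                         # sentinel: shut down
            jobs.task_done()
            return
        enqueued_at, payload = job
        wait = time.monotonic() - enqueued_at   # queue latency, easy to monitor
        results.append((payload * 2, wait))     # stand-in for the GPU inference call
        jobs.task_done()

jobs: "queue.Queue" = queue.Queue()
results: list = []
t = threading.Thread(target=gpu_worker, args=(jobs, results))
t.start()

for i in range(5):
    jobs.put((time.monotonic(), i))   # client enqueues, then polls/awaits the result
jobs.put(None)
t.join()

print(len(results))    # 5
print(jobs.qsize())    # 0 -- backlog size is the autoscaling signal
```

Because workers pull, adding a GPU instance just adds another consumer; scaling up when `qsize()` (or queue latency) grows is the tuning knob.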


I wonder how many audio conversations were overheard and uploaded during that moment.

Seems like an attack vector for forcing your devices to start recording from the mic and transmitting it.

Even if it just takes down the wifi by maxing out the internet connection or the cellular network. Play a couple of seconds of audio and gigabytes get uploaded.


If you have optimized your math-heavy code, it is already in a typed language, and you still need it to be faster, then you think about the GPU options.

In my experience you can get roughly an 8x speed improvement.

Turning a 4-second web response into half a second can be game-changing. But it is a lot easier to use a WebSocket and show a spinner, or cache the result in the background.

Running a GPU in the cloud is expensive.


Here is a quick test.

The table of contents points to a single JSON object that is 20-ish GB compressed:

https://www.anthem.com/machine-readable-file/search/

All stock libraries will fail.
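You end up writing something incremental instead of calling `json.load`. A rough stdlib-only sketch using `json.JSONDecoder.raw_decode` over a sliding buffer (the toy payload stands in for the multi-gigabyte file; field names are made up):

```python
import io
import json

def iter_array_items(fp, chunk_size=65536):
    """Incrementally yield items of a huge top-level JSON array
    without materializing the whole document in memory."""
    decoder = json.JSONDecoder()
    buf = fp.read(chunk_size).lstrip()
    assert buf.startswith("["), "expects a top-level JSON array"
    buf = buf[1:]
    while True:
        buf = buf.lstrip().lstrip(",").lstrip()
        if buf.startswith("]"):
            return
        try:
            # Safe for object/array items; bare numbers split across
            # chunk boundaries would need extra care.
            item, end = decoder.raw_decode(buf)
        except json.JSONDecodeError:
            more = fp.read(chunk_size)
            if not more:
                raise          # truly malformed or truncated input
            buf += more        # item was split across chunks; read more
            continue
        yield item
        buf = buf[end:]

# toy stand-in for a multi-gigabyte file
fp = io.StringIO('[{"rate": 1.5}, {"rate": 2.0}, {"rate": 2.5}]')
print([item["rate"] for item in iter_array_items(fp, chunk_size=8)])  # [1.5, 2.0, 2.5]
```

For the real files you would layer this over a streaming decompressor rather than reading the 20 GB into memory; a library like ijson takes a similar event-based approach.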


I think it is somewhat like git's creation story. Sometimes a senior dev sees a tool that is close to ideal but needs to work a little differently from what the industry has built.

Databases are up there with encryption in the "don't roll your own" mentality.

But sometimes they don't fit the problem you're solving. Sometimes the data never changes, so why have infrastructure for updates?

Having a big DB running all the time could be too expensive for your business model.

Also, it is good to be curious: what is an index? How does a parquet file look in a hex editor? Why can't I write the underlying DB table outside of Postgres? Why are joins hard?

And then you discover your tools give you a competitive edge.

Most of the time there are existing tools, but sometimes there aren't.
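On the "what is an index" question, the core idea fits in a few lines: map a key to the byte offset of its record, so a lookup is one seek instead of a full scan. A toy sketch (the data and names are made up; a real index is a B-tree on disk, not a dict):

```python
import io

# Write records sequentially, remembering where each one starts.
records = [("alice", "admin"), ("bob", "user"), ("carol", "user")]

f = io.BytesIO()                       # stand-in for a data file on disk
index: dict = {}
for name, role in records:
    index[name] = f.tell()             # byte offset of this record
    f.write(f"{name},{role}\n".encode())

def lookup(name: str) -> str:
    f.seek(index[name])                # jump straight to the record
    return f.readline().decode().rstrip("\n")

print(lookup("bob"))   # bob,user
```

Everything a real database adds, such as persistence, ordering, and concurrent updates, is elaboration on that offset map.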


Insurance price transparency can involve 16 GB of compressed JSON that represents a single object.

Here is the Anthem page. The TOC link is 16 GB:

https://www.anthem.com/machine-readable-file/search/

They are complying with the mandate, but not optimizing for the parsers.


There are some exciting ways to keep using the Heroku buildpacks with Google Cloud Build to host on GCP. You can maintain git-push deploys, and costs can be pretty inexpensive. The primary cost is the managed Postgres.

https://cloud.google.com/products/calculator#id=46eb44ca-9ac...

It isn't the same as a free plan, but it can provide a very similar ease of use. My profile has contact info if you want more detail.
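One way to wire it up is a `cloudbuild.yaml` along these lines, run from a build trigger on push (a sketch only: the image name, app name, region, and builder choice are all assumptions, and the builder image tag may differ in your setup):

```yaml
steps:
  # Build the app with a Cloud Native Buildpacks builder (keeps the Heroku buildpack flow)
  - name: gcr.io/k8s-skaffold/pack
    entrypoint: pack
    args:
      - build
      - gcr.io/$PROJECT_ID/myapp
      - --builder=heroku/builder:24
  # Push the image, then deploy it to Cloud Run
  - name: gcr.io/cloud-builders/docker
    args: [push, gcr.io/$PROJECT_ID/myapp]
  - name: gcr.io/google.com/cloudsdktool/cloud-sdk
    entrypoint: gcloud
    args: [run, deploy, myapp, --image, gcr.io/$PROJECT_ID/myapp, --region, us-central1]
images:
  - gcr.io/$PROJECT_ID/myapp
```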


I enjoy being a lefty. My grandfather was trained to be a righty except for golf, so I noticed early on how things were not made for me and that I had to do extra work. My father uses a lot of power tools, and the extra dangers were apparent there as well.

Overall it has probably helped my programming, as I had been thinking outside the box longer than the righties.


Sounds like we could use this for user attestation.

And be protected from DNS or BGP hijacking.

