
If they built something to warn you when you hit the 0.17% limit (or whatever the magic number is), or simply throttled retrieval to the free limit for you, it would be great.


To retrieve all your files at the free rate would take 588 days. Glacier is not a product geared towards consumers, and the free tier is basically negligible for this use case.
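The 588-day figure follows directly from the free allowance. A minimal sketch of the arithmetic, assuming the allowance works out to roughly 0.17% of your archive per day (about 5% per month, pro-rated daily; these figures are assumptions, not an official quote):

```python
# Assumed free retrieval allowance: ~5% of stored data per month,
# pro-rated daily, which is roughly 0.17% of the archive per day.
free_fraction_per_day = 0.0017

# Days needed to retrieve the entire archive for free.
days_to_retrieve_all = 1 / free_fraction_per_day
print(round(days_to_retrieve_all))  # prints 588
```

So at the free rate the archive trickles out over more than a year and a half, which is the sense in which the free tier is negligible for consumer restores.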

You're right that an upfront retrieval speed vs cost interface is the correct way to handle it, preferably on first use.


I'm interested in the service as a consumer. I keep backups of my important files in my apartment, so I consider data loss a very rare but catastrophic event. Glacier is a reasonably priced insurance for my important files.


Exactly, me too. I have multiple backups of my image and video files in my home (~250 GB, growing fast), and I rotate the disks in a vault in the basement. It's very unlikely that I will lose any file, but it could happen if my whole house burned down or got robbed (vault included). Glacier is relevant in that case for me as a consumer.


Exactly, I am considering this for backup in case of any sort of catastrophic event. Fire, robbery, etc. Events where a bit of money for data retrieval will really be the least of my worries that day (Might it even be possible to get homeowners/renters insurance to cover retrieval costs? Something to consider maybe.)


This is exactly my use case, and I've written glacier-cli to integrate with git-annex in order to fulfil it.


For family photos and unedited family videos, I could probably wait that long in the event of a catastrophic loss of all my local copies.

I'd prefer the speed/cost interface to be at download time. Give the opportunity to reorder the download sequence to prioritise some parts and probably transfer to S3 so the local computer doesn't need to be running throughout the whole data trickle.

That's a useful service that I can't be bothered to build at the moment.


Wow, I didn't realize it was that slow. I think you're right: it's for companies that don't mind paying if they have a data-loss incident. It does support AWS Import/Export, so you can FedEx them hard drives if you need the data fast.


Also, if you are a company that does hourly backups and keeps them for a year, you can grab back 14 of them per day, which is plenty. That seems to be the canonical use case.
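That "14 per day" number can be checked with the same assumed allowance (~5% of stored data per month, pro-rated daily; the 5% figure is an assumption, not quoted from AWS):

```python
# Hourly backups retained for a year -> archives stored.
backups_stored = 24 * 365  # 8760 backups

# Assumed free allowance: 5% of stored data per month, pro-rated over
# ~30 days. With equal-sized backups this is a per-day backup count.
free_per_day = backups_stored * 0.05 / 30

print(int(free_per_day))  # prints 14 (14.6 truncated)
```

With 8,760 equal-sized backups on hand, the daily free allowance covers about 14 of them, which matches the figure above.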


The AWS free tier has always been designed as a "temporary free sample," not as "free for small users" like App Engine.



