I recently read a short piece about using Oracle's Always Free tier as a VPN. What other cool projects have you run on any cloud/hosting provider's always-free offerings? (Pour one out for Angelfire and Geocities.)
Could definitely have used Dynamo as a persistence layer, but the PNGs are transient - the definition is stored in a GitHub gist owned by the user (another free tier! :). Generating the PNGs is expensive, though, so they get cached in S3. An S3 lifecycle rule automatically prunes PNGs older than a month, and they'll get regenerated from the gist if needed.
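The expiration is just a standard S3 lifecycle rule. Here's a rough boto3 sketch of the setup; the bucket name and key prefix are placeholders, not the real ones.

```python
# Sketch: expire cached PNGs after 30 days with an S3 lifecycle rule (boto3).
# Bucket name and prefix are placeholders.
import boto3

s3 = boto3.client("s3")
s3.put_bucket_lifecycle_configuration(
    Bucket="example-png-cache",
    LifecycleConfiguration={
        "Rules": [{
            "ID": "expire-cached-pngs",
            "Filter": {"Prefix": "png/"},
            "Status": "Enabled",
            # Pruned after a month; regenerated from the gist on the next request.
            "Expiration": {"Days": 30},
        }]
    },
)
```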
A month seems like a long time; I'd have assumed you could probably delete 95% of the images within 3 days. Out of interest, is the month-long expiry rule a guesstimate or something based on actual numbers?
It's just a number, I haven't looked at any request logs to see what hit rates are like.
From a money perspective, never expiring them would be fine, and a good trade-off for better latency on subsequent loads. However, if the underlying gist is deleted, I'd like the PNG to eventually go away.
Don't know if this qualifies as cool but I built a single purpose site to list IMDb ratings of all episodes of a TV show in a grid, which runs on GitHub Pages. It doesn't use any external APIs.
I like the simplicity, nicely done. On the homepage there is a giant empty spot on the right; I'm not sure if that was intentional or if something isn't loading. If there's nothing there now, maybe a screenshot of a sample show would be a nice way to introduce it.
Thanks mate. You're right, on big screens (< 5% of users) the empty spot is a bit of a concern. I'll try to add a gif of a demo. Honestly, this was a build-it, automate-it, and forget-it kind of weekend project, so I forced myself to stop adding features, but I can certainly find some time to improve it a bit.
I was watching the network tab while searching, and no API calls were made for the search. If possible, can you elaborate on how you made the search work this way?
I also loved the minimalist design and how you can see your recently searched terms at the bottom.
Thanks mate. I wanted to build a no-nonsense site, and I think I've achieved my goals within the time allotted for a weekend project.
The architecture is dead simple. IMDb publishes bare-bones datasets multiple times a week. I have a bash script that downloads them, formats them, and loads them into MySQL, from which I export two types of JSON files (a rough sketch of the export step follows the list below):
1) A file with all the TV show names, IDs, ratings, etc. (shows.json) - this is what's used for search. It weighs 2MB compressed and I could certainly optimize it, but considering the low traffic I've stashed that for a later time.
2) A file for every TV show with all the ratings and votes for its episodes. Based on your search, the specific file is fetched to display the ratings. This one-file-per-show setup could also be optimized, but that looks premature at this stage.
Strictly speaking, a database isn't necessary either, but it serves two purposes: 1) I can easily run queries to satisfy some curious show-related questions. 2) The datasets include a ton of stale data (like shows without episodes and vice versa), and I find it easier to clean that up through SQL.
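Roughly, the export step could look like the sketch below. It's an illustration rather than my actual script: the file names match IMDb's published TSV dumps, but it skips the MySQL stage and joins the files directly, and the output fields are assumptions.

```python
# Sketch: build shows.json from IMDb's public TSV dumps (MySQL stage skipped).
import csv, gzip, json

# tconst -> (averageRating, numVotes)
ratings = {}
with gzip.open("title.ratings.tsv.gz", "rt", encoding="utf-8") as f:
    for row in csv.DictReader(f, delimiter="\t", quoting=csv.QUOTE_NONE):
        ratings[row["tconst"]] = (row["averageRating"], row["numVotes"])

shows = []
with gzip.open("title.basics.tsv.gz", "rt", encoding="utf-8") as f:
    for row in csv.DictReader(f, delimiter="\t", quoting=csv.QUOTE_NONE):
        if row["titleType"] == "tvSeries" and row["tconst"] in ratings:
            rating, votes = ratings[row["tconst"]]
            shows.append({"id": row["tconst"], "name": row["primaryTitle"],
                          "rating": rating, "votes": votes})

# shows.json drives the client-side search; the per-show episode files
# would be built the same way from title.episode.tsv.gz.
with open("shows.json", "w") as out:
    json.dump(shows, out)
```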
Thank you. That's precisely why I built this. Yesterday I wanted to catch up on Season 3 of The Grand Tour, and guess which site I checked to see what to watch!
Thanks for the kind words. The use case you mentioned is the primary motivator for the site. That's especially useful for shows you want to revisit or new ones where sequential viewing isn't necessary (like Top Gear etc.)
Thanks mate! Interestingly, The Simpsons was the show that sort of inspired me to build this site. I saw an image on Reddit showing all the Simpsons episode ratings and thought it would be a fun idea to build a website based on the concept.
I run a free GCP micro VPS and also a free Oracle Cloud VPS. I find it useful for running very low volume web app experiments and for light coding from portable devices (good Mosh+Emacs+tmux setups possible for iPhone, iPad, and Android devices).
Also, I use Leanpub to write my books [1], which is free to use (and they pay good royalties). A recent hack: I used to offer free downloads of my eBooks on https://markwatson.com (which I run on a free VPS), but then I realized that by setting the minimum price of my Leanpub books to $0.00, I didn't even have to do that. About 50 people download a free book for every one who chooses to pay, so I hope Leanpub isn't losing money on me (they are nice people).
The MetaMask Chrome extension doesn't let me visit your site because it seems to be on a list used for "Ethereum Phishing Detection".
Haven't seen this before, just thought to let you know.
It's a Gatsby website. I used to use Vercel with their free deployments, but because I also needed a subdomain with PHP, I switched to my cheap $3/month (unlimited websites) webhost to serve the static files; I also use Cloudflare for their free CDN. The app demos are hosted on GitHub through Releases, and the storefront is "free" through Chec.io in that I only pay a small fraction of every transaction.
I run OnlineOrNot (https://onlineornot.com/), which itself has a generous free tier. About 5 days of the month I rely on AWS's free tier for AWS Lambda; I managed to run on Vercel.com's free tier with Next.js until around my 8th paying customer, and my business update newsletter relies on ConvertKit's free tier.
I have a small mailing list for a few selected friends running on Heroku (hosting a small website and using the Scheduler add-on to fire emails) plus the SendGrid free tier.
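For illustration, the scheduled job boils down to something like the sketch below, using the SendGrid Python client; the addresses and content here are placeholders, not the real list.

```python
# Sketch: the kind of script Heroku Scheduler could run to fire the emails.
# Sender, recipients, and content are placeholders.
import os
from sendgrid import SendGridAPIClient
from sendgrid.helpers.mail import Mail

message = Mail(
    from_email="list@example.com",
    to_emails=["friend@example.com"],
    subject="This week's update",
    plain_text_content="Hello from the free tier!",
)
SendGridAPIClient(os.environ["SENDGRID_API_KEY"]).send(message)
```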
I also have a dynamic website using a Python CMS running on free Heroku, which the admin uses to update the site. A Heroku Scheduler job kicks off a Netlify build every 6 hours, which downloads the dynamic website and publishes it as a static site.
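Roughly, the scheduled piece amounts to hitting a Netlify build hook; this is a sketch, and the hook ID is a placeholder. On the Netlify side, the build step crawls the Heroku-hosted site and publishes the result as static files.

```python
# Sketch: a script Heroku Scheduler can run every 6 hours to ask Netlify to rebuild.
# The build hook ID is a placeholder.
import requests

NETLIFY_BUILD_HOOK = "https://api.netlify.com/build_hooks/YOUR_HOOK_ID"
response = requests.post(NETLIFY_BUILD_HOOK)
response.raise_for_status()
print("Triggered Netlify build:", response.status_code)
```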
Sketchviz [1] runs almost entirely on free tiers:
- Cloudflare to avoid big bandwidth bills from AWS for the wasm-compiled graphviz that is on every page
- AWS Lambda's 1M free requests and 400,000 GB-seconds for expressjs (web serving) and graphviz (rendering hostable .png images)
- API Gateway's 1M free requests for routing HTTP -> AWS Lambda (for the app) or S3 (for hosted PNGs)
- S3's 20K GET requests/mo (for hosted PNGs)
The only one I exceed is S3, but it's dirt cheap.
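For the curious, the PNG path is basically cache-aside against S3. The sketch below illustrates the idea in Python with boto3 (the real app is expressjs on Lambda); the bucket name, key layout, and the assumption that a dot binary is available in the Lambda environment are all placeholders.

```python
# Sketch of the render path: serve the PNG from the S3 cache if present,
# otherwise run Graphviz and store the result so future requests hit S3.
# Bucket, key layout, and dot invocation are assumptions, not the real code.
import subprocess
import boto3
from botocore.exceptions import ClientError

s3 = boto3.client("s3")
BUCKET = "example-png-cache"

def png_url(gist_id: str, dot_source: str) -> str:
    key = f"png/{gist_id}.png"
    try:
        s3.head_object(Bucket=BUCKET, Key=key)   # cache hit: already rendered
    except ClientError:
        png = subprocess.run(                    # cache miss: render with dot
            ["dot", "-Tpng"], input=dot_source.encode(),
            capture_output=True, check=True,
        ).stdout
        s3.put_object(Bucket=BUCKET, Key=key, Body=png, ContentType="image/png")
    return f"https://{BUCKET}.s3.amazonaws.com/{key}"
```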
[1]: https://sketchviz.com/new