Even a cursory glance at the runtime performance difference between these two frameworks reveals that either this project won't scale to the point where cloud costs matter, or the team has made a dubious prioritization of DX over deployment economy. We're talking about orders of magnitude fewer RPS for Rails.
I don’t understand how you’re tying performance to cloud costs here.
“Deployment economy” is also a new term to me.
Rails has a very strong track record of operating at internet scale.
Cloud infrastructure is highly optimized for traditional server applications. In my experience with Next.js, it's the opposite: many deployment components don't fit in naturally, and real engineering effort is required to keep costs under control.
Quite simply: past a certain threshold of users, you will be forced to add many more cloud instances/pods running Rails than you would need running Node.js (or Java, or Go, or many others). And it doesn't stop at instances: more instances also mean more persistent disk / object storage, more logs, more alerts, more notifications from the cloud provider that instance xyz needs to be restarted (because of a firmware upgrade or whatever), and so on. All of these carry human management overhead, and most of them increase the monthly bill. A rough sketch of the arithmetic is below.
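To make the arithmetic concrete, here is a back-of-envelope sketch of the instance-count argument. All of the throughput and cost figures are illustrative assumptions I made up for the example, not benchmarks of either framework:

```python
import math

# Hypothetical numbers for illustration only -- not measurements.
peak_rps = 5_000                 # assumed peak traffic for the app
rails_rps_per_instance = 150     # assumed per-instance throughput for a Rails app
node_rps_per_instance = 1_500    # assumed per-instance throughput for a Node.js app
instance_cost_per_month = 70.0   # assumed cloud cost per instance, USD/month

def instances_needed(rps: float, per_instance: float) -> int:
    """Round up to the whole number of instances needed to serve the load."""
    return math.ceil(rps / per_instance)

for name, per_instance in [("Rails", rails_rps_per_instance),
                           ("Node.js", node_rps_per_instance)]:
    n = instances_needed(peak_rps, per_instance)
    print(f"{name}: {n} instances, ~${n * instance_cost_per_month:.0f}/month")
```

With these made-up figures the Rails deployment needs roughly ten times the instances, and the storage, logging, and alerting overhead scales with that instance count. Change the per-instance throughput assumptions and the gap changes accordingly; the point is only that the multiplier compounds across the rest of the infrastructure.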
It's less expensive now with Rails than our hosting was with Next.js. If there were more traffic, we'd save even more money in comparison. That was mentioned in the post.