I click through every formatter-related post because I am looking for something better than what I have. My team at work wanted to use Prettier, but lack of configurability (and also heinous bugs!) made me switch to dprint[1].
But even dprint doesn't do a couple of the things I want, #1 of which is "For the love of science, please stop deleting the intentionally placed blank line after a hugely long class declaration!" E.g.:
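Something like this, say (every name below is invented just to reproduce the shape of the problem, and the scaffolding types exist only so the snippet stands alone):

```ts
// Scaffolding so the example is self-contained; all names are made up.
interface PeriodicallySchedulable { schedule(): void }
interface GracefullyStoppable { stop(): Promise<void> }
abstract class AbstractBackgroundReconciliationTask<T> {
  protected abstract run(input: T): void;
}
interface OrganizationMembershipSnapshot { userId: string }

export class OrganizationMembershipReconciliationScheduler
  extends AbstractBackgroundReconciliationTask<OrganizationMembershipSnapshot>
  implements PeriodicallySchedulable, GracefullyStoppable {

  // ^ That blank line between the `implements` clause and this first member
  //   is the one the formatter insists on deleting.
  schedule(): void {}
  async stop(): Promise<void> {}
  protected run(input: OrganizationMembershipSnapshot): void {}
}
```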
I mean, I know I am a bad person because of those long names, but that is how life goes sometimes! And the blank line there at the top is just very important to like, catch one's breath, while reading this code.
(I'm really just posting this in the hopes that somebody will throw me a "Bro, just use hpstrlnt, it totally lets you configure that!" -- I have not actually tried Rome to see if it does (it's Monday morning and I'm not quite ready to be disappointed again...))
[1]: dprint is good, and I recommend it as the best code formatter I currently know of: https://dprint.dev/
I look at tools like Rome and they might have been relevant 5 years ago.
I think Bun has a better shot than Deno because it has a very fast built-in node_modules installer. But I think Bun's goals are too ambitious: trying to be all things to all projects from a single binary. No one cares if you have to use a separate binary, esbuild, for all your bundling needs, for example.
It's amusing how the entire JS ecosystem put up with such slow build times for a decade. If these JS tools teach us anything, it's that JavaScript doesn't cut it for performance and a compiled language is the way to go.
I don't get your central point. What is different about now compared to 5 years ago, when it comes to a code linter and/or bundler?
At first glance, I thought you were saying "this Rome thing might have been relevant 5 years ago, but now Bun and Deno exist, so it is not." But, that didn't really make sense because Bun and Deno are both new and growing and there's no clear winner, or even leader, in the "post-NodeJS runtimey programming thingamajig and execution environment".
Re-reading your comment, though, it seems like what you are really saying is "JavaScript itself is slow, so it is no longer relevant".
But when stated so plainly that sentiment becomes absurd, so as a reader I sort of mentally revise my interpretation of it to "JavaScript is slow (yes), so I personally wish people would just switch to faster languages".
But that doesn't really make sense in this context, either, because this tool is written in Rust.
So... what are you saying? What thot r u the thinkr of here?
My guess is that IO latency is the reason why JavaScript tools are slow: creating lots of separate TCP connections to download tarballs, storing them to disk, decompressing them, and then writing out the files they contain. Lots of small-file IO is slow compared to one large contiguous transfer.
If it is IO that is causing slow project builds, then in theory the IO would be just as slow with compiled tools.
Using compiled languages to build UI in a browser is still a disaster. And yes, I've tried them all, from applets to AS3 to Elm to BuckleScript to Rust via WASM.
That is true. It's easier to get ESLint and Prettier running together than it used to be, but it is still a bit of a hassle. Prettier needs to be added as an ESLint plugin, and that allows all its rules to be defined in the ESLint config.
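For anyone setting this up, a minimal sketch of what that looks like, assuming `eslint-plugin-prettier` and `eslint-config-prettier` are installed:

```js
// .eslintrc.js
module.exports = {
  extends: [
    "eslint:recommended",
    // Registers the Prettier plugin, turns off ESLint's conflicting
    // stylistic rules, and reports formatting drift as ESLint problems.
    "plugin:prettier/recommended",
  ],
  rules: {
    // Prettier's own options can then live here, in the ESLint config.
    "prettier/prettier": ["error", { singleQuote: true, trailingComma: "all" }],
  },
};
```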
Feels like they could stand to merge, like eslint+tslint.
I've been using it for about six months. It is fast and works without major problems. But you have to learn and accept that you cannot tweak many things the way you can in Prettier; you have to use it as it is. That is the philosophy of their tools.
And after a while it is OK. You realize that you used to spend a lot of time tweaking everything, which is not really necessary.
3 years of using prettier and I still curse it daily for not letting me have more than 1 empty line to delineate related sections of code in large files for readability.
Add `// -----------------` in between blocks to work around this issue; IMO it provides much better delineation of content blocks, e.g. imports vs. definitions.
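A sketch of that workaround (file and function names invented):

```ts
import { parseConfig } from "./config";

// ---------------------------------------------------------------------
// Definitions. Prettier collapses a run of blank lines down to one, but
// it leaves comment dividers like these intact, so the visual break
// between imports and definitions survives formatting.
// ---------------------------------------------------------------------

function loadDefaults() {
  return parseConfig("defaults.json");
}
```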
I do miss C#'s `#region` directives, though, which let you define arbitrary named blocks that can be folded.
It is quite stable at the moment. I would still recommend taking a close look at the changes that Rome suggests, especially for large codebases: some bugs are still to be expected.
The LSP (VSCode extension) is less stable at the moment.
The tooling itself is in relatively good shape in my usage, although the VS Code extension currently has a number of rough edges (frequent crashes, etc.).
It's worth a try, but I wouldn't necessarily recommend switching wholesale at the moment.
I've got mixed feelings about Rome. There's so much room for improvement over today's ridiculously slow tools. But I'm sick and tired of people in this industry dropping their toys because they're tired of working on stuff people actually use, instead of just improving what they currently have.
Would it have been impossible to nudge Node.js in the direction of where Deno is today?
Would it have been impossible to replace Babel with a Go implementation?
I also don't want tools that want to be literally everything.
Imagine if Daniel Stenberg was like, "You know what, I'm tired of cURL. Let me rebuild literally the same thing in another language, give it a new name, and entirely different options."
> Would it have been impossible to nudge Node.js in the direction of where Deno is today?
Yes. Do you not remember the whole Node.js vs. io.js split? The community is already pulling Node in tons of different directions.
> Would it have been impossible to replace Babel with a Go implementation?
Again, yes, it would be, by dint of the fact that Babel is made to be a framework where you process JS ASTs using JavaScript. Rewriting it in Go removes the whole ecosystem of JS plugins it built up, which is kind of the entire point of the project. A Go rewrite of certain transforms is a new project.
Also, Deno (along with Bun) did pull Node in its general direction: Node has finally started to ship things like a test runner, a watch mode, and fetch.
This. The same happened with npm, which started to offer new features (like lockfiles) only after Yarn had them. History has shown that competition is the only way to make them move.
I think, in part, it’s because people want shiny tools and lots of new features but nobody wants their code to break when upgrading to a new version. Such is the duality of backwards compatibility. That leads to lots of churning with people working on scratching their own itches instead of “fixing” existing tools, because a lot of times that fixing implies breaking compatibility and upstream won’t merge the changes.
Moreover, whenever I’ve offered up a solution that scratches my itch, people complain that I should have focused on improving the current thing, but typically the maintainers of the current thing aren’t interested and have different priorities (probably very understandably!). When I can fix my problem by contributing upstream, I do so but it’s often prohibitively difficult.
I think you’re underestimating the politics of open source and the thanklessness of making disruptive changes to something where a large and vocal portion of the user base just wants the thing to keep working as it is. Without breaking compatibility it’s impossible to fix the design mistakes of the past, so the most worthy changes are necessarily disruptive.
I wouldn’t discount all-in-one tooling. Look at a language like Rust — its std toolchain includes a doc generator, package creation, type system, linter, formatter, compiler, dependency management, etc. This makes for a very pleasant developer experience. JS has literally none of that, and yet has more strict runtime constraints. (E.g. if your user has to download your software every time they use it, it better download and be ready to use very fast.)
JS has nothing like that, and it has been cool to see more experiments around this type of approach. Many disparate tools and versions are hard to manage because there are so many permutations of options!
> Would it have been impossible to nudge Node.js in the direction of where Deno is today?
I don't think anything necessarily prevents Node from being nudged in that direction — and I think we're seeing the first signs of that with Node 20 — but the implication of nudging is that it takes a comparatively much longer amount of time. Deno has its own set of challenges as a result of being an independent project with its own set of breaking differences, but I think the goal with Deno was to try to push the JavaScript server ecosystem forward rather than to slowly nudge it forward (with the sizable resistance that entails).
cURL is more singular here in comparison; there’s no rapidly-changing ecosystem that dictates the pace at which it changes or improves, but that’s not the case for the JavaScript ecosystem (see: Node’s many problems in keeping pace with the adoption of modules over the last several years).
The Node/JS/React world is full of thought leaders who gain reputation and clout by releasing new tools/libraries, convincing the masses to adopt them and pay for expensive consulting/courses on them (instead of just having adequate documentation to begin with), and then abandoning it altogether to move onto their next "product" to "sell". Rinse and repeat. It's why if you're not paying attention, JS best practices will seem completely different in one year's time.
Kind of a pessimistic view, but I do think this is what happens.
> But I'm sick and tired of these people in the industry dropping their toys because they're tired of working on stuff people actually use instead of just improving what they currently have.
I sometimes have the same feeling. I often dream of a "united community" working on a "perfect tool".
But there are so many differences within the community. In reality – as in most human systems – people create their own "dream tools", and the success of new tools pushes earlier tools to try to catch up.
Because Deno has some traction, it forces Node.js to catch up. I think Bun has accelerated some changes in Deno's Node.js compatibility.
In the process, a lot of effort seems lost. However, this is the way humans and communities work: people learn by imitating others, and change is often pushed by new tools and systems that have more freedom to evolve.
If Node.js changed rapidly towards Deno we'd be back to folks complaining about JS fatigue.
Both Node.js and Deno have strong reasons to exist. Node has a massive installed base that values slightly more stability at this point. Deno benefits from being able to boldly explore big changes.
This reminds me of when npm was very slow and had some issues. So Yarn was awesome. Then npm got fixed and I don’t see a need for Yarn anymore.
There's a ton of value in everyone using the same tool, even if it isn't perfect for every specific use case. Now README files regularly explain both yarn and npm installs, and the community needs to be aware of the existence of two basically identical tools.
Yarn 3 is still better than npm these days due to PnP, if you choose to use that, but even if not, it's also much faster at installing node dependencies than npm.
It could have to do with the massive usage of JavaScript.
But, I only see this happening in the JavaScript space.
There seems to be something intrinsically wrong with JavaScript and Node.js given that so many of the tools used to manipulate the language are written in other languages.
Too many, I believe, for the same reasons as JavaScript. It's not a bad thing when the target language is simply unfit for developer tooling, and in the case of JavaScript and Python the problem is simply speed. Doing massive AST manipulation in a language that is both slow and unsafe like JavaScript just seems like madness, and it's strange that we've been kind of okay with it for the longest time.
This is being pedantic; that's not the point. Do you have to literally abandon the project and rewrite it, leaving all your users out to dry?
Would you get divorced from your spouse if something minor wasn't working out, too?
You can't rewrite something complex. If you try to rewrite it, you won't achieve feature parity, so it will be something different, at least while you're building it, and you will perpetually chase your target. You MAY try to build something somewhat compatible by starting small (not a replacement), or you can fork and gradually diverge.
Waiting minutes for your JS-based toolchain several times each day is not something minor! I learned ReasonML and got spoiled: builds there took fractions of a second. My first time with TypeScript made me want to cry. Some teammates disabled the pipeline before pushing; guess what happened.
This is typical FE tech-stack engineering: invent new solutions to solved problems and sell the solution as the next hot thing. And what follows is what causes people to dread FE work.
Creating something new is always more fun than maintaining something old.
The other problem with existing, working projects is that they have maintainers and a community, so you can't change things radically; that is a really long process.
And sometimes competing projects improve each other.
> Would it have been impossible to nudge Node.js in the direction of where Deno is today?
It's not the first time Node.js has required "forking". And so what if we do nudge Node yet again? It will just keep repeating itself unless there are bigger changes.
> Would it have been impossible to replace Babel with a Go implementation?
More like the world has changed since IE left the chat: we need fewer (and different) polyfills.
* TypeScript built in. You might dismiss it as easy, but it's only easy in the sense that it's not a lot of work if you've done it before. It's still a massive faff and a paper cut. Whatever the next thing up from a paper cut is. Toe stub?
There's this incredible phenomenon on this website where someone posts "Show HN: <tool>, but modern" where <tool> is something used by nearly every Linux machine in the last 40 years, and what makes it "modern" is Rust, inexplicably.
Is there a JSON formatter out there that can be configured specifically for numeric arrays? I'm looking for something that formats 1D arrays on one line and 2D arrays as one line per row.
FracturedJson gets close, but not exactly what I want.
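For illustration, this is the kind of output I'm after (data made up):

```json
{
  "thresholds": [0.25, 0.5, 0.75, 1.0],
  "rotation": [
    [1.0, 0.0, 0.0],
    [0.0, 0.0, -1.0],
    [0.0, 1.0, 0.0]
  ]
}
```

i.e. the 1D array stays on one line, while each row of the 2D array gets its own line.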
What I heard, and what seemed to be considered "common knowledge" about a year ago, was that the founder embezzled the money. His VCs considered suing but decided it was just too messy. I'm not joking: something about some seriously extravagant vacations and a house renovation, all paid right out of the company's bank account.
With payroll taxes and benefits, an employee costs ~2x TC. Since startups frequently underpay, let's say the average salary was ~150k in TC for early employees within SC. That's 300k * 5 = 1.5 million gone per year, or 3M over two years. What's left over for finance, legal, rent, and other expenses is kind of tight, and you definitely won't make it to a 3rd year without customers or other financial infusions. That's why a company has to find some kind of product-market fit quickly: you have to show there's a there there so you can continue growing, either through revenue growth alone or by raising another round.
2x TC seems high. For one direct example I know of, the benefits package (including all the insurances, with all premiums paid by the employer) costs around $25k/yr. I feel like at a $150k salary it's probably more like $200k total cost. The payroll tax employers are responsible for seems to be around 7-8%. Call it $30k benefits and $12k extra payroll tax, which leaves $8k for equipment and other expenses. That would be about $1M/yr for 5 employees.
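A quick sanity check of that arithmetic (the 7.65% employer FICA share is my assumption; the other figures are from the comment above):

```ts
// Back-of-the-envelope fully loaded cost for one employee.
const salary = 150_000;
const benefits = 30_000;            // insurance etc., premiums employer-paid
const payrollTax = salary * 0.0765; // employer share of FICA, ≈ $11,475
const equipment = 8_000;            // hardware and other per-head expenses

const perEmployee = salary + benefits + payrollTax + equipment; // ≈ $199,475
const teamOfFive = 5 * perEmployee;                             // ≈ $997,375/yr

console.log({ perEmployee, teamOfFive });
```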
I think the 2x number is a rough ballpark that includes the total cost of employment (e.g. HR salaries, outsourced HR, etc.). It's entirely possible that this overhead is smaller for startups, and it may have shifted over time (this is a rule of thumb I was told by a few people 10 years ago).
It's not an official source, but it seems to be the case according to this discussion [0]. Searching the social media accounts turns up nothing, and Sebastian [1] hasn't published anything more about Rome since December.
Coming from using yarn, it is way easier to move to pnpm than to yarn 2+. It's now my go-to. The only issue I find is that you still won't get good community support for it: things like Dependabot and some edge-hosting platforms' build agents.
For me there was always a problem with the pnpm cache working together with Vite when doing pnpm link, whereas yarn link works perfectly. So even though pnpm is much faster, I went back to yarn.
I don't think this is true; yarn 2 is basically a drop-in replacement. You may be thinking of the PnP stuff, but that's off by default now. pnpm used to be the difficult one, but it seems a bit better these days.
From personal experience introducing pnpm into a big JavaScript project, here's my very subjective(!) view of the situation:
pnpm offers fast installs and the best dependency management capabilities I have seen.
However, there is a steep learning curve attached to the latter. You cannot just task a random developer with fixing emerging problems, or else you will be left with half a dozen large and unintelligible config files and hacks.
If you are the type of team that never updates dependencies, it is not worth the effort. If you are the type of team that applies the heuristic of "yes" to the question "Should I download a dependency?", it is not worth the effort.
On the other hand, if you value build tool performance, if you are updating frequently and if your dependencies are carefully curated, pnpm is great and with Node >16 it is just a...
Rome doesn't have many rules; that is their philosophy. You should not waste time on every single configuration option; just use it as it is.
https://docs.rome.tools/lint/rules/
OK, not having many rules is fine, but not allowing me to create my own is a real downside compared to ESLint. Creating arbitrary static checks is super powerful, and actually useful for overcoming the limits of what can be done with TS typing alone.
ESLint can get VERY slow on large apps. Rome seems to be the perfect answer to that, but I can't seem to find a way to port/import ESLint rules into Rome.
For now, Rome implements most of the ESLint recommended rules (including TypeScript ESLint) and some additional rules that are enabled by default. In the future, you can expect a recommended preset that is a superset of the ESLint recommended preset. So if you're not heavily customising ESLint, you should be able to use Rome.
Otherwise, most of the rules are not fine-tunable in the way that ESLint's are. Rome tries to provide the experience that Prettier brought to the formatting landscape: good defaults for a near-zero-configuration experience. It tries to adopt the conventions of the JS/TS community. Still, some configuration is provided where the community is divided on certain opinions (e.g. space vs. tab indentation, always vs. as-needed semicolons, ...).
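Those few options live in the `rome.json` configuration file. A minimal sketch (option names here reflect recent releases and may shift, so treat it as illustrative):

```json
{
  "formatter": {
    "enabled": true,
    "indentStyle": "space"
  },
  "javascript": {
    "formatter": {
      "quoteStyle": "single",
      "semicolons": "asNeeded"
    }
  },
  "linter": {
    "enabled": true,
    "rules": { "recommended": true }
  }
}
```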
There is an open issue [1] for listing equivalent rules between ESLint and Rome. Expect more documentation in the future, and maybe a migration tool.
If I had been one of the founders of Rome, I would have pushed for more compatibility with ESLint: in particular, using the same naming conventions (and thus the same names) for most rules, and recognising ESLint ignore comments.
That's because it only supports a very small subset of lint rules. As best I can tell, the rules are very opinionated, often in weird ways, and not nearly as complete as ESLint's.
It works as it should; as a user you have to accept that Rome works in a specific way.
If you can do that and live with it, Rome is an excellent tool. It took me some weeks to accept the "Rome way", and now I am more than happy with it and no longer waste time on endless configuration of ESLint/Prettier.
Rome is less opinionated than it used to be. However, I admit it is still more opinionated than ESLint. It is part of its DNA.
That said, Rome is also trying to provide a smooth experience, and we are open to relaxing some rules where it makes sense. If you have some time, we would like to hear about the rough edges of Rome.
Makes sense. In 0.x.y releases, any version changes are allowed to have backwards-incompatible changes. Authors typically use "minor" version changes to indicate this.
It's not exactly unusual to go to non-zero major versions that way, especially if it's been at 0.x for a while and people are thinking/talking about those versions. No confusion between 0.16 and 1.6 etc.
Why write something like this in Rust, since it presumably doesn't have anything memory-intensive or real-time going on? Rust's manual memory management is a necessary thing sometimes, but it is a pain, right?
Ha - I was thinking the exact opposite. I'd have figured that tools that require an understanding of an entire project (imports/exports/types/etc.) would need to be VERY memory-intensive, and something that would actually benefit from manual management/Rust.
I don't know much about compilers/formatters/linters and how they work, though, so I could easily be wrong.
For some embedded applications, fixed memory regions work fine. For something like a compiler, not so much. Remember that you have to deal with chunks of data of unpredictable size. Even string literals might occupy megabytes. You could program your way around that, but most of the time these days, it's ok to burn memory (i.e. use GC) for the sake of development velocity.