Hacker News | testcross's comments

"Google wants to kill the web" should be the title of this article.


I don't understand why GitLab/GitHub/Bitbucket don't provide better tools for monorepos. The topic is pretty trendy, but there are absolutely no tools helping with access control, good CI, ...


What's missing in these is cross-referencing, which is not possible without a somewhat established BUILD system (caps "pun intended") - e.g. something like Bazel's BUILD files - then a source code indexer, etc, etc.

This becomes very critical for doing reviews, since it allows you to "trace" things without running them, apart from many other things - for example, large-scale refactorings looking for usages of functions, and other cases like it.

Why can't GitHub/GitLab/etc. do it? Well, because there could hardly be one encompassing BUILD system to generate this index correctly.


They could create a standard file format that has to be generated by the build system. GitHub is in a pretty powerful position; they could create even a shitty version of it and people would follow.

I've been thinking about a tool like this for a long time. A way to attach to each commit not only the diff of the code, but also the list of places affected by the changes (usages of modified functions, for example). Then during review we wouldn't only have a dumb diff; we would have a list of places to check to be sure that the changes make sense in the context of the project.
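As a very rough sketch of the idea: extract the names of functions touched by a diff, then scan the repo for their call sites. This is a toy Python illustration (regex-based, with a hypothetical in-memory `repo` dict standing in for checked-out files); a real tool would parse the AST and use a proper index.

```python
import re

def modified_functions(diff_text):
    """Collect names of functions whose definitions appear in a unified diff.
    Naive: looks for added/removed 'def name(' lines."""
    names = set()
    for line in diff_text.splitlines():
        if line.startswith(("+", "-")):
            m = re.search(r"def\s+(\w+)\s*\(", line)
            if m:
                names.add(m.group(1))
    return names

def affected_call_sites(files, names):
    """Map each modified function name to (file, line) pairs that call it."""
    sites = {n: [] for n in names}
    for path, text in files.items():
        for lineno, line in enumerate(text.splitlines(), 1):
            for n in names:
                # A call site: the name followed by '(' outside its own def line.
                if re.search(rf"\b{n}\s*\(", line) and not line.lstrip().startswith("def"):
                    sites[n].append((path, lineno))
    return sites

diff = """\
--- a/util.py
+++ b/util.py
-def total(xs):
-    return sum(xs)
+def total(xs):
+    return sum(xs) or 0
"""
repo = {"app.py": "from util import total\nprint(total([1, 2]))\n"}
print(affected_call_sites(repo, modified_functions(diff)))
# → {'total': [('app.py', 2)]}
```

Attached to a commit, that mapping is exactly the "list of places to check" the review UI could show next to the diff.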


Even if they can, it's one thing to index your own source files every night, and another to index a much bigger volume plus massive numbers of branches, clones, etc. (I'm talking about GitHub) - i.e. it's not practical, as there is no clear way to say which branches (in git) must be indexed (obviously not all of them) - there is no encompassing "standard" saying so.

That by itself is another BIG PLUS for the mono-repo (and "mono" rules) - things are done one (opinionated) way, with trunk-based development - and that gives you things you wouldn't normally be able to have.

Now, indexing source files is not an easy or cheap task - it's basically a huge MapReduce run over several hours (just guessing), so there must be a good reason for it to be done.
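The MapReduce shape of such an indexer is simple in miniature: map each file to (symbol, file) pairs, then reduce the pairs into an inverted index. A toy Python sketch (the `files` dict is a hypothetical stand-in for a source tree; a real indexer would stream from disk and shard the reduce):

```python
import re
from collections import defaultdict

# Toy corpus standing in for source files on disk.
files = {
    "a.py": "def parse(s):\n    return tokenize(s)\n",
    "b.py": "def tokenize(s):\n    return s.split()\n",
}

def map_phase(path, text):
    """Emit one (symbol, path) pair per distinct identifier in the file."""
    for sym in set(re.findall(r"[A-Za-z_]\w*", text)):
        yield sym, path

def reduce_phase(pairs):
    """Group the pairs into symbol -> sorted list of files containing it."""
    index = defaultdict(set)
    for sym, path in pairs:
        index[sym].add(path)
    return {sym: sorted(paths) for sym, paths in index.items()}

index = reduce_phase(p for f, t in files.items() for p in map_phase(f, t))
print(index["tokenize"])  # → ['a.py', 'b.py']: definition plus one usage
```

At GitHub scale the hard part isn't this logic, it's the volume and the branch-selection question above.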


The most common use case for people using only the text editor part is probably the package manager downloading files in clear text.


There is a crowdfunding campaign going on. It has already reached its goal, but there is a second goal at 45K, with 11 days to go. Please help!

https://www.kisskissbankbank.com/en/projects/peertube-a-free...


Are you involved with the project? Have they explored any EU grants that may be available for open source and creative development?


While not for the EU... if anyone in the USA is looking for grants, check out http://interkn.com to search for them based on keywords.


No, I'm not involved, so I can't tell you, unfortunately.


No worries, I will get in touch with the non-profit responsible for PeerTube.


Reminder that PeerTube is running a crowdfunding campaign to develop the project further. Available at https://www.kisskissbankbank.com/en/projects/peertube-a-free...


Hey. What error rate do you face? Like, how often is it impossible to render a page because navigation fails, Chrome crashes, a strange exception is raised by Puppeteer, ...


At Ahrefs we are running between 130 and 170 million sessions a day. The project started before the release of Puppeteer, and I can tell you its release saved me a lot of time. It's much easier to keep a correct browser state with it than with what was previously available. It also interacts well with Lighthouse. I wouldn't call headless Chrome stable or bug-free (it doesn't even handle HTTPS correctly), but it's a good thing to have available.


What issues do you have with HTTPS/TLS? I can't say I've come across any so far - is it a use case specific to Ahrefs?


https://github.com/GoogleChrome/puppeteer/issues/1159

I don't think we have a fancy usage of Puppeteer, but we still get between 2% and 5% errors depending on the version used. It's a small ratio, but at this scale it's a lot of pages.
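With a 2-5% failure rate, the usual mitigation is to retry failed sessions with backoff. A minimal sketch of that pattern (in Python for illustration; `render_page` is a hypothetical callable standing in for a Puppeteer navigation, not Puppeteer's actual API):

```python
import time

def render_with_retries(render_page, url, attempts=3, base_delay=0.1):
    """Retry a flaky page render with exponential backoff.
    `render_page` is any callable that raises on navigation/crash errors."""
    for attempt in range(attempts):
        try:
            return render_page(url)
        except Exception:
            if attempt == attempts - 1:
                raise  # out of attempts: surface the last error to the caller
            time.sleep(base_delay * 2 ** attempt)

# Toy renderer that fails once then succeeds, standing in for a browser session.
calls = {"n": 0}
def flaky(url):
    calls["n"] += 1
    if calls["n"] < 2:
        raise RuntimeError("net::ERR_TIMED_OUT")
    return f"<html>{url}</html>"

print(render_with_retries(flaky, "https://example.com"))
# → <html>https://example.com</html>
```

Retries turn a per-attempt failure rate of a few percent into a much smaller end-to-end one, at the cost of extra sessions for the failing pages.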


What kind of hardware is Ahrefs running these headless browser sessions on?


Awesome stats. I've actually run across Ahrefs a few times while developing and reviewing other products. Really, well done on that - I hold Ahrefs in high esteem.


Those games are always for vim, but I would love to have something like this for emacs, using all the transposition commands or what's available from paredit/smartparens.


Even ProtonMail says the authors acted irresponsibly by letting the EFF communicate so strongly...

https://mobile.twitter.com/ProtonMail/status/996006094605570...



Interesting comment on lobste.rs explaining why the piece table is rarely superior to the gap buffer: https://lobste.rs/s/xpab69/text_editor_data_structures
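For readers unfamiliar with the structure under discussion: a gap buffer keeps the text in one array with a movable "gap" at the cursor, so edits at the cursor are cheap. A minimal Python sketch of the idea (not taken from the linked thread):

```python
class GapBuffer:
    """Minimal gap buffer: text in one array with a movable gap at the
    cursor, so inserts/deletes at the cursor are O(1) amortized."""

    def __init__(self, text="", capacity=16):
        self.buf = list(text) + [None] * capacity
        self.gap_start = len(text)          # cursor position
        self.gap_end = len(self.buf)

    def move_cursor(self, pos):
        """Slide the gap so it sits at character position `pos`."""
        while self.gap_start > pos:         # shift chars across, rightward
            self.gap_start -= 1
            self.gap_end -= 1
            self.buf[self.gap_end] = self.buf[self.gap_start]
        while self.gap_start < pos:         # shift chars across, leftward
            self.buf[self.gap_start] = self.buf[self.gap_end]
            self.gap_start += 1
            self.gap_end += 1

    def insert(self, ch):
        if self.gap_start == self.gap_end:  # gap exhausted: grow it
            grow = max(16, len(self.buf))
            self.buf[self.gap_end:self.gap_end] = [None] * grow
            self.gap_end += grow
        self.buf[self.gap_start] = ch
        self.gap_start += 1

    def delete(self):
        """Delete the character before the cursor (like backspace)."""
        if self.gap_start > 0:
            self.gap_start -= 1

    def text(self):
        return "".join(self.buf[:self.gap_start] + self.buf[self.gap_end:])

gb = GapBuffer("helo world")
gb.move_cursor(3)
gb.insert("l")                 # fix the typo at the cursor
print(gb.text())               # → hello world
```

The lobste.rs point, roughly, is that this simplicity is hard to beat: moving the cursor is a memmove, while a piece table pays pointer-chasing and bookkeeping costs on every edit.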

