Hacker News | symstym's comments

But remember, there's no such thing as a 10x engineer! /s


You may appreciate this poignant sci-fi short story/video that references and expands on the story of Alex: https://vimeo.com/195588827


Looks like the Ted Chiang short story? Thanks - it’s great.

I’d recommend his other stories too if you like that one.

I’ve also got a bunch of links to other stories I’ve liked here too: https://zalberico.com/about/



Zig seems to have arbitrary compile-time code evaluation, but not the kind of AST generation you see here[1]. Nim macros seem to be a closer analogue[2].

[1]: https://github.com/seanbaxter/circle/blob/master/gems/rpn.md...

[2]: https://nim-lang.org/docs/macros.html


There's ongoing work to add support for generating arbitrary types at compile time: https://github.com/ziglang/zig/issues/383.


Sounds cool. If you haven't already, you might want to check out the Fractal Bits iOS app. It doesn't let you manually adjust parameters, only randomly generate sounds, but it makes some nice ones. The synthesis sounds FM-based to me.


I’m really going for a purposeful tool here that you can use to get what you want. Something a bit more than SFXR but a lot less than Reaktor.


I think this is a much better introduction than TLA, thanks.


A while back I found out that the popular Serverless framework/library tracks and reports back usage (https://serverless.com/framework/docs/providers/aws/cli-refe...). This similarly struck me as really out of place, and (at the time at least) it didn't seem sufficiently disclosed or described in the docs. If I NPM install it and invoke it, have I implicitly agreed to this?


Nice! Have you seen this similar project? Not sure if there are any ideas to be borrowed, but it sounds like it achieved excellent accuracy: https://github.com/AlbertoSabater/subtitle-synchronization


Interesting! I wasn't aware of that project before. It looks like it could be worthwhile to incorporate their neural net-based VAD, if it's not too slow. At first glance, the main difference is the postprocessing step -- they use heuristics to avoid trying all possible alignments, while subsync uses the FFT as its secret sauce to get away with trying all of them. :)

I should include in the readme that for >1 hour movies, it usually finishes the synchronization in ~20 seconds, which compares favorably to the project linked in your comment (13 minutes unoptimized, 2 minutes optimized).
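For anyone curious how the FFT lets you try every alignment cheaply, here is a toy sketch (my own illustration, not subsync's actual code), assuming both the audio's voice activity and the subtitles' on/off state have been turned into binary per-frame signals:

```python
import numpy as np

# Toy illustration of FFT-based subtitle alignment: cross-correlate a
# binary voice-activity signal from the audio against the subtitle
# on/off signal at every possible offset, in O(n log n) total.
rng = np.random.default_rng(0)
n = 1024
speech = rng.integers(0, 2, n)   # 1 = voice detected in this frame
subs = np.roll(speech, 17)       # pretend the subtitles lag by 17 frames

# Circular cross-correlation via the FFT convolution theorem;
# the peak index is the offset where the two signals line up best.
corr = np.fft.ifft(np.fft.fft(subs) * np.conj(np.fft.fft(speech))).real
offset = int(np.argmax(corr))
print(offset)  # recovers the 17-frame lag
```

A brute-force loop over all n offsets would cost O(n^2) comparisons; the FFT gets the same scores in one pass, which is where the speedup comes from.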


This reminds me of some parts of Pieter Hintjens' book Social Architecture. He advocates that open source projects practice "optimistic merging" where essentially any patch that is well-formed should be accepted, without value judgement (though they may be reverted later). He argues that counterintuitively, this leads to better results than having gatekeepers. It seems pretty extreme, but similar in spirit to the idea of permissionless innovation.


I think this attitude of "we solved these problems decades ago!" is rather naive and sometimes arrogant.

I think it's a fantastic talk, and that Bret Victor and Alan Kay are geniuses. But I feel that they both promote the idea that we definitively solved all these important computing problems years ago, and that people are just too clueless/resistant to catch on. Yes, I agree that many good ideas have been culturally "forgotten". But for the most part, the reason these great past ideas are not in use is because nobody has made them into a compelling product.

Their attitude is comparable to someone saying "oh I invented the WWW in 1985 but nobody would listen to me", or "oh I invented Twitter before Twitter but users weren't enlightened enough to appreciate it". Almost all good ideas were already had before, but they are worth comparatively little, and unlikely to catch on, until they are reified into something that people want to use.

I agree with them that probably more people should be working in certain areas (e.g. new ways of programming). But if they really had it all figured out, then why haven't they themselves made the amazing new programming language that we all use? What if it's the case that some of their ideas are good in theory, but are hard to translate into a usable product? Most people accept that execution>>idea in the world of startups, but don't acknowledge that the same may apply here.


> But for the most part, the reason these great past ideas are not in use is because nobody has made them into a compelling product.

I think you're overestimating the market's ability to select good ideas and specifically, promote long term scientific advances. A lot of variables affect what succeeds, e.g. marketing, coincidence, network effects etc.

You cannot leave everything to the market (i.e. what people adopt) and expect great science to come from it. Many times, great science (and maths) comes from the compelling drive of people to discover and create something new. These talks are encouraging that kind of research, and I don't see them saying they 'have it figured out', but rather pointing out ideas they think should be explored more extensively.


I certainly don't think that the market selects good ideas or is sufficient to promote long-term advances. Per the bit of my comment that you quoted, I'm saying that the reason that the ideas are not in use (more) is that they haven't been incorporated into more compelling products. He feels that good ideas are not in wide use because we didn't "get" them or forgot them. I'm saying that sure, that may be part of the reason, but I think most of the reason is that certain ideas that seem good on paper are really hard to put into practice.

Quoting the talk:

> But I do think that it would be kind of a shame if in forty years we’re still coding in procedures and text files in a sequential programing model. I think that would suggest we didn’t learn anything from this really fertile period in computer science. So that would kind of be a tragedy.

Lots of people seem aware of the idea of coding without text files. There are some "visual" programming environments with traction (in the game dev world, Max/MSP). I'm even working on one myself! But there are significant downsides/challenges associated with this approach (more difficult to version control, often tied to one editor, etc.). So to his quote, the fact that we're still coding in text files may not be because we didn't learn anything, but because the idea of non-textual programming is hard to form into a product that more people want to use.

I agree with you that his talk is very valuable in terms of drawing attention to ideas that deserve more exploration, and I love the talk. It's just this one facet that I take issue with, the suggestion that the ideas haven't caught on because nobody appreciates them. His talks are frequently at the top of HN, they are widely appreciated. People have been super excited about related projects, like Light Table and Eve, and yet they haven't gotten much traction. So I think it's worth acknowledging that the problem is less idea-awareness and more compelling-implementation-difficulty.


Agreed. I fall into this trap more often than I'd care to admit. But I've comforted myself into thinking this is not limited to our field. Look at folks convinced that Romans knew how to make better concrete. Or that Damascus steel is somehow beyond current capabilities.

Are there gems that were lost in the past? Almost certainly. Could we have done better by not getting obsessed with constantly rewriting things into new languages? Debatable. My money is on not. And I hate js.


> comparable to someone saying "oh I invented the WWW in 1985 but nobody would listen to me"

I hear there's a guy who invented EMAIL...


If you're saying you think the talk was arrogant, I think that's a bit of an exaggeration. I just found it funny. I thought the whole format of posing as a computer engineer in the early seventies seemed clever. Parts of the talk were a bit snarky, but not Linus-Torvalds-snarky.


Timing can also play a huge role in the success of an invention or the widespread adoption of an idea.


I'm not sure this is a constructive comment, but I just wanted to say that you are completely correct, everyone else is crazy, and I am baffled at how much confusion there seems to be over this topic.

One thing to add: Wikipedia sayeth "the eye senses brightness approximately logarithmically over a moderate range". If we go with that, then you presumably want to encode brightness logarithmically, and the number of bits available determines the ratio between adjacent quantized levels. In that case I believe the ratio between adjacent levels would be exp(ln(max_range_ratio)/(2^bits - 1)), since 2^bits levels leave 2^bits - 1 steps between them.
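To make that concrete, here's a sketch with assumed numbers (a 1,000,000:1 brightness range and 12 bits, both picked for illustration): spacing 2**bits levels evenly in log-brightness gives a constant ratio between neighboring levels.

```python
import math

bits = 12                 # assumed bit depth
max_range_ratio = 1e6     # assumed brightest:darkest ratio

# 2**bits levels span the range with 2**bits - 1 steps between them,
# so each step multiplies brightness by a constant factor:
ratio = math.exp(math.log(max_range_ratio) / (2**bits - 1))
print(ratio)  # each level is ~0.34% brighter than the one below it
```

Compounding that ratio across all 2**bits - 1 steps recovers the full range exactly.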


You don't want to encode brightness perceptually until you're showing it to the user. You need a linear space to actually do lighting calculations, which means physical luminance values. Within that space, you can use whatever scale you wish, with consequences to banding and quantization artifacts for decimating the bit depth. Tone mapping includes conversion to a log space via the gamma curve.

I explained it in another comment but basically the "20 bit" value is based on an idealized digital image sensor rather than a game rendering pipeline (which operates in floating point). It has admittedly proven somewhat confusing.
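As a minimal illustration of why the lighting math needs a linear space (a sketch, not any particular engine's pipeline; 0.18 and the 2.2 curve are assumed example values): adding two equal lights should double the physical luminance, which only works before gamma encoding.

```python
# Two equal light contributions, combined in linear (physical) space:
light_a, light_b = 0.18, 0.18
linear = light_a + light_b             # luminance adds linearly: 0.36

# Gamma-encode once, at the very end, for display:
encoded = linear ** (1 / 2.2)

# Adding already-encoded values instead gives a visibly wrong result,
# because perceptual values don't add like physical luminance does:
wrong = light_a ** (1 / 2.2) + light_b ** (1 / 2.2)
```

This is the sense in which tone mapping belongs at the end of the pipeline: everything upstream operates on linear luminance.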


It's worth noting explicitly that floating point numbers give you a linear scale but logarithmic-ish storage at the same time. A 12-bit float could represent a 1:1000000 range, let you perform normal linear math, and also be just as banding-free as a 20-bit integer.
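A quick way to see the "linear math, logarithmic-ish storage" property (an illustrative sketch using Python's 64-bit floats rather than a hypothetical 12-bit format): the gap between adjacent representable floats grows with magnitude, so the relative step stays roughly constant across the whole range.

```python
import math

def rel_step(x: float) -> float:
    """Relative gap between x and the next representable float above it."""
    return (math.nextafter(x, math.inf) - x) / x

# The relative spacing is about the same six orders of magnitude apart,
# i.e. floats are spaced logarithmically even though arithmetic is linear.
small, large = rel_step(1e-3), rel_step(1e3)
```

Both values come out near 2e-16, the machine epsilon scale for 64-bit floats; an integer format's absolute step, by contrast, is a million times larger relative to 1e-3 than to 1e3.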


That's why even the good old 8 bits per channel images are encoded logarithmically. Enter the gamma curve: https://en.wikipedia.org/wiki/Gamma_correction
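One way to see the benefit (an illustrative count, assuming a plain power-law 2.2 curve rather than the exact piecewise sRGB transfer function): gamma encoding spends more of the 256 codes on dark values, where banding is most visible.

```python
# How many of the 256 8-bit codes decode to at most mid-gray (linear 0.5)?
linear_codes = sum(1 for c in range(256) if c / 255 <= 0.5)
gamma_codes = sum(1 for c in range(256) if (c / 255) ** 2.2 <= 0.5)
print(linear_codes, gamma_codes)  # gamma devotes far more codes to the dark half
```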


No - the REASON we use gamma is deeply historical, and essentially has to do with trying to build TV receivers with the minimum number of vacuum tubes.


Is it just a coincidence that the gamma curve also corresponds roughly to our eyes' response to light? Awfully lucky if it was.

