Hacker News | past | comments | ask | show | jobs | submit | ryang2718's comments

It can be very device specific unfortunately. ThinkPads tend to work quite well. I had a Framework that my wife took from me and it's truly fantastic, works out of the box.


The key bindings out of the box with something like Doom Emacs are a big selling point too.

I have not been able to get Markdown to work anywhere near as well in Vim.


I don't remember Vim's Markdown support to be anything special, either; I do a lot of Markdown work, and tended to use Markdown-specific editors on the Mac like Ulysses and iA Writer, while doing my technical writing in BBEdit. (I never found Vim to fit me particularly well for prose of any kind, even though I was pretty experienced with it. Apparently my writing brain is not modal.)

Semi-ironically given the Org mode discussion, the markdown-mode package for Emacs makes it one of the best Markdown editors I've used!


I too have found this. However, I absolutely love being able to mock up a larger idea in 30 minutes to assess feasibility as a proof of concept before I sink a few hours into it.


I find it helpful to view least squares as fitting the noise to a Gaussian distribution.


They both fit Gaussians, just different ones! OLS fits a 1D Gaussian to the set of errors in the y coordinates only, whereas TLS (PCA) fits a 2D Gaussian to the set of all (x,y) pairs.
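A minimal numpy sketch of that difference, on synthetic data (variable names and the toy setup are my own):

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=200)
y = 2.0 * x + 1.0 + rng.normal(scale=0.5, size=200)  # true slope = 2

# OLS: fits a 1D Gaussian to the vertical (y) errors only.
# Closed form: slope = cov(x, y) / var(x).
slope_ols = np.cov(x, y)[0, 1] / np.var(x, ddof=1)

# TLS via PCA: fits a 2D Gaussian to the centered (x, y) pairs and takes
# the direction of largest variance (first principal component).
pts = np.column_stack([x - x.mean(), y - y.mean()])
_, _, vt = np.linalg.svd(pts, full_matrices=False)
slope_tls = vt[0, 1] / vt[0, 0]

print(slope_ols, slope_tls)  # both close to the true slope of 2
```

With noise only in y, OLS recovers the slope essentially unbiased, while the TLS/PCA slope comes out slightly steeper because it also "explains" spread along x.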


Well, that was a knowledge gap, thank you! I certainly need to review PCA but python makes it a bit too easy.


The OLS estimator is the minimum-variance linear unbiased estimator even without assuming a Gaussian distribution (the Gauss-Markov theorem).


Yes, and if I remember correctly, you get the Gaussian because it's the maximum entropy (fewest additional assumptions about the shape) continuous distribution given a certain variance.


And given a mean.
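For what it's worth, a sketch of the standard derivation via Lagrange multipliers: maximize differential entropy subject to normalization, a fixed mean, and a fixed variance,

```latex
\max_{p} \; -\int p(x)\ln p(x)\,dx
\quad \text{s.t.} \quad
\int p\,dx = 1, \qquad
\int x\,p\,dx = \mu, \qquad
\int (x-\mu)^2\,p\,dx = \sigma^2 .
```

Setting the variation of the Lagrangian to zero gives $-\ln p(x) - 1 + \lambda_0 + \lambda_1 x + \lambda_2 (x-\mu)^2 = 0$, so $p(x) \propto \exp\!\left(\lambda_1 x + \lambda_2 (x-\mu)^2\right)$. Enforcing the mean constraint forces $\lambda_1 = 0$, and matching the variance gives $\lambda_2 = -1/(2\sigma^2)$, i.e. $p = \mathcal{N}(\mu, \sigma^2)$.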


Both of these do, in a way. They just differ in which Gaussian distribution they're fitting.

And in how, I suppose. PCA is effectively moment matching; least squares is maximum likelihood. These correspond to the two ways of minimizing the Kullback-Leibler divergence to or from a Gaussian distribution.


If a library developer decides to abort on panic in the `Cargo.toml`, then I don't believe you can unwind.


Although, tbf, some libraries are documented better than others.

Also, local LLMs with an agentic tool can be a lot of fun for quickly prototyping things. Quality can be hit or miss.

Hopefully the work trickles down to local models long-term.


And you think an LLM can generate code to use an undocumented library? :D


Even documented libraries can be a struggle, especially if they are not particularly popular. I'm doing a project with WiFi/LoRa/MQTT on an ESP32. The WiFi code was fairly decent, but the MQTT and especially LoRa library code was nearly useless.


Sonnet 3.5 fails to generate basic Jetpack Compose library properties properly. Maybe if somebody tried really hard to scrape all the documentation and force-feed it, then it could work, but I don't know if there are examples of this. Like a general LLM, but with the complete Android/Kotlin docs pushed into it to fix the synapses.


Of course, why wouldn't it? It's a generative model, not a lookup table. Show it the library headers, and it'll give you decent results.

Obviously, if the library or code using it weren't part of the training data, and you don't supply either in the context of your request, then it won't generate valid code for it. But that's not the LLM's fault.


> not a lookup table

You can imagine the classic attention mechanism as a lookup table, actually.

Transformers are layers and layers and layers of lookup tables.
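A toy numpy illustration of that framing (all names and values here are mine): with one-hot keys and a query that strongly matches one of them, scaled dot-product attention behaves almost exactly like a table lookup.

```python
import numpy as np

def softmax(z):
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def attention(q, K, V):
    # "Soft" lookup: compare the query against every key,
    # then return a weighted average of the values.
    weights = softmax(q @ K.T / np.sqrt(K.shape[1]))
    return weights @ V

K = np.eye(3)                        # three orthogonal "keys"
V = np.array([[1.0], [2.0], [3.0]])  # one "value" per key
q = np.array([10.0, 0.0, 0.0])       # query strongly matching key 0

print(attention(q, K, V))  # close to [1.0]: almost a hard lookup of key 0
```

Soften the query (e.g. `q = np.array([1.0, 0.0, 0.0])`) and the output blends the values instead, which is where it stops being a literal lookup table.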


If there are open source projects that use said library, then probably yes.


If they are not hosted on GitHub, then no :D


How have you found R1? I've been meaning to try it with Aider's Architect mode.

Have you tried the 7b?


I haven't been on here for long, but I must say, I'm really amazed at how friendly this community is! HackerNews really sets itself apart.


I’m not sure what you mean.


Recently there was [a thread](https://news.ycombinator.com/item?id=23883270) on Notable and many commented that they were upset that it was no longer open source.

This is something that actually concerned me, so I started using a lot of `bash` scripts to emulate behavior that I like in Notable. That way I could use Notable without the fear of being locked in, while emulating its behavior in a way that better suited my workflow.

I put the script I use [up on my GitHub](https://news.ycombinator.com/item?id=23883270) and I recently [made a Reddit post](https://news.ycombinator.com/item?id=23883270). I thought maybe you guys could find them helpful or offer feedback?


I did a similar thing to replace the personal workflow I'd been running in a home instance of JIRA for some time.

I thought about opening it up, but TBH there's loads of these things already, and the real value for me is that I built it myself and it conforms perfectly to what I need. I looked at a few others, but the overhead of grokking someone else's tool was just too much of a hassle for me.

The ergonomics of these things are really close to perfect when you decide yourself what features you want and how they should be implemented.


I couldn't agree more about the difficulty of grokking somebody else's tool.

The thing is though it took me a long time to get to the level where I could put something like this together and I wish I could have had some sort of guidance earlier on.

That's why I've tried to make my implementation modular, so others can take the things they haven't figured out from mine and implement them in their workflow.

I should get around to documenting it so others can replicate, imitate, or fork it.


I wish you the best!


I think you wanted to link to your GitHub and Reddit post, but all your links seem to point to the same HN thread. Since I'm interested in the matter, would you mind putting the correct links?


Oh, you’re absolutely right, what a silly mistake on my end. I put it up on my GitHub here, check it out:

https://github.com/RyanGreenup/cadmus

The original Reddit post is here:

https://reddit.com/r/commandline/comments/hs7g8r/shell_scrip...


Thanks, and congratulations: this project looks amazing

