
Why are you asking HN? Do you have anyone in your personal life you can talk to? Family or friends? Hopefully someone who knows you well can give you honest feedback and help you figure out what would be a better fit for you. I don't see how a bunch of strangers on HN could possibly help, seeing as they know nothing about you and totally lack context.

Sometimes anonymous strangers in similar situations help more than either:

1) close colleagues you might not want to be vulnerable towards, or

2) people you are close to and feel vulnerable towards, but who do not have similar career trajectories or experiences.

There is a reason why "communities of practice" have always existed, and HN kind-of-sort-of happens to be one.


Maybe they are looking for people who've experienced transitioning from tech to something else. Even if they know lots of supportive people in real life, it's possible none of those people have gone through this process.

IMHO, sometimes strangers can offer advice without bias, whereas those closest to you cannot.

It’s 2026; why bother asking people when AI exists? /s

On a more serious note, there’s nothing wrong with asking strangers something like this. Plus, asking here doesn’t preclude also asking friends and family.


Why would you say Rhino 3D "isn't CAD"?


Well, it's certainly not parametric CAD -- it's a drawing program that happens to be in 3D, with limited (and, I'm very glad to see, growing) ability to use history for more structured creation. But the biggest limitation is that its numerics are mediocre, in subtle ways -- everything is in float space, and it's very easy to get into a regime where things just don't make sense, especially far from the origin. In a CAD tool I'd expect to be able to enforce constraints to resolve this ("these two points must match"). I've been able to do that somewhat with my plug-in when the precision is there but the error stack-up has been too high, but there are also cases where the precision just doesn't exist.
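
To make the float-space point concrete (a minimal sketch, assuming IEEE-754 single precision -- not anything Rhino-specific):

    #include <math.h>
    #include <stdio.h>

    int main(void) {
        /* One ULP (the gap to the next representable float) grows
           with magnitude, so absolute resolution degrades the
           farther a model sits from the origin. */
        float xs[] = {1.0f, 1000.0f, 1e6f, 1e9f};
        for (int i = 0; i < 4; i++)
            printf("spacing at %.0e: %g\n", xs[i],
                   nextafterf(xs[i], INFINITY) - xs[i]);
        return 0;
    }

On a typical machine this prints roughly 1.2e-07, 6.1e-05, 0.0625, and 64: a billion units from the origin, nothing finer than 64 units apart is representable, so "these two points must match" can become unsatisfiable.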


You're in luck! As of a year ago, I work at McNeel on the math team, on Rhino's in-house CAD kernel. Luckily, we own the entirety of the kernel, so we are free to improve it. I've worked on numerical methods for most of my career, in academia and industry, so you're preaching to the converted when you say that Rhino's numerics are mediocre. At McNeel, I'm actively pursuing strategies to improve this situation, although it will be a massive long-term project.

Hopefully you know that you can reach out to the McNeel developers directly or on the Discourse forums. But I would also love to chat directly if you're interested. It sounds like you're working on a sophisticated and interesting project, one that directly stresses many of the known pain points in the kernel. If so, I can shoot an email from my McNeel address to the one you've got listed in your profile.


I think this definition is a bit too strict. CAD just means computer aided design. Architects use Rhino to design buildings. You use it to design airplanes. CAD doesn't even have to be 3D.


You could design entirely in notepad.exe, and that would be computer aided design by your definition.

Rather than that definition being too strict, this one is too literal.

It was perfectly reasonable to characterize the tool as not really CAD, even though a 3D drawing/modelling/rendering/visualizing program is on a computer and is part of a design process.


I'm not gonna argue with you. That's just silly. Have a read here if you want, but I suppose you're just trolling:

https://en.wikipedia.org/wiki/Rhinoceros_3D

https://en.wikipedia.org/wiki/Computer-aided_design


Again with the definitions. That thing you just did was what everyone else calls arguing.


I made a conscious choice to work at a small company rather than a large company because of this. The politics are smaller, but that doesn't mean the stakes are.


C's support for modularity is actually rather strong. This PDF gives a good overview of the basic techniques available: http://www.metamodulaire.org/Computing/modular-c.pdf

It may not be immediately obvious how to approach modularity, since it isn't directly accomplished by explicit language features. But once you know what you're doing, it's possible to write very large programs with good encapsulation that span many files and nevertheless compile quite rapidly (more or less instantaneously for an incremental build).
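
As one concrete example (my names, and I'm paraphrasing the standard technique rather than quoting that PDF): the opaque-pointer idiom hides a struct's layout behind an incomplete type, so clients can only go through the functions in the header.

    /* counter.h -- public interface; layout deliberately hidden */
    typedef struct Counter Counter;   /* incomplete type */
    Counter *Counter_new(void);
    void     Counter_bump(Counter *c);
    int      Counter_get(const Counter *c);
    void     Counter_free(Counter *c);

    /* counter.c -- only this translation unit knows the layout */
    #include <stdlib.h>
    struct Counter { int n; };
    Counter *Counter_new(void)             { return calloc(1, sizeof(Counter)); }
    void     Counter_bump(Counter *c)      { c->n++; }
    int      Counter_get(const Counter *c) { return c->n; }
    void     Counter_free(Counter *c)      { free(c); }

    /* main.c -- a client; it cannot touch c->n directly */
    #include <stdio.h>
    int main(void) {
        Counter *c = Counter_new();
        Counter_bump(c);
        printf("%d\n", Counter_get(c));   /* prints 1 */
        Counter_free(c);
        return 0;
    }

Change struct Counter's layout and only counter.c needs recompiling, which is a big part of why incremental builds stay fast.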

I'm not saying other languages don't have better modularity, but to say that C's is bad misses the mark.


The fact that it does appear to be so difficult to scale things up would suggest that the argument isn't silly.


It may well be. Books have tons of useful expository material that you may not find in docs. A library has related books sitting in close proximity to one another. I don't know how many times I've gone to a library looking for one thing but ended up finding something much more interesting. Or to just go to the library with no end goal in mind...


Speaking as a junior, I’m happy to do this on my own (and do!).

Conversations like this are always well-intentioned, and friction truly is super useful to learning. But the ‘…’ in these conversations always seems to imply that we should inject friction.

There’s no need. I have peers who aren’t interested in learning at all. Adding friction to their process doesn’t force them to learn. Meanwhile adding friction to the process of my buddies who are avidly researching just sucks.

If your junior isn’t learning it likely has more to do with them just not being interested (which, hey, I get it) than some flaw in your process.

Start asking prospective hires what their favorite books are. It’s the easiest way to find folks who care.


I’ll also make the observation that the extra time spent is very valuable if your objective is solely learning, but often the Business™ needs something working ASAP.


It's not that friction is always good for learning, either. If you've ever prepared course materials, you know it's important to reduce friction in the irrelevant parts, so that students don't get distracted and demotivated, and so that time and energy are spent on what they need to learn.

So in principle, Gen AI could accelerate learning with deliberate use, but it's hard for the instructor to guide that, especially for less motivated students.


You're reading a lot into my ellipsis that isn't there. :-)

Please read it as: "who knows what you'll find if you stop by the library and just browse!"


I admire your attitude and the clarity of your thought.

It’s not as if today’s juniors won’t have their own hairy situations to struggle through, and I bet those struggles will be where they learn too. The problem space will present struggles enough: where’s the virtue in imposing them artificially?


This should be possible online, and it would be if more journals were open access.


Disagree, actually. Having spent a lot of time publishing papers in those very journals, I can tell you that just browsing a journal is much less conducive to discovering a new area to dive into than going to a library and reading a book. IME, books tend to synthesize and collect important results and present them in an understandable (pedagogical?!) way that most journals do not, especially considering that many papers (nowadays) are written primarily to build people's tenure packets and secure grant funding. Older papers aren't quite so bad this way (say, pre-2000).


I've done professional ghostreading for published nonfiction authors. Many such titles are literally a synthesis of x-number of published papers and books. It is all an industry of sorts.


I think I don’t disagree. Only, it would at least be easier to trace the research concept you're interested in back to a nice '70s paper or a textbook.


> It may well be. Books have tons of useful expository material that you may not find in docs

Books often have a "scam trap": highly regarded, much-praised books that are only useful if you are already familiar with the topic.

For example: I fell for the scam of buying "Advanced Programming in the UNIX Environment," and a lot of concepts are only shown, not explained. Wasted money, really. It's one of those books I regret not pirating before buying.

At the end of the day, watching some YouTube videos and then referencing the OS-specific manpages is worth much more than reading that book.

I suspect the same is true for other "highly praised" books as well.


You could make much the same observation about online search results.


I had a completely different response reading the sentence. I've been programming in C for 20+ years and am very familiar with exactly the problem the author is discussing. When they referred to a "static variable", I understood immediately that they meant a file static variable private to the translation unit. Didn't feel contrived or made up to me at all; just a reflection of the author's expertise. Precision of language.
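
For anyone who hasn't run into the terminology, here's roughly what that means (hypothetical names, just to illustrate):

    /* parser.c -- "depth" has internal linkage: it lives for the
       whole program but is invisible to every other translation
       unit, so no other .c file can touch or even link against it. */
    #include <stdio.h>

    static int depth = 0;   /* file static: private to this file */

    static void enter(void) { depth++; }
    static void leave(void) { depth--; }

    int main(void) {
        enter();
        printf("depth = %d\n", depth);   /* prints: depth = 1 */
        leave();
        return 0;
    }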


What's the use case for a language like this?

I used to be very down on C++ but have stopped caring quite so much... Just using C++ and restricting oneself to templates seems like a better bet than this. Or you could use D and have a language whose template experience is much better than C++'s...

Any language like this is going to need debug info eventually. One could step through the generated C code, but that is much less pleasant than stepping through the original source.

I also wonder how name mangling is handled?


For me, it was just to have some fun seeing whether you can get the convenience of generics in C without blowing up the size of a "minimal standards-compliant compiler." E.g., Chibicc[1] is only a few thousand lines of code; adding Gamma to that would not blow it up by much. There's something aesthetically pleasing about knowing I can read the whole thing in a few days. Nothing like that is possible for C++ (or D?) AFAIK.

But yes -- for a real project I would absolutely recommend someone use D over this!

[1] https://github.com/rui314/chibicc
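
(On the name-mangling question above: I won't claim this is Gamma's exact scheme, but compilers in this family usually monomorphize -- each instantiation becomes its own C function with the type arguments encoded in the name. A purely hypothetical illustration of the generated C:)

    #include <stdio.h>

    /* Hypothetical output for a generic swap<T> instantiated at int
       and double; the "name__type" mangling here is invented. */
    static void swap__int(int *a, int *b)          { int t = *a; *a = *b; *b = t; }
    static void swap__double(double *a, double *b) { double t = *a; *a = *b; *b = t; }

    int main(void) {
        int x = 1, y = 2;
        swap__int(&x, &y);
        printf("%d %d\n", x, y);   /* prints: 2 1 */
        return 0;
    }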


Totally fair. Just wondering if there was some specific motivation for being able to do this... "For fun" is valid, IMO. ;-)


Networking involves more than just letting people know you exist. I'd say that alone is borderline useless. Actual networking requires building real relationships with people. For me, that means continually meeting new people who do the same kind of thing I do, having pleasant or exciting conversations with them, learning as much as I can about them (showing real interest! asking serious questions! listening to their answers!), and demonstrating to them that I'm hungry and want to do Big Things. It's hard to do this effectively. I'm sure it depends on your field, and it certainly requires continual practice.


In general, going to a single networking event with the purpose of networking is kind of silly, but going to the same conference year after year to see the same people and have deep discussions opens a lot of doors. I imagine the point of "networking events" in general is to be a modern take on a country club: You go to see like-minded people who want to meet people like them, and you keep going over and over again to develop relationships.


> Actual networking requires building real relationships with people.

Yes! I'm sorry, I didn't mean to imply that letting folks know you exist was sufficient; it is only necessary.

I find a cheat code to building relationships with people is to give first. I love to ask "how can I help" when I meet someone at a conference or networking event. This does a few things:

* separates you from so many other people who go to these events looking to be transactional

* shows you can follow through (when you actually do help them), which, somewhat shockingly, distinguishes you from many other folks

* filters out folks who might not be a fit for a deep relationship because you move in different worlds; if someone says "well, I am looking for a major piece of real estate to buy," I, as a software developer, am unlikely to be able to help them

This is a long play though, to be sure.


Fun fact: some spectra are discrete, not continuous! And some have both parts. Depends on the operator...
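
Standard examples from quantum mechanics, for the curious (in LaTeX, since HN can't render math; these are mine, not the parent's):

    % Purely discrete: the harmonic oscillator
    \sigma(H_{\mathrm{osc}}) = \{\hbar\omega\,(n + \tfrac{1}{2}) : n = 0, 1, 2, \dots\}
    % Purely continuous: the free Laplacian on L^2(\mathbb{R}^3)
    \sigma(-\Delta) = [0, \infty)
    % Both at once: the hydrogen atom
    \sigma(H_{\mathrm{hyd}}) = \{-E_1/n^2 : n \geq 1\} \cup [0, \infty),
        \quad E_1 \approx 13.6\ \mathrm{eV}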


Autism researchers talk in terms of "graded membership" in "fuzzy clusters" within trait space.

