It's hard or easy depending on your model (of parallelism) and what you want to do with it.

From a certain POV computers are always doing things in parallel (8, 16, 32, or 64 bits at a time, and so on, eh?)

Has anyone mentioned Pi Calculus yet?

https://en.wikipedia.org/wiki/Pi-calculus
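
For anyone who hasn't run into it: the distinctive thing the pi-calculus adds over earlier process calculi is mobility, i.e. channel names are themselves values that can be sent over channels. A rough sketch of that idea in Go (my own illustration, not anything from the article; the names are made up) looks like:

    package main

    import "fmt"

    func main() {
        // 'link' carries channel names, playing the role of the name x in x(z).Q
        link := make(chan chan string)
        done := make(chan struct{})

        // Receiver: waits for a channel name on 'link', then communicates on it.
        go func() {
            z := <-link      // receive a channel over a channel
            fmt.Println(<-z) // use the received name
            close(done)
        }()

        // Sender: makes a fresh name y, sends it along link, then uses it --
        // roughly the x<y>.P side of a pi-calculus reduction.
        y := make(chan string)
        link <- y
        y <- "sent over a channel that was itself sent over a channel"
        <-done
    }

Go's channels descend from CSP rather than the pi-calculus, but since channels are first-class values, 'chan chan T' captures the name-passing flavor well enough for a sketch.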

Anecdote: the actor model has been the most understandable and useful concurrency primitive I've used. The pi-calculus, which was inspired by the actor model, is similarly elegant.
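
To make the comparison concrete, here's a minimal actor-style sketch in Go (my own illustration; names like counterActor and msg are just made up): the actor is a goroutine that owns its state and only interacts with the outside world via messages on its mailbox.

    package main

    import "fmt"

    type msg struct {
        delta int
        reply chan int // where the actor sends the updated value
    }

    // counterActor owns 'count'; no other goroutine touches it directly.
    func counterActor(mailbox <-chan msg) {
        count := 0
        for m := range mailbox {
            count += m.delta
            m.reply <- count
        }
    }

    func main() {
        mailbox := make(chan msg)
        go counterActor(mailbox)

        reply := make(chan int)
        mailbox <- msg{delta: 5, reply: reply}
        fmt.Println(<-reply) // 5
        mailbox <- msg{delta: -2, reply: reply}
        fmt.Println(<-reply) // 3
    }

No locks anywhere: all coordination happens through the mailbox, which is most of why people find the model easy to reason about.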
