Hacker News | glompers's comments

Longtime HN user morphle, who commented elsewhere in this thread, has researched and designed chips and hardware for that purpose (edit: for scaling that form of computing). He has been trying to find funds and partners to bring them to market.

Disclaimer: I have never met, spoken with, or worked with him.


Morphle here. Thanks for mentioning our work [1]. We still seek funders and students we can teach.

We build on the shoulders of the generation that built Smalltalk and has just retired. There is a huge amount of documentation, scientific papers and talks on how to implement it all.

We are starting to implement the final step: an autonomous, secure European operating system running a hierarchy of virtual machines (message-passing parallel bytecode Smalltalk, Lisp, Erlang) and QEMU VMs, with a modern GUI, in fewer than 30,000 lines of source code that can be fully understood by a single person.

We improved our hardware to a manycore European Morphle Engine processor architecture that can run microcode, bytecodes, x86, Arm, RISC-V and other QEMU-supported processors faster than the native chips.

We have some funding from Ukrainian drone innovators that need cheap computer chips manufactured in Europe, not controlled by the US or China.

We hope the European Autonomy movement away from US Big Tech clouds, operating systems, surveillance and chips with political kill switches and backdoors built in will fund our operating system and app software.

[1] https://www.youtube.com/watch?v=vbqKClBwFwI&t=2s


Is there a shorter overview? That video is a 1.5-hour talk followed by a few more hours of discussion.

The entire County of London[0] had an average population density of 60 people per acre (38,400 per square mile) in 1911 and 42 per acre in 1961.

Since 60 per acre is an average over nonresidential land uses as well, it was still common to find residential densities higher than 40,000 people per square mile (15,000 per sq. km) at that time. Only Tower Hamlets and Islington remain around that density to this day.[1]

[0] https://en.wikipedia.org/wiki/County_of_London

[1] https://en.wikipedia.org/wiki/List_of_English_districts_by_p...


Connections to HS1/Europe, and to Leeds, Golborne, the East Midlands, Manchester and finally even Crewe have all been cancelled, so extra expenditure will now focus instead on Euston Station. That's not the large section people were interested in riding. Perhaps Old Oak Common should instead have been tunnelled the same distance through to Waterloo International (whose international platforms are now deleted).


The international platforms are not deleted! They were brought back into use from 2018-2019 to serve the Windsor Lines, which includes the service to Reading - platforms 20-24. That somewhat reduces the congestion at Waterloo; the station throat limits adding more services.

The extension to Euston was supposed to have 11 platforms. Even the reduced scope now being implemented is 6 platforms, I believe. All 11 were required to handle the eastern leg of HS2 [providing bypass capacity for the East Coast Main Line out of King's Cross and the Midland Main Line out of St Pancras], and services to Scotland and Manchester [bypassing the West Coast Main Line from Euston's classic platforms].


The Assad dictatorships in Syria and the Hussein regime in Iraq were proponents of Baathism. The former had occupied Lebanon and invaded Israel while the latter had invaded Iran in 1980 and annexed Kuwait in 1990.


Orators learned the "palace of memory" trick for remembering long speeches. In that same vein, then, it does seem less demanding to simply be able to see where you put things.

Whether that's done by walking around, or just by glancing around a 3D overlay (as suggested above for the Vision Pro), I would rather not have to search through stacks or folders of icons, nor use Spotlight search fields. But perhaps the different kinds of cognitive load result in what some people call different personal organizational styles or preferences. The "Clutterbug"[0] quadrant taxonomy comes to mind.

[0] https://clutterbug.me/what-clutterbug-are-you-test


Good points.

There really is an amazing untapped space of ideas on how to better navigate information.

Even in 2D interfaces, simple things would help: folders that looked fatter in log proportion to how much they contain would add useful context and associative cues, and a positive, subjective feeling of "real", recognizable locations rather than just a recursive "interface" when tap-tapping through folders.
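As a toy sketch of that log-fatness idea (the pixel constants are invented for illustration, not from any real file manager):

```python
import math

def icon_width(item_count, base_px=48, step_px=6):
    """Icon width that grows logarithmically with folder size.

    An empty folder stays at base_px; each order of magnitude of
    contents adds roughly step_px more width, so a huge folder is
    visibly fatter without becoming absurd.
    """
    if item_count <= 0:
        return base_px
    return base_px + step_px * math.log10(item_count + 1)
```

Under these made-up constants, a folder with 9 items renders 54 px wide and one with ~100,000 items about 78 px: fatter, but nowhere near 10,000 times fatter.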

An idea I implemented:

I hide a ".home" (zero-byte) file in macOS folders I view as the top of a folder hierarchy. Then I created a button in the Finder toolbar that looks like a house. I can drill down a few folder layers, then pop right back to the hierarchy top by clicking the house button.

Just a simple thing. Ordinary users would understand the value of designating "home" folders, and once you have it you can't live without it.
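A minimal sketch of the lookup behind such a button (in Python for illustration; the actual Finder button is presumably an Automator/AppleScript action, and the helper name here is made up):

```python
from pathlib import Path
from typing import Optional

def nearest_home(start: Path) -> Optional[Path]:
    """Walk upward from `start` to the closest directory containing
    a zero-byte ".home" marker file; None if no ancestor is marked."""
    for d in (start, *start.parents):
        if (d / ".home").exists():
            return d
    return None
```

A toolbar button would call something like this on the frontmost folder and reveal the result.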

For 3D:

I think traversable "Spaces" on screens were a great interface idea, done halfway and ripe for 3D extension. A space should be something that can be named, opened, closed, opened on another synced device, opened two years later, duplicated or branched. I.e., a living, persistent state of an active project. With sub-spaces, for sub-projects, that can quickly be zoomed in and out of.

The latter would magnify the benefits of working on many different projects in a 3D environment, where having many things open and visible is really helpful, but laborious to continually reconfigure.

How nice to go into a rabbit hole on something important but not urgent, and be able to come back a year later to the same information still visibly organized where you left it. No context lost.

If there is an obvious "Minority Report"-type power-user interface to be had, it would be that: quickly navigating between persistent project/activity interface layouts with gestures. High value, high friction removal, with very low-bandwidth user direction needed.


Did having such a person in charge make a qualitative difference in the atmosphere of how work proceeded among people there?

If so, do you think it would have played out similarly if the organization had had an equally effective "glue person" who wasn't in charge (therefore didn't have any authority to delegate or divide most tasks) and was required to manage upward [sic] to coordinate things for people?


I'm not sure, mostly because it's hard for me to feel confident about what to attribute to that versus other facets of how he did things. Overall, I have an extremely positive view of how he ran things, but I also personally found him great at a lot of the other things that go into a technical leadership position: making good decisions about what to prioritize, having a consistent vision of our long-term goals, not falling into the trap of micromanagement, having enough technical skill to help with the higher-level issues while still respecting the areas where others were more knowledgeable, going out of his way to address issues people raised on the theory that retaining talent long-term was hugely important, and so on. So I honestly don't know how much would have been different if he didn't also have this level of retention of details. It was still impressive, though, not least because I have no trouble imagining someone in his position just not caring enough to do it even if they were capable.

Maybe the genuineness it seemed to come from really is what made the difference in the long run. I obviously don't know how everyone else felt about it, but in other jobs I haven't found it particularly difficult to notice when the general perception of higher-level managers is a lot more positive or negative than my own, so my instinct is that most people probably also liked him, and I do think that makes some amount of difference. Having a "glue" person who is more detail-oriented is probably fine if the reason the actual authority figure doesn't retain the details is simply not having that particular skill. But if it's because they genuinely think the people beneath them in the org chart are just resources they can use to solve problems, rather than actual people who will work better in the long term if treated well, then no, I don't think it would be as effective.


LOVE this question! Thanks for asking.

I also like toying with variants of where "essential elements" can live, sometimes in odd places :)

https://x.com/patcon_/status/1963648801962369358

> I have an idea for a quirky event experimenting with the "minimum viable feeling of community", but need to explain some context first. Bear with me...

> [...]

> So here's the event idea: what if someone ran an event where the 2nd rule was "NO INTRODUCTIONS", but only because the 1st rule was "you must arrive having fully memorized ONLY everyone's name and face". Beyond the strange entry requirement, what would such an event feel like?

> And what strange sorts of intimacy might be created by this minimal scaffold of "knowing everyone"... & being in community together? I suspect it might feel like a warm event full of friends, but where everyone had mysteriously forgotten everything they knew about one another :)


Not to me. The post in question could easily be expanded into a recognizable Paul Graham essay and no one would bat an eye.


"Arsenal of democracy" is something that Detroit specifically was called during WWII, so historically it isn't a wild phrase. Ford Motor Company itself built complete B-24 heavy bombers, too.

Edit: also an FDR quote https://en.m.wikipedia.org/wiki/Arsenal_of_Democracy


Yeah, the guy is just old enough that he probably got exposed to those slogans when he was young.


Without a more prominent melody or harmony, I could not find what is finer about it than conventional approaches to jazz. Could you please elaborate on what its quality is?


To be honest, I am not a jazz musician or even a jazz fan. When I do record music, I make better progress when holding myself to a standard or a set of limitations to work within. It gets the brain going to find new solutions within a self-imposed framework.


Implicitly, the argument is that when "the cost and time of litigation scale like n^2, where n is the textual length of the law," justice for litigants declined once sheer access to the necessary legal funds and time began to outweigh other costs and benefits in deciding whether to pursue justice. Maybe it's not self-evident, but I don't think direct quantifiable evidence of justice is necessarily available, so what qualitative evidence could confirm it?
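A toy illustration of the quadratic claim (the constant is invented, purely for scale):

```python
def litigation_cost(n_words, k=0.01):
    """Hypothetical cost model: litigation cost grows like n^2 in the
    textual length n of the law (k is an invented constant)."""
    return k * n_words ** 2

# Under this model, doubling a law's length quadruples the cost of
# litigating under it, regardless of the value of k.
```

So the burden compounds: a statute ten times as long is, on this account, a hundred times as expensive to litigate.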

