zabzonk's comments

In my experience, it's not a good idea to write both the client and the server for a given protocol and then test them only against each other. It's far too easy to misunderstand the protocol in the same way on both sides. I remember doing this for a training course I wrote for OLE (later COM) years ago. The client and the server worked perfectly together, just not with correctly implemented OLE clients and servers.

Just slap a new legally-distinct-but-still-confusing name on your client/server pair, and use it as a marketing tool to sucker in purchasing managers.

Like EtherNet/IP, where the IP somehow stands for "Industrial Protocol".


Ah, but what if one of your clients needs to use, let us say, Excel...

My mistakes with the training course code would have been caught if the company had bought Excel licenses for our customer workstations.

And I just remembered it was DDE (Dynamic Data Exchange), not OLE. OLE was much better specced than DDE. Like I said, it was way back when. But the basic rule (don't test a protocol using only a home-grown client and a home-grown server) still applies.


My Dad had one of them. The first machine I actually purchased myself was a Dragon 32 (6809 processor, 32k RAM) sometime around 1981 - I can remember everything about it, including all the terrible cassette games I bought for it and the money I spent on ROM cartridges (word processor, assembler/debugger). These days I can't even remember what's in my Steam library.

> Linux should consider paying Microsoft and Apple

Who or what is the "Linux" entity in this context?


Joking aside, I often hear people say "they should" when talking about GNU/Linux (for example: "they should just standardize on one audio stack"), as if there were a central authority making those decisions. What many don't realize is that with FOSS comes freedom of choice... and inevitably, an abundance of choice. That diversity isn't a flaw, it's a consequence of how the ecosystem works.

There's free choice for those OSes to use different kernels, but they don't; they all use the same Linux kernel (rather than, say, a BSD). There's a lot of advantage in getting aligned on things, even though anyone can choose not to.

It is true that Linux-based distributions have this thing in common: the Linux kernel. There have been some GNU/Hurd variants though...

I guess Linus Torvalds and co? First they'd need to standardize a Linux desktop OS.

Also who is paying "Linux" and for what?

Maybe the answer ends up being Valve.


Well, at least Microsoft has been a platinum member of the Linux Foundation for many years...

probably him

> Coding agents are really good at tasks where you can define a concrete goal and then set them to work iterating in that direction.

Wholly based on other people's work. Which is OK.


Much easier to start with BASIC. After all, why not?

I respectfully disagree; BASIC/Java/Arduino hide too much about how the CPU works from users.

Getting a 6502 kit from Ben Eater and walking through how the CPU works will implicitly show how languages abstract away what's actually happening, and, more importantly, build the skills necessary to understand how to write efficient programs.

https://www.youtube.com/watch?v=LnzuMJLZRdU&list=PLowKtXNTBy...

https://eater.net/6502

Starting with a simple architecture is highly recommended. =3


I second this -- I just found the Ben Eater series a month or so ago and put together his computer clock over the holidays. It really helps you understand clock cycles, logic chips, etc., and is a good foundation for the 6502 kit you build later in the course. And learning Assembly before BASIC is the right learning path IMO, if only to understand how CPU registers work at the electron level.

If one is interested in how internal PC registers work, then these build series do the classic EEPROM-microcode-based CPU builds. Fabian's series is highly accessible and builds a Python-based assembler from scratch. James's series ends with a simple game design.

Cheers, =3

"Build a Superscalar CPU" (Fabian Schuiki)

https://www.youtube.com/watch?v=bwjMLyBU4RU&list=PLyR4neQXqQ...

https://github.com/fabianschuiki/superscalar-cpu

"Making an 8 Bit pipelined CPU" (James Sharman)

https://www.youtube.com/watch?v=3iHag4k4yEg&list=PLFhc0MFC8M...


Starting with Assembly is simply a bad idea because the tooling is terrible, and the learning curve of the tooling is steep. It is filled with arcane codes, abbreviations, and workflow right out of the gate.

Programming concepts are pretty much universal. Being distanced from computer architecture is not a limitation for novice programmers; Python et al. succeed for a reason.

If you're determined to start with assembly, then I hope you can find someone to help you get started with all the machinations necessary to get from LDA #0 to A9 00 with as little drama as possible. Someone to show you how to use the assembler, what the directives mean, the linker, a symbolic debugger (if you're lucky). Someone to provide you with .DUMPREG "START OF SORT" and .DUMPMEM BUFF $80 "AFTER INPUT" macros that you can liberally scatter throughout your code so you actually make progress and get some insight into what the heck your code is doing. Perhaps some way to stop your programs that doesn't involve hitting the reset button on the machine.

I mention that because, again, the tooling is terrible. All of this is easier said than done. None of the assembly books address this, and none of the assembly programming reference guides do either. Assembly is VERY black box. It's a large step up to even get started.
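To make that concrete, here is roughly the smallest thing you can write (a minimal sketch in generic 6502 assembler syntax; the directive, the addresses, and the surrounding build steps are assumptions and vary by toolchain):

        ; clear one byte of memory, then spin forever
        .org $0800          ; directive: where the assembled code will live
    start:
        lda #$00            ; load accumulator with 0   -> assembles to A9 00
        sta $0200           ; store it at address $0200 -> assembles to 8D 00 02
    done:
        jmp done            ; nothing to return to, so loop (or hit reset)

And even that needs an assembler invocation, a decision about the load address, and some way to inspect $0200 afterwards to know whether anything actually happened.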

It's much easier to "learn programming" first at a higher level, where you can quickly progress and succeed, before descending into the dark hole that is assembly, particularly on older machines.

At least on a KIM-1 you can hit the STOP button and cursor through memory (being conscious that the memory architecture of the KIM is quite funky); something that simple is quite difficult on an Apple ][.


In general, Assembly for a simple, well-documented CPU is fairly close to familiar calculator operations, and each mnemonic maps one-to-one onto the bytes of the binary firmware. If folks start on abstractions like Scratch/BASIC/Python/Java, the students will develop only a hazy notion of what a register, a stack, or a heap even means.

I would recommend looking at a few random samples of Ben's build series, as he covers most first-year subjects in subtle, efficient ways.

Soldering-kit PCBs or emulators are insufficient to demonstrate a physical bus wire harness, clock timing, and memory layout. Best of luck =3


Starting with the 6502 is going to bring you up hard against its addressing modes. Better IMHO to learn about memory and how to access it using arrays in BASIC first.

My opinion differs - learning how memory is accessed via assembly language will make it super easy to understand, e.g., how C pointers actually work; that can be surprisingly difficult for those who go directly to a high-level (compared to assembly) language, but very easy if you come from machine code/assembly.
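For example (a minimal sketch; the 6502 syntax and the zero-page location are my own choices, not from any particular course): a C pointer is just an address held in memory, which is exactly what indirect addressing makes visible:

        ; roughly the 6502 equivalent of:  char *p = (char *)0x0300;  char c = *p;
    ptr = $10               ; two zero-page bytes at $10/$11 hold the address
        lda #$00
        sta ptr             ; low byte of $0300
        lda #$03
        sta ptr+1           ; high byte of $0300
        ldy #$00
        lda (ptr),y         ; fetch the byte the pointer points at, i.e. *p

Once you have stepped through something like that, "a pointer is a variable that holds an address" stops being abstract.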

Depends on learning goals, as BASIC teaches people some really bad habits.

They say "you always end up coding in whatever your first language was... regardless of what language you are using".

People could always bring up the BASIC software ROM at the end of the build if interest arises after learning how a simple computer works. =3

https://github.com/chelsea6502/BeebEater


It buys the launch missiles from the US; the submarines and the warheads are home-grown.

I really don't mind Windows 11, and don't recognise many of the problems other people here claim to have. For example, I simply don't see all (or any) of the ads that many complain about.

Yeah, I haven't seen these either. WSL is great, it's pretty nice looking, there's a lot of good stuff in Windows 11. My main gripe is inconsistency and falling behind the competition in speed (largely due to the chips and x86/x64).

Much of the outrage over Recall seemed excessive to me as well. People spun it as 'Microsoft is spying on you with AI!' even though it was never that in any way.

It seems to be missing some plausible definition of "intellectuals".

When this was written there was a clearer divide between people with higher educational training, qualifications, and interests, and those without.

I think it includes anyone who cares to read it.

I suppose it assumes the reader already understands the word, or has access to a dictionary.

EMP attack

There’s no such thing

The nuke part is optional; see:

https://en.wikipedia.org/wiki/Counter-electronics_High_Power...

That said, there have been multiple past nuclear EMP-oriented tests: https://en.wikipedia.org/wiki/Nuclear_electromagnetic_pulse

Results vary by location (Earth's magnetic field) and by prior hardening of infrastructure.


Thermonuclear weapon detonated in orbit
