Hacker News | suprfnk's comments

No, I can confirm this. I am at least an average C# dev, with 16 years of experience.

I have built a very nicely responsive real-time syncing iOS app in what amounts to a weekend of time. (I only have an hour here and there, young kids) I had zero iOS/Swift development experience prior to it.

I can also confirm that this wouldn't have been built if it weren't for Claude Code. It's "just" an improved groceries app, that works especially well for my wife and me.

Without LLM's, and with just an hour here and there, I wouldn't have done the work to learn the intricacies of iOS and Swift dev, set up the app, and actually tweak and polish it so it works well -- just to scratch the itch of a bit better groceries handling.


C# pays fine


If you like one, you’ll prolly like the other

Hell you might even like ActionScript ;P


If you like TypeScript and C#, then you'll probably also like Delphi and Turbo Pascal!

They were all written by the same guy, Anders Hejlsberg:

https://en.wikipedia.org/wiki/Anders_Hejlsberg

https://news.ycombinator.com/item?id=19568681

"My favorite is always the billion dollar mistake of having null in the language. And since JavaScript has both null and undefined, it's the two billion dollar mistake." -Anders Hejlsberg

"It is by far the most problematic part of language design. And it's a single value that -- ha ha ha ha -- that if only that wasn't there, imagine all the problems we wouldn't have, right? If type systems were designed that way. And some type systems are, and some type systems are getting there, but boy, trying to retrofit that on top of a type system that has null in the first place is quite an undertaking." -Anders Hejlsberg


  > "My favorite is always the billion dollar mistake of having null in the language. And since JavaScript has both null and undefined, it's the two billion dollar mistake."
  > -Anders Hejlsberg
Why can't all-1s be null? E.g. a small int goes from the range 0-255 to the range 0-254, but we get a really useful property with no out-of-band Nullable overhead.

With signed ints it even leads to symmetric ranges in the negative and positive directions.
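A minimal sketch of the idea in TypeScript, using the all-ones byte (0xff) as the reserved "null" so a Uint8Array can hold optional bytes with no boxing overhead. `NONE` and `getByte` are illustrative names, not from any library:

```typescript
// All bits set: the one reserved byte that means "no value".
const NONE = 0xff;

// Every slot starts out "null"; real values occupy 0-254.
const scores = new Uint8Array(4).fill(NONE);
scores[1] = 42;

// Decode the sentinel back into an actual null at the boundary.
function getByte(arr: Uint8Array, i: number): number | null {
  const v = arr[i];
  return v === NONE ? null : v;
}

console.log(getByte(scores, 0)); // null
console.log(getByte(scores, 1)); // 42
```

The trade-off is exactly the one described above: the range shrinks by one value, and nothing stops application code from reading the raw array and treating 0xff as data.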


The FORTH-83 standard changed FIG-FORTH's official value of TRUE from 1 to -1 so all its bits were set. That was a rough transition like Python 2=>3, but worth it. It also defined /MOD integer division to be floored (rounded towards -infinity instead of zero like FIG-FORTH), which was also a tough change but the right one, especially for graphics.

https://python-history.blogspot.com/2010/08/why-pythons-inte...

https://forth-standard.org/standard/diff?utm_source=chatgpt....

https://atariwiki.org/wiki/Wiki.jsp?page=Converting+FIG-Fort...

>4. For various reasons the definition of all divide functions general effect is that quotients are floored instead of rounded toward zero. This should cause no problems for most pre-existing application software. The new divide functions are marginally slower than the old (a few machine cycles under most circumstances). The side-effects of the redefinition for floored divide can be counter-intuitive under some circumstances. For example, in FIG-Forth the operation

      -40 360 MOD
>would return the obvious answer (-40) on the stack, while 83- Standard Forth will return the answer 320!

>5. The true flag returned by all logical operations has been changed from the value 1 (in FIG-Forth) to the value -1 (in Forth-83, all bits set). If your code used the 0 or 1 returned by a comparison in an arithmetic operation, you will need to interpolate the operator ABS after the logical operator. This is a particularly difficult problem to look for in your source code. However, we feel that this mutation in the 83-Standard was beneficial as it allows the returned true/false value to be used as a mask for AND.
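The division difference is easy to reproduce in JavaScript/TypeScript, whose `%` truncates toward zero like FIG-FORTH; the floored modulus of Forth-83 needs the usual wrap-around trick:

```typescript
// % in JS truncates toward zero, like FIG-FORTH's /MOD.
const truncMod = (a: number, b: number): number => a % b;

// Floored modulus (quotient rounded toward -infinity), like Forth-83.
// Valid for b > 0: the "+ b" then "% b" corrects negative remainders.
const floorMod = (a: number, b: number): number => ((a % b) + b) % b;

console.log(truncMod(-40, 360)); // -40  (the "obvious" FIG-FORTH answer)
console.log(floorMod(-40, 360)); // 320  (the Forth-83 answer)
```

For graphics code wrapping angles or grid coordinates, the floored version is almost always the one you want, which is exactly why the standard changed.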


Hello Don!

I always suspected that FORTH had inconsistencies in division across versions. That's why the lord told us to Go FORTH and Multiply instead.


Delphi has been dead for 10+ years. Nobody uses it except for a few legacy applications and licenses cost $1200+.


You're kind of missing the point. Turbo Pascal has been dead for a lot longer. Or is it?

The point is that TypeScript and C# are extremely similar for a good reason, not a coincidence, and that Anders Hejlsberg knows what the fuck he's doing and talking about, and has been implementing amazing groundbreaking well designed languages and IDEs for a very long time. Turbo Pascal was so great it flummoxed Bill Gates, so Microsoft sent a limo to recruit and hire Anders Hejlsberg from Borland, then he made Visual J++, Windows Foundation Classes, C#, and TypeScript.

https://en.wikipedia.org/wiki/Turbo_Pascal

>Scott MacGregor of Microsoft said that Bill Gates "couldn't understand why our stuff was so slow" compared to Turbo Pascal. "He would bring in poor Greg Whitten [programming director of Microsoft languages] and yell at him for half an hour" because their company was unable to defeat Kahn's small startup, MacGregor recalled.

https://news.ycombinator.com/item?id=8664370

>"According to the suit, Microsoft also offered Mr. Hejlsberg a $1.5 million signing bonus, a base salary of $150,000 to $200,000 and options for 75,000 shares of Microsoft stock. After Borland's counteroffer last October, Microsoft offered another $1.5 million bonus, the complaint says."


C# is nominally typed, which, in practice, leads to safer code and less type gymnastics. Of course you can avoid the type gymnastics with "any", but then you're sacrificing safety.
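For contrast, a tiny TypeScript sketch of the structural side: any object with the right shape is accepted, whereas nominally typed C# would require the declared type to match by name:

```typescript
interface Point { x: number; y: number }

function len(p: Point): number {
  return Math.hypot(p.x, p.y);
}

// Never declared as a Point, but the shape matches, so TS accepts it.
const notDeclaredAsPoint = { x: 3, y: 4 };
console.log(len(notDeclaredAsPoint)); // 5
```

Handy for ad-hoc data, but it also means two structurally identical types you intended to keep separate are freely interchangeable unless you add branding.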


“A human being should be able to change a diaper, plan an invasion, butcher a hog, conn a ship, design a building, write a sonnet, balance accounts, build a wall, set a bone, comfort the dying, take orders, give orders, cooperate, act alone, solve equations, analyze a new problem, pitch manure, program a computer, cook a tasty meal, fight efficiently, die gallantly. Specialization is for insects.”

― Robert A. Heinlein


Great quote ... but: Warning from experience: do not try using that last part in a job application.


Or the first part.


A tiny bit of this is already here: our robot vacuum & mop vacuums and mops the living room, kitchen, and dining room every night at 01:00.

Coming downstairs in the morning to a completely clean floor is definitely a tiny bit of that magic.


Having a mop-capable robotic vacuum myself, I have a completely different experience. It is simply too stupid (despite being a smart model with a camera and room-mapping LiDAR) and gets stuck at carpet edges and under chairs.

If I want to use it on a schedule I need to perpetually keep every area I want it to clean adapted to robotic vacuuming. Which I don't, meaning I have to manually go over the entire area and pick up objects, move chairs, move the small carpets, and then empty the all-too-small storage bin on the robotic vacuum after it has done its rounds.

And don't even get me started on the mopping function.

The end result being that if I take a regular vacuum in hand and do the pre-robot screening round, I've already vacuumed the entire flat with a much more powerful machine in less time than the robot vacuum process would've required.


Yeah, I demand more info about the mop from the OP. What model do you have and are you completely satisfied? I was under the impression that most wet-mop models just smear dirt everywhere instead of really cleaning a dirty floor.


Our robot vacuum gets stuck under the bookcase and tangled up with my laptop charging cable.


I agree with the sentiment, but: how do you navigate? Especially to unknown places?


My phone works great for driving directions. I plug it in and turn up the volume when I'm driving by myself. I plug it in and turn off the volume and give it to another human to tell me what to do when I have passengers I can trust to navigate for me. I only really need to use the phone for navigation like 2 or 3 times per month, at most. My daily commute to the office never has traffic and in nicer weather times I often ride my bike.


@wwilson How do you define the X/Y "distance" of a non-Mario application? I.e. any (distributed or not) system that doesn't have a relatively trivial "higher x/y is better" fitness function?


Being by far the most convenient way to travel from home to pretty much any random location.

Speaking from the Netherlands with a relatively good public transit system.


That's not an externality. That's what you pay for.


Other people, not just you, also benefit from your increase in flexibility.


Do they? I guess occasionally my family and a few friends benefit when we can meet somewhere they haven't bothered to support with proper public transport. I'm not sure I'd class that as an externality.


It's not the most convenient because of cars. It's more that the other options are less convenient because of cars.


This is a pretty short-sighted take. VR just doesn't offer much of anything currently. Some games work well, but most games are pretty clunky. Office work isn't substantially improved. A big warm sweaty headset is not nice to have on your head for extended amounts of time.

There's a plethora of reasons why VR in its current state is little more than a gimmick.


It just doesn't matter that much for the average consumer. Most office workers (that is, not Silicon Valley programmers) work on 1920x1080 and that's fine. There really is not much to gain by doubling or tripling the resolution.

I'm working on a 27" 2560x1440 screen. I can see pixels, but that really doesn't matter. Text is readable, nothing is blurry, I can do my work and get on with my day. Screens are good enough, they work, and there is not much to gain by having higher resolutions.


I feel like text that isn't crystal clear stresses my eyes more. I can feel eye fatigue and blur after an intense day of work.

I would more than welcome higher density and crisper text any time. I'm on 24" 1080p and unhappy about it.

I'm gonna buy a widescreen 34" at much higher pixel density soon.


Easy: because TypeScript or Python are way easier to learn than C. Learning C is a long, arduous, uphill battle against arcane error messages and undefined behaviour.

Unless you have a background in C/C++ already, most people can probably get up and running with something like this way, way faster.


Good luck understanding things like `if(!!!c) { ... }` or why a line-break after a return statement matters in JavaScript/TypeScript ;) JS has its own footguns and legacy baggage.


I've never seen `!!!` in JavaScript, and I do a lot of it. Care to share?


Shouldn't have made an example in the if-statement as it is mostly useless there. But triple ! is very common to negate-and-convert a possibly falsy value (undefined, null, false):

    const x: boolean | undefined | null = getValue();
    const not_x: boolean = !!!x;

I added TS type annotations for clarity, though they could be inferred if `getValue` is typed accordingly.


I've seen `!!` and I've seen `!`, but what would `!!!` get you here that the other two don't?


negation plus cast to boolean. See this for more info: https://stackoverflow.com/questions/21154510/the-use-of-the-...


And line breaks after return statement? Is that true? Haven't stumbled on that one. So this is probably misinformation.


No, that one is true. JS automatic semicolon insertion is dumb.

   return
     <p>
       A JSX paragraph
     </p>
is a common mistake from novices; it'll return void/undefined.

In TypeScript, at least, you get yelled at for this.
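The usual workaround is to open a parenthesis (or the JSX tag itself) on the same line as `return`, so ASI has nothing to terminate. A sketch, with a plain string standing in for the JSX:

```typescript
// ASI turns a bare `return` followed by a newline into `return;`,
// so the expression below it is silently never returned.
function broken(): string | undefined {
  return
  "a JSX paragraph"; // unreachable
}

// Opening the paren on the same line keeps the expression attached.
function fixed(): string {
  return (
    "a JSX paragraph"
  );
}

console.log(broken()); // undefined
console.log(fixed());  // "a JSX paragraph"
```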


How do you get that TypeScript or Python environment on the chip of your interest at the first place? How do you expose hardware interfaces without knowledge of C?


> How do you get that TypeScript or Python environment on the chip of your interest at the first place?

By having somebody else do it. Abstraction is a wonderful thing.


That's just kicking the can down the road. What if you are working on device which is under NDA? What if it is some exotic MCU which nobody else uses?


Then you probably shouldn’t use this. It’s not for you, that’s cool, move on and use whatever you’re currently using.


I am just showing you that DeviceScript/MicroPython/Lua/any other scripting language will expect the user to know a lot of C in order to be able to use its board, unless they want to just run it without any input/output of data. But users want to use the scripting language because they don't know C. The whole flow is a Catch-22.


I might have agreed 10 to 15 years ago when arduino was brand new and almost everything was custom.

These days... eh - pretty hard disagree with everything you've said.

Do some folks still need to know the ins & outs of the device? Sure. Will this work on every device? Nope.

Does that matter for the success of this project? Not a fucking bit.

Honestly - this looks a lot like Electron in my opinion: It gives companies a very cheap entry point into a whole realm of tooling that was previously out of bounds.

They can do it without having to hire new folks, they can prototype and run with it as far as they'd like, and then 3 years in, once the product is real and they know they have a market - they can turn around and pay someone to optimize the embedded devices for cost/power/performance/other.

The flow isn't catch-22 AT ALL. The flow is: I'm trying to do a thing that's only marginally related to the embedded device, and it's nifty that I can do that with low entry costs (both financial and knowledge).

---

By the time you are under NDA for a new device... you are established enough to be making your own decisions around tooling (basically - you are part of phase 2: optimize).


> The flow is: I'm trying to do a thing that's only marginally related to the embedded device

It's too bad that this comment is buried so deep: it should be at top level. More and more often, embedded work is just like this -- the business logic is far more important than the fact that it's running on an "embedded device." And in those cases, having programmers who understand modern software development at a high level is far more useful than having programmers who are expert in C and comfortable sitting down with multiple chip datasheets for a week, writing peripheral drivers.

