RyanCavanaugh's comments | Hacker News

We've more than compensated for this in other building materials, processes, and codes. Your odds of dying in a house fire are far lower than they were even in the 1970's, let alone the 50's or 20's.


That's probably more because of smoke detectors (and perhaps fewer smokers) than anything else. I'd love to get a sprinkler system retrofitted though, as that would make an impressive difference.

All the lighter-weight joists made with OSB burn far faster than the 2x8s or whatever they replaced, and home furnishings are made with large amounts of flammable synthetics.

At a live-fire course I took, the scenarios we worked were fueled by stacks of wooden pallets, lit by an instructor's tiger torch. One of the instructors asked us if we knew the fuel equivalent, in pallets, of a typical love seat with synthetic foam. We all figured it was a lot, but nobody guessed the real answer: NINETY.


They've done the research and, well, newer houses really do see fewer fire deaths. I suppose it's possible the fires that do occur are worse, but on net your odds of dying are lower in a newer house.

https://www.nahb.org/-/media/NAHB/advocacy/docs/top-prioriti...

> As expected, the coefficient estimate for the percentage of houses built after 1989 (pctpost89) is negative and statistically significant. This implies that, in counties with newer housing stock, all else equal, the fire death rate is lower. Interestingly, when identical regressions to model 1 were run using different cutoff points for new stock, such as the percentage of houses built after 1979 or 1969 or 1959, the coefficients were of roughly similar size, were always negative, and the associated t-statistics were at least as significant.


Interesting paper, thanks. It does make some of the same distinctions I did, around smokers and smoke alarms. Another thing mentioned about newer construction is the improved blocking and stopping. For example, one old style of framing was "balloon frame" construction, where you would have gaps that might run vertically from basement to attic. That gave fire a channel to rip vertically through a structure, and is clearly a terrifying idea once it catches. [Edit] Oh, I forgot to mention, it also discusses what conclusions can't be inferred. "Regrettably, much of the available data is not helpful. For example, no data are collected on the age of the structure where a house fire death occurs, despite the obvious link between the two."

The starting point of this, though, was the idea that the materials in the house are actually better than in the past. To the extent that they'll tolerate fire longer before collapsing, they aren't, and the gases from the foam cushions, carpets, and drapes are more toxic than ever. The reason this was drilled into our heads is that it means less time to get into a burning building, and get someone out, before we all have to leave for our own safety.


> Your odds of dying in a house fire are far lower than they were even in the 1970's, let alone the 50's or 20's.

I am very surprised by this.

I'm sure that building codes ensure that the actual houses are more fire resistant. And fire fighting has probably come a long way.

But the typical home is full of processed plastic fabric. Which burns a whole heck of a lot faster than either cotton or wool. Carpet, curtains, clothes, furniture, etc.


I am sure smoke alarms make a big difference, as does fewer people smoking. Circuit breakers instead of fuses. Plus all the fire exits and fire doors in apartment blocks.


Perhaps, but I'm not sure I like swimming through an invisible ocean of fire-retardant chemicals that are in all home furnishings and most clothing and so forth. I'm not exactly a California Prop 65 fan, but I do wonder whether those are anything any sane person wants near them.


I'm here and I have no idea what you're talking about with ESM imports. Please log a bug.


https://github.com/microsoft/TypeScript/issues/54163#issueco...

>Hundreds of Github comments have been rehashed on this issue over and over again. Import paths are not modified during compilation, and we're not going to modify them during compilation, and it isn't because we haven't thought about it before and just need to spend another hundred comments arguing about it.

>The general tenor of discussion here is not great and I don't think this is likely to lead to any further productive discussion, so I'm just going to lock this.

I'm talking about this thing, apologies if I haven't referred to it precisely enough haha. So is it still "no idea", or is it actually "let's pretend we don't hear him, maybe he'll go away" (because you're understandably fed up with answering the same question many times)?

---

Listen, how about I just tell you why I'm asking you all those weird little questions. I still want to hear how you feel about being called a "god among mortals" for writing JavaScript for Microsoft, but I mean c'mon. You're a busy man, you just wondered "what's the ploy here" for a sec and decided it's safest not to bother answering them, right?

I understand that you, like any software engineer, like to work on impactful and meaningful things, and consequently take a measure of pride in your work; so could I have, for a certain project, had I not allowed myself to be pressured into building it in TS - by people with no skin in the game, only the current majority consensus on their side ("all JS bad, but TS least bad").

Let's summarize the original post:

- Someone humiliates themselves, literally begging to be heard out.

- They bring forth a huge list of things that, beyond reasonable doubt, prove that their concern is valid.

- They are pointedly ignored, or rebuffed with some form of "I don't know what you are talking about" or "seems like a you problem".

Seeing this dynamic begin to play out in a feature request (out-of-left-field as the whole thing may be), well, that struck a fucking nerve, let me tell you. This has been my exact experience with TS (and no other programming language), when trying to address matters including, but not limited to:

- Whether to use TS at all (which I shouldn't have conceded to in the first place.)

- Whether TS can be adopted gradually (in my case it took multiple rewrites.)

- Whether I'm just imagining that I'm not gaining much by using TS (I'm not.)

- Whether I'm just imagining that I'm experiencing drawbacks from using TS (I did.)

- Whether TS is "just JS with types" (which is about as true as C++ being "just C with classes".)

- Whether the help implicitly offered, conditional on me using TS because "that's what everyone uses now", will ever materialize (which it didn't.)

- Whether I was using TS of my own free will (which is only true insofar as I rose to the challenge of accommodating others' supposed "discomfort" with good ol' JS, at my own expense.)

Every time I raised any of those questions, I was faced with gaslighting. And yes, in the end I did fall as low as begging my teammates for their help - after all, didn't I just rewrite working JS into nearly-working TS so that it would be more accessible to others?

Unrelated subsequent experiences taught me that the (former) coworkers in question might have been just as lost with TS, and even more lost with the JS ecosystem in general (whereas I feel mostly at home with it, coming from Python), but they were reluctant to take accountability and hence admit vulnerability. (Guess adding type annotations doesn't necessarily make code easier to comprehend or maintain, whether you come from a real dynamic language or a real static language, huh.) Well, whatever, that one's on them. And now it's on me to speak out against such insanity as enabled by your product.

When I see someone publicly putting themselves through the same situation (or "pretending" to - what's the difference when everyone is on so many layers of irony that it's not even funny anymore?), and it's not Clojure or Rust or Zig or Nim or Julia -- or JavaScript -- but fucking TypeScript once again; and it's not even their coworkers that they're addressing in this manner, ridiculous in its sheer desperation, but they're talking to the fucking upstream, then I'm not going to be a "good sport"; for me, this rapidly turns from "some stupid thing someone wrote on the Internet" into a matter of professional conscience.

Why did OP have to communicate in this acutely self-deprecatory manner? Is it perhaps because of a systemic issue in how the TS project handles feedback? Having personally experienced the exact same dynamic when discussing TypeScript, it seems such toxic communication has "trickled down" from upstream to us "cowering meek masses", i.e. the developers of Web-based software, i.e. the people who really should know better, because so much of what we build is intended to be directly consumed by other human beings.

For me, this is because TypeScript is not honest open source software. Say it for all to hear: am I wrong that the direction of TypeScript's development is determined by Microsoft's interests first, and the interests of the community a distant second? And is it not misleading and abusive in the slightest to have people learn "Microsoft-flavored JavaScript" instead of the real JavaScript that their browsers can execute, and pretend it's optional when practice shows it's anything but?

The stock phrase "incredibly privileged" makes me sick, but in this case you, Ryan, may truly not believe how privileged your position is in comparison to downstream developers around the entire globe. You're not the end-of-line code-monkey just trying to give the non-technicals some buttons they could click; you and your team are literally imposing your wills on a pre-existing community of fellow programmers, leveraging the unlikely synergy of Microsoft's marketing machine and the open source community's network effects.

On Monday, you will keep moving TypeScript onward; freely benefitting from the community's input on how to do what you're already doing, only better - more efficiently, more correctly. And just as freely ignoring the community's input on whether you're doing the right thing in the first place.

Meanwhile I'll still be recovering from the way your technology ended up impacting my life, no exaggeration, let's not even go there. If even 1% of TS users have been through a similar wringer as myself, that makes for how many people hobbled by your work? Calculate, and consider.

All the best.


[flagged]


Michael O’Church? Is that you?


Lemme check


Not to be flip, but if it were really all this easy, we would have done it already.

There are dozens of questions you can throw at this code: What if the input's a union? What if it's a nested union -- how do you avoid combinatorial explosion? What if the input is a function -- how do you validate its parameter types using runtime information? What if the input is a conditional type? What if you're inside a generic function? The list is enormous and it quickly gets into "you've dug too deep and unleashed a Balrog" territory once you get beyond the primitives.
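The function case in particular is a dead end at runtime: parameter and return types are erased, so the best a hand-written guard can do is check callability. A sketch (the guard name is mine, purely illustrative):

```typescript
// A guard can confirm a value is callable, but the signature is
// erased at runtime, so the `v is (x: string) => number` claim is
// something the runtime cannot actually verify.
function isStringToNumberFn(v: unknown): v is (x: string) => number {
  return typeof v === "function";
}
```

Any function at all passes this guard, which is exactly why "validate a function's parameter types using runtime information" has no good answer.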


Why don't you just make the transformer API stable and public, and let the community do the hard part? There are plenty of us with experimental transformers doing all sorts of fun things; these are problems that can be solved external to TS.

I've got a fully functional compile-time dependency injection container that I've been sitting on for literal years because the transformer API isn't public.


^^^ Please do this. I'm completely ok with "using transformers voids your nonexistent warranty" and the community can deal with transformer API churn. Exposing the API makes it easier to adopt those community solutions, versus me needing to explain to teammates why I switched all the `tsc` invocations to `ttsc` and promise it's not that sketchy.


> Not to be flip, but if it were really all this easy, we would have done it already.

TypeScript is mature. There is no low-hanging fruit left to develop.

I don't necessarily think it needs this, but that's hardly the right grounds to dismiss it.


I do not think they mean that all easy things have been done, rather that if this feature were easy then it would already exist.


It is easy. It's just a terrible idea.

(So as it turns out having thought about it a bit, I am vehemently against this idea.)

Type reflection at runtime would require polluting the JS environment with a lot of cruft.

That might be a global object and lots of helper functions to query types. It might also be tagging objects and fields with additional properties that need to be treated as reserved.

There is certainly no way to do this that doesn't make assumptions about the runtime environment in a way that will cause a mountain of issues further down the line.

The other reason for my disdain is: the need to infer types at runtime is almost certainly indicative of an architectural issue with your code. If you aren't able to write viable code in a given context without requiring runtime type querying then you should step back from your intent and re-evaluate your code structure.


I mostly agree: having every single type checked all the time, like other static languages do, is not appropriate and probably not even a good idea. On the other hand, it is also kind of annoying to have your validation code totally divorced from your types.

For example, suppose I have a web request coming in, and essentially just a string of input on the body. I have an expectation of what it needs to be, but there is a step where it is of type `unknown` and I need to convert it into `IMyHandlerInput`. Just casting it is obviously a bad idea, so now I need to use some kind of library such as ajv to do JSON Schema or JTD validation, _then_ I can cast it to my interface.
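As a concrete sketch of that workflow - `IMyHandlerInput` and its fields are hypothetical names here, and the guard is hand-rolled rather than any particular library - the `unknown` parse result gets narrowed before use:

```typescript
// Hypothetical handler input shape (the names are illustrative only).
interface IMyHandlerInput {
  id: number;
  name: string;
}

// A hand-rolled type guard: note the validation logic restates the
// interface field by field, which is exactly the redundancy at issue.
function isMyHandlerInput(value: unknown): value is IMyHandlerInput {
  if (typeof value !== "object" || value === null) return false;
  const v = value as Record<string, unknown>;
  return typeof v.id === "number" && typeof v.name === "string";
}

// Usage: the parse result stays `unknown` until the guard narrows it.
const body = '{"id": 1, "name": "example"}';
const parsed: unknown = JSON.parse(body);
if (isMyHandlerInput(parsed)) {
  // In this branch, `parsed` is IMyHandlerInput - no blind cast needed.
  console.log(parsed.name);
}
```

Change the interface and the guard silently drifts out of sync, which is why the schema-first idea below is appealing.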

This is all fine, but it definitely feels redundant to me. It would be a cool _ecmascript_ feature to support - not runtime types per se, but syntactic _validation_ that TypeScript could also use to derive types automatically.

This is totally hypothetical but I'm imagining something like this for example:

```ts
schema Example {
  @min 0
  id: number

  @pattern /^\w+ \w+$/
  name: string

  @future
  expiresAt: Date
}

const data: Example = new Example(JSON.parse(body))
```

Something that the JS runtime can use to allow developers to opt-in to extensible validation which can also be used by the typescript interpreter to essentially derive types.


Maybe I'm very mistaken, but to me it seems this code snippet is basically an alternative to writing a ton of "infer"-s and overloads, no? So the same pattern matching could be used in the "switch". Whatever the compiler knows can be locally matched, and the combinations have to be already handled by the developer.


It's hard to come up with a compiler that produces a sound checker for arbitrarily complex union/intersection types. Perhaps there could be a restriction on reflection to "simple enough" types, but that's always going to be a weirdly moving target based on heuristics. There are already cases where TypeScript tries to generate ~40MB+ .d.ts files which are just re-stating the types themselves, so it's easy to imagine a validator compiler emitting 100MB+ of code to check wilder and crazier types.


Size, computation and other budgets seem like useful knobs to expose to developers. And anything that's locally decidable (so can be run on multiple cores easily).


Given that several projects exist in Python to do this… I guess if you know about types at runtime, it is possible to do it.

I write one of them (typedload).

If it's a union, it has a bunch of heuristics to guess right on the first try, but otherwise it will just try them all.
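Ported to TypeScript, that try-them-all strategy for unions might look like this sketch (names and structure are my own illustration, not typedload's actual code):

```typescript
// A loader for one member of a union: returns the value if it
// matches, throws otherwise.
type Loader<T> = (raw: unknown) => T;

// Try each union member in order; the first loader that doesn't
// throw wins. Worst case, every alternative is attempted - this is
// the combinatorial cost nested unions run into.
function loadUnion<T>(raw: unknown, loaders: Loader<T>[]): T {
  for (const loader of loaders) {
    try {
      return loader(raw);
    } catch {
      // fall through to the next alternative
    }
  }
  throw new Error("value matched no member of the union");
}

const asNumber: Loader<number> = (raw) => {
  if (typeof raw !== "number") throw new Error("not a number");
  return raw;
};
const asString: Loader<string> = (raw) => {
  if (typeof raw !== "string") throw new Error("not a string");
  return raw;
};
```

A heuristic fast path (e.g. dispatching on a discriminant tag before falling back to trial and error) is what keeps this usable in practice.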


Can we just start with primitives and see what happens?


JavaScript already has this. It's called typeof.
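For primitives, `typeof` narrowing already works at runtime, and TypeScript tracks it statically. A minimal sketch:

```typescript
// `typeof` checks narrow a union of primitives at runtime, and the
// compiler follows the narrowing: `value.length` and `value.toFixed`
// are only legal inside the branches that prove the type.
function describe(value: string | number | boolean): string {
  if (typeof value === "string") return `string of length ${value.length}`;
  if (typeof value === "number") return `number ${value.toFixed(2)}`;
  return `boolean ${value}`;
}
```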


TypeScript purposefully has no influence on this at all.


And instanceof


You can use the TypeScript API to generate this information at whichever level of detail you want.

The level of detail TS has about types during the checking phase is much higher than you would want in practice for 99% of projects (e.g. 1 + 2 + 3 has 6 different types associated with it).

The level of detail TS has about types during the checking phase is potentially lower than you would want in practice for a lot of projects (which is critical since that makes the whole feature useless if that happens). For example, the list of properties of a particular generic instantiation is lazily computed, but it's possible your program never pulls on the list so it never exists in the first place, yet is something your type-based tool might want to know.


They're not asking for reflection on _all_ possible types (which would have the problem you mentioned), just ones explicitly requested at compile time via a function call.


TypeScript dev lead here.

Today, you can already write a program using the TypeScript API to inspect all of this information to accomplish whichever scenario is at hand. The existence of all these tools, each with different opinions on design direction and implementation trade-offs, demonstrates that this is possible. Yet it's not clear how those tools would benefit from reading from a data file as opposed to using the TypeScript API.

Something like api-extractor has different constraints from io-ts which has different constraints from typescript-schema. The API exists today and is apparently sufficient to meet all these tools' needs, yet what's proposed is a data file that can encapsulate any possible operation you might perform with that API.

Having TypeScript try to imagine, implement, and maintain a data file format that can satisfy _all_ of these tools' use cases, plus any future tool's use case, is a tremendous effort with no clear upside over using the existing API.

It's easy to gain support for the idea of "put the types in a file" because anyone reading that can imagine a straightforward implementation that just achieves their particular goals, but everyone's goals are different and the amount of overlap is not actually all that high when you look at the wide breadth of tools in the list. There's a very different amount of information you'd need to enable a type-aware linter (which is basically type info on every expression in the program!), as compared to a simple documentation generator (which might only need top-level declaration info).


Hard to argue with that! I wonder why the author of this site doesn't consider this a solution.


The author explicitly calls out the existence of all of these tools as being the problem. They seem to want a canonical version that TypeScript itself officially supports.


Situation: There are 14 competing type representation formats

TypeScript: We can write our own type file format!

Situation: There are 15 competing type representation formats


If TypeScript itself defines a format, it's hard to imagine that the majority of people won't switch to it.

Though while there are a lot of type-data tools, are there actually a lot of type-data formats out there right now? I've been using zod, and it doesn't have one. You have to use code to tell it about a type. Someone did make a library to let zod load types from pure data, but the format used there is... TypeScript source code.


Things rarely change at all once they're at stage 3. That's why TS waits until that stage before implementing. There might be a minor tweak in some edge cases but stage 3 is usually "done".


Runtime type systems: TS has none; C# has extensive RTTI

Function dispatch: TS is purely dynamic; C# has both static and dynamic

OOP: Mandatory in C#, optional in TS

Variance: Implicit in TS, explicit in C#

Numeric types: TS has one (number); C# has the entire signed/unsigned/float x 8/16/32/64-bit matrix

There's really no intentional effort to converge them.


TS has two: number and bigint [0], corresponding to Number and BigInt in JS, respectively.

[0]: https://www.typescriptlang.org/docs/handbook/2/everyday-type...


Your list of differences is correct but not really what I meant. The points you listed are mostly due to JS runtime constraints. The choice of nominal vs. structural type system is purely a design choice by the TypeScript team.


The US military made no claims that these videos showed unexplainable phenomena. They just sort of put them out there.

The video they called "Gimbal" shows an artifact caused by a gimbal mechanism, for example


This story is bigger than Watergate by orders of magnitude, if it's true.

There have been "Retired intelligence officer says he has evidence of aliens" stories floating in fringe media for decades, and all of them have turned out to be cranks. Bob Lazar has been working this beat since 1989; where's the evidence?

Maybe eventually one of these times it'll all be true, but skepticism -- extreme skepticism -- should absolutely be the default position here.


What is "extreme skepticism" other than ignorance?

A real skeptic actually looks at the data and isn't content simply by ignoring it.

Here, you simply imply this guy was a "crank". Maybe you should take a closer look first?


Great, where's the data? Right, he doesn't have any, and is asking for other people to come forward with the data.


Context matters. Is a "web developer" someone who makes web pages, or works on a browser rendering engine?

