
Their starship was bullshit. On Earth they needed a big Saturn V-like rocket to get it into space, yet later they fly into orbit like it's nothing.

If the planets they visited had much lower gravity than Earth, this might be possible, but that wasn't noticeable or mentioned. And even on Mars you need far more fuel to get into orbit than could be stored in their tiny ship.


To expand on this: when they landed on the water planet where time was so dilated that one hour equals seven years, what would their velocity have been when they contacted the surface? And how much energy would their spaceship have needed to reach escape velocity from there?
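
For a rough sense of scale (my own back-of-the-envelope sketch; the film's black hole is spinning, so the Schwarzschild formula below is only an approximation): a dilation factor of one hour to seven years puts the planet essentially on top of the event horizon, where the escape velocity is practically the speed of light.

    import math

    c = 299_792_458.0                  # speed of light, m/s
    dilation = 7 * 365.25 * 24         # ~61,000: seven years pass per local hour
    # Static observer near a Schwarzschild BH: dtau/dt = sqrt(1 - rs/r)
    # => rs/r = 1 - 1/dilation^2
    rs_over_r = 1 - 1 / dilation**2
    v_esc = c * math.sqrt(rs_over_r)   # escape velocity: v = c * sqrt(rs/r)
    print(f"rs/r ~ {rs_over_r:.10f}")  # ~0.9999999997, i.e. just above the horizon
    print(f"escape velocity ~ {v_esc / c:.8f} c")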


Well, if you look for this, many if not most movies have inaccuracies or holes in their explanations like this. In the end, I watch movies for entertainment. If I'm looking for scientific precision, I watch a documentary instead.


Yes. Maybe don't make scientific accuracy a big part of your marketing campaign though.


Still better than Ad Astra! :D

Liked the killer space monkeys though. ;-)


'Ad Astra' was a real stinker.


Eh, this isn't actually that bad. It's quite easy, using a jet aircraft, to (via a ballistic trajectory) get into space. Surrounded by reaction mass as you are, the rocket equation isn't nearly so bad. You could conceivably dock with something in orbit, if you had elastic tethers or something to make the acceleration survivable. If you're much lighter than what you're docking with, you won't knock the satellite out of orbit.

Heck, you could even ignore the jet aspect, and go for full rocket. (An ICBM only weighs around 50 tonnes, after all.) Getting the heavy thing into orbit, though, requires proper, multi-stage rockets.


> It's quite easy, using a jet aircraft, to (via a ballistic trajectory) get into space

A jet engine requires atmosphere. Space has no (or very little) atmosphere.

The edge of space is usually defined at 100 km (the Kármán line).

The highest altitude reached by an air-breathing jet is around 37 km. https://en.wikipedia.org/wiki/Flight_altitude_record

A scramjet might get to ~75 km according to: https://aviation.stackexchange.com/questions/44837/what-is-t...


That's the point at which the jet stops working, but by that point you can be going quite fast. If there isn't enough air to run the jet engine, there also isn't enough air to slow you down (much).


Assuming no air resistance and 10 m/s^2 gravity, I calculate that you would have to be doing ~1,100 m/s (~Mach 3.2 at sea level) straight up at 37 km to reach 100 km, or ~1,600 m/s (~Mach 4.7 at sea level) at 45 degrees from vertical.
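
A quick sanity check of those numbers (a minimal sketch under the same assumptions: no drag above 37 km, constant 10 m/s^2 gravity, ~343 m/s for Mach 1 at sea level):

    import math

    g = 10.0             # m/s^2, as assumed above
    h = 100e3 - 37e3     # remaining climb from 37 km to 100 km, in m
    mach1 = 343.0        # speed of sound at sea level, m/s

    v_up = math.sqrt(2 * g * h)                # energy balance: v^2 / 2 = g * h
    v_45 = v_up / math.cos(math.radians(45))   # vertical component must still equal v_up
    print(f"straight up: {v_up:.0f} m/s (~Mach {v_up / mach1:.1f})")  # ~1122 m/s, Mach ~3.3
    print(f"at 45 deg:   {v_45:.0f} m/s (~Mach {v_45 / mach1:.1f})")  # ~1587 m/s, Mach ~4.6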


Seems about right. https://en.wikipedia.org/wiki/Project_HARP (1966) fired a projectile at 2,100 m/s from a ground-based cannon, reaching 179 km.


(Spoilers) I think the part about the guy whose brain traveled to the Trisolarans and who communicated back by telling a fairy tale may be a story about the author himself: he's also in an authoritarian regime and unable to say things at face value, so he communicates by telling a "fairy tale" (SF in his case) full of metaphors, which the reader needs to decode first to get the real meaning.

OK, maybe I'm reading too much into it, but if it were too obvious, Liu would get into trouble, so plausible deniability is important. I think the books are much more political than many think. But you need to decipher them first...


> The best models

Yes, models. But a model doesn't need to be "real"; it just models something real to a certain extent. But models tend to break down if you look closely enough, and I think this may also happen to QM at some point.

Bell's theorem doesn't rule out determinism, it only rules out local hidden variables. If the universe is non-local, Bell's theorem fits well with determinism.

> We don't live in a billiard ball universe

We do - at least as long as we look at clumps of matter. The billiard-ball universe breaks down when we look at the constituents of matter, but somehow it re-emerges when we put enough of those constituents together. Why this happens is probably one of the biggest riddles in physics. But it does.


> But models tend to break down if you look closely enough, and I think this may also happen to QM at some point.

Sure, but there's absolutely nothing to suggest that it will be some kind of deterministic computation underneath.

> We do - at least as long as we look at clumps of matter.

Not even. Even non-quantum clumps of matter are influenced by continuous fields and dilation effects from both special and general relativity. Even without QM, our universe is not efficiently simulatable on our computational models, because of general relativity.


> Sure, but there's absolutely nothing to suggest that it will be some kind of deterministic computation underneath

The universe behaves very deterministically if we look at "clumps of matter". Why would it be this way if determinism weren't already part of the "base"? For me that's at least a "suggestion". Not a proof, of course, but still a hint.

> ... because of general relativity.

General relativity doesn't fit together with QM, so either one is (or both are) "wrong" (in the sense that they only approximate reality to a certain degree).

I'm even sceptical about special relativity: it's a good model and works well on most occasions, but it may still be wrong on a fundamental level. Most of the assumptions under which Einstein proposed SR (no QM, a static universe) don't hold anymore.


> The universe behaves very deterministically if we look at "clumps of matter". Why would it be this way if determinism weren't already part of the "base"? For me that's at least a "suggestion". Not a proof, of course, but still a hint.

Just because a system is randomized doesn't mean it's not predictable: when measured in certain ways, it will statistically tend to clump around certain states. Suppose that every second, I flip a magic random coin and walk either 2 feet forward or 1 foot backward. Then after a million seconds, you'll quite probably find me about half a million feet from where I started. Small-scale random processes can easily create something predictable on the large scale.
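
A quick simulation of that coin-flip walk (a sketch of the example above, nothing more):

    import random

    def walk(steps=1_000_000):
        # Each second: +2 ft with probability 1/2, otherwise -1 ft.
        pos = 0
        for _ in range(steps):
            pos += 2 if random.random() < 0.5 else -1
        return pos

    # Expected drift is +0.5 ft/step, so ~500,000 ft after a million steps;
    # the per-step std dev is 1.5 ft, so fluctuations are only ~1,500 ft.
    print(walk())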

Still, I wouldn't characterize "clumps of matter" as being deterministic even in our everyday lives. There are many chaotic systems in this world, e.g., the weather, which can amplify randomness on the molecular level into a completely different state. Even the orbit of the Earth becomes unpredictable after several million years.

> I'm even sceptical about special relativity: it's a good model and works well on most occasions, but it may still be wrong on a fundamental level. Most of the assumptions under which Einstein proposed SR (no QM, a static universe) don't hold anymore.

Special relativity is already 'wrong' in that it doesn't predict any of our observations of general relativity. But it unavoidably has plenty of truth in it, in that it is very successful at predicting an identical speed of light for all observers, and the effects (e.g., time dilation) that this implies. Any superseding theory has to explain the same observations, at which point special relativity will continue to act as a useful model for the large-scale effects.
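
For instance, the time dilation implied by a constant speed of light is just the Lorentz factor; a tiny illustration (standard formula, example speeds my own):

    import math

    def gamma(beta):
        # Lorentz factor for speed v = beta * c.
        return 1 / math.sqrt(1 - beta**2)

    for beta in (0.1, 0.5, 0.9, 0.99):
        print(f"v = {beta}c: moving clocks run {gamma(beta):.3f}x slower")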


> Just because a system is randomized doesn't mean it's not predictable

That's of course true (in fact, I also tend to believe in a non-deterministic universe "at the core").

But if determinism falls out in the end, it's still a hint that there may also be deterministic effects at the root. Current observations can't rule that out; it's just our model that assumes pure randomness. And there are lots of ways randomness could sneak into QM without contradicting observation.

And unless we solve the measurement problem in QM (by finding a unified theory from which both Schrödinger's equation and Born's rule can be derived), it's still an open question. So considering it solved today is quite premature.

> ... chaotic systems ...

That's still deterministic. Sure, there may be some influence from quantum effects which are then amplified, but the dynamics of the chaotic system itself are still deterministic.

> (SR) ... predicting an identical speed of light for all observers

That's not really true. "Identical speed of light for all observers" is an observation which has been replicated quite often. SR is one way to explain this observation, but even before SR, Lorentz already had a different model explaining it. SR won because Lorentz used an (at the time) unobservable "ether", and Einstein argued that it's better to apply Occam's razor and throw the "ether" away.

But Einstein didn't know about QFT, the Big Bang, or the microwave background, which all contradict Einstein's assumptions: QFT uses an "ether-like" vacuum, the Big Bang created a "t=0" for the universe, and the microwave background also provides an absolute reference frame for an absolute time. All of this contradicts SR, so maybe SR is really wrong on a global level.

Which in turn would allow a non-local, realistic interpretation of quantum measurements because without SR simultaneity could be back on the table.


> But if determinism falls out in the end, it's still a hint that there may also be deterministic effects at the root.

What I'm saying is that it's a hint of absolutely nothing. Deterministic systems can very easily produce deterministic large-scale behavior, and randomized systems can also very easily produce deterministic large-scale behavior. Since the large-scale behavior is the same either way, it gives us no predictive power over its ultimate cause, in the Bayesian sense.

> That's still deterministic. Sure, there may be some influence from quantum effects which are then amplified, but the dynamics of the chaotic system itself are still deterministic.

Your argument is that because we see "determinism falling out in the end", we should also expect "determinism at the root". But I argue that in the real world, we don't even see "determinism falling out in the end". On short timescales, computers appear to simulate finite-state machines, and the Earth appears to move in a steady pattern around the sun. But looking further out, the computer ultimately turns to dust, and the Earth wobbles out of its current path, thanks to the chaotic dynamics of the solar system. That doesn't sound very deterministic to me, unless we baselessly assume a priori that they have a deterministic cause.

What determinism do you argue does truly fall out in the end?

> That's not really true. "Identical speed of light for all observers" is an observation which has been replicated quite often. SR is one way to explain this observation, but even before SR, Lorentz already had a different model explaining it. SR won because Lorentz used an (at the time) unobservable "ether", and Einstein argued that it's better to apply Occam's razor and throw the "ether" away.

In that case, we have two different interpretations that yield the exact same outcomes. Thus, I'd say that they're really just two different descriptions of the same model: they're equally correct, and Lorentz's description is just dispreferred due to being more difficult to work with.

> All of this contradicts SR, so maybe SR is really wrong on a global level.

There's nothing in SR that says that "most" matter can't follow the same reference frame. It just says that your reference frame has no bearing on the laws of physics you perceive, contrary to older models of the ether.

As I said, we already know that SR is wrong in that it doesn't predict any of the effects from GR, cosmology, etc. It's not an end-all-be-all theory of everything. But it doesn't stop it from giving good predictions for most places in the universe.

> Which in turn would allow a non-local, realistic interpretation of quantum measurements because without SR simultaneity could be back on the table.

You can do all that today, by specifying a reference frame that you want to consider. After all, that's how QFT does it, since it's mostly concerned about local effects. But you won't get different results from what SR predicts (in particular, the physics won't change if you look at the same system in a different reference frame), except in the circumstances where we already know it's incomplete.


> What determinism do you argue does truly fall out in the end?

Mechanics is fully deterministic. The question is whether there is some kind of "QM random generator" mixed into this, making things nondeterministic in the end. But it's possible to separate the two, and the "big clumps of matter" part is then fully deterministic, because decoherence generally happens so fast that it doesn't matter. You need to prepare systems quite carefully to mix quantum randomness in (as in Schrödinger's cat, for example).

> In that case, we have two different interpretations that yield the exact same outcomes

Only for "harmless cases". SR allows lots of strange stuff, especially if combined with gravity. Closed timelike curves for example.

But if time is absolute and only slowed down for objects moving against this background, then closed timelike curves couldn't exist. Also, the trick with Kruskal–Szekeres coordinates wouldn't work anymore, because switching time and space would be unphysical. That way we wouldn't have to care about the singularity (at least in Schwarzschild BHs) anymore, because space would cease to exist behind the horizon of a BH and there would be no singularity.

> You can do all that today, by specifying a reference frame that you want to consider

But that wouldn't work with measurements of entangled objects, because there would be no way to define an absolute frame in which the change of the wave function into an eigenstate happens; it would always depend on the frame of the observer. QM requires that the change happens simultaneously, but SR doesn't allow absolute simultaneity.

Of course, the problem with all of this is that at the moment I can't see a way to do experiments which decide whether there is absolute time or whether SR is correct.


"Thou" sounds very similar to german "du" which is the current informal form. In older german the second person plural ('Ihr', similar to "vous" french from which "you" may come) was also the formal form, but it's out of fashion for a few centuries now.


Indeed, it's the same word! In the ancestor language of both English and German (and also Dutch, Low German, and many others) it's been reconstructed as "þū".


What is the ancestor language, and how do you pronounce þū?


Proto-West Germanic, by current thinking (at least, according to Wikipedia).

https://en.wikipedia.org/wiki/West_Germanic_languages#Validi...


An "unpure FP" is a procedural programming language, because "unpure functions" are generally called "procedures".

But people tend to avoid the name "procedural" at all costs, which is bad, because procedural programming really has its advantages and should be clearly separated from FP, which also has certain advantages.


Lisp is not an FP language. It's a "list processing" language with some features based on the lambda calculus.

Real FP means referential transparency, and that started with languages like Miranda and later Haskell.

What many people consider "FP" today is in fact just procedural programming with higher-order procedures and lexical closures.
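
To make the distinction concrete (my own sketch, in Python for brevity): both functions below use higher-order style, but only the first is referentially transparent, i.e. a call can always be replaced by its result:

    from functools import reduce

    # Referentially transparent: same input, same output, no side effects,
    # so total([1, 2, 3]) can be replaced by 6 anywhere in the program.
    def total(xs):
        return reduce(lambda acc, x: acc + x, xs, 0)

    # "FP-looking" but impure: it mutates external state, so calling it
    # twice is observably different from calling it once. A procedure.
    log = []
    def total_logged(xs):
        log.append(list(xs))  # side effect: hidden mutation
        return total(xs)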


Lisp is not even a language; it's a family of languages. It started as a "LISt Processor".

A lot of early FP teaching was done in Lisp. Some kind of "pure Lisp" was used: an imaginary subset of Lisp restricted to side-effect-free, non-destructive functions.

FP started quite a bit before Miranda.


It's a little hyperbolic, but in principle it's true, especially in Germany. Of course you can like the concept of redistribution of income and wealth, but that doesn't make it "blatantly wrong and reactionary".


Yes, it's quite complicated now (but it's getting better).

But it's not as easy as replacing it with something simpler. If you do, you will discover all the things you can't do with it. So you start to improve it, add features, etc. And after a short time, it's probably even more complex than HTML/CSS and still not as flexible and powerful.


There's violence and there is violence. While I can enjoy an action movie or a UFC match, watching images from the Holocaust still makes me sick.


It's a gradient: Fake violence like movies and WWE, "consensual" violence like MMA, and non-consensual violence like atrocities.


If someone tries to explain stuff based on the "human psyche", it's important to understand that people aren't the same. If someone talks about "we" or "humanity" as a whole, he/she doesn't take this into consideration, which makes the whole argument questionable.

An estimated 2-5% of all humans are so-called "psychopaths": people without the ability to feel empathy for other people's pain. These people are overrepresented both among prison inmates and among holders of executive positions. It's entirely possible that most violence in human history is the result of these kinds of people, while the remaining 95-98% are simply their victims.


>If someone tries to explain stuff based on the "human psyche", it's important to understand that people aren't the same.

It's also very important to understand that people are more alike than different.

The "not the same" thing I've found to be mostly based on superficial differences, whereas all the basic imperatives (sex drive, evolutionary instincts, core cultural values, etc are mostly the same wherever).


While people are of course more alike to each other than, say, humans and cows, there are still a lot of differences between individuals. People tend to assume that others think as they do, but that's one of the most common sources of misunderstanding.

There are people who could torture a child to death without feeling the smallest amount of guilt (maybe they would even find it funny or satisfying), others who could do it with the right amount of indoctrination, others who would only do it if their life were in danger (and would feel horrible afterwards), and others who would rather die than do it.

And it's not possible to shift a person between those categories; it's mostly inborn (like sexual orientation, hair color, etc.).

Sure, all those people have lots of other "basic imperatives" in common, but the small difference that some can inflict as much pain as possible on others without feeling even a little bit guilty is a difference which should have quite severe consequences for society as a whole.

So if we try to understand why there is so much violence in the world, it would be quite an oversight not to take those kinds of differences into account.


"And it's not possible to shift a person people into all of those categories, it's mostly inborn"

You're making a helluvan assertion there. You're going to need a citation, because psychology has no clue about this.

-signed, A Psychopath


Are you doubting the results of personality research, or that personality is genetic? A simple Google search will show that there is plenty of evidence that personality is partly genetic:

https://www.google.com/webhp?sourceid=chrome-instant&ion=1&e...


I'm doubting the sentence quoted.


> An estimated 2-5% of all humans are so-called "psychopaths": people without the ability to feel empathy for other people's pain

And another 90% have a herd mentality: people who can easily mold their personality to match the dominant type in their in-group. There have been numerous studies showing that group mentality can override people's personal moral boundaries.


90%? Could you please provide a citation for such a large number?


Psychopathy might be a factor, but I don't think the instinct for violence is limited to psychopaths.

Non-psychopaths simply dehumanize their victims to suppress that empathy.


Psychopaths have the ability to not feel other people's pain. Super important distinction. Unless you're just hate-mongering. In which case sure, they're as good a scapegoat as any.


There seems to be some evidence that people who are highly psychopathic do have underlying physiological differences which give rise to reduced empathy: https://www.sciencedaily.com/releases/2009/08/090804090946.h...

I think we all have the ability to not feel another's pain. For instance, dehumanization is a classic technique in war propaganda.


My notion of these people (developmental and increasingly genetic) is that they exist as the scar tissue of humanity. I fear this new propaganda will only leave more scars than the old.

