Hacker News | lchengify's comments

This only would have worked if it was animated. The 1977 Hobbit film, which was produced for NBC [1], was an excellent adaptation, and Yellow Submarine is in the same style. I would argue Yellow Submarine also has many of the same themes, though it leans more absurdist than fantastical.

Also, fun fact: I just realized you can stream Yellow Submarine for free on archive.org [2]

[1] https://en.wikipedia.org/wiki/The_Hobbit_(1977_film)

[2] https://archive.org/details/yellow-submarine-1968_film


> The 1977 Hobbit film, which was produced for NBC [1], was an excellent adaptation

I've not seen the 1977 Hobbit film, but I note that your reference [1] prominently says: "The Tolkien scholar Douglas A. Anderson called the adaptation "execrable"; the author Baird Searles called it an "abomination" and an attempt that had "failed miserably", regretting the quality of the animation and of the soundtrack, and the omission of key plot points."


The 1977 Hobbit has its issues, but having watched it recently I feel like it does well with capturing the vibe of the book, especially the first half (it does really fall apart later). I personally find Peter Jackson's version of the Hobbit unwatchable for the same reason... he takes the LOTR vibe and imposes it onto a much more innocent story.


The thing that I like about the Hobbit '77 film is that it realizes the material is a children's fantasy book. It's a breath of fresh air in our current era -- where VCs seem to understand that any attempt to profit off of the American taxpayer's defense spending requires some Tolkien reference in the company name.

The existence of the '77 animated adaptation is our last reminder that the Generals green-lighting all this spend are literally mid-witted man-children.


That Thiel named his company Palantir has always struck me as odd.

In the books, the palantíri were "seeing stones" gifted to men by the elves; they could be used to see events remotely and to communicate mind-to-mind. But after Sauron acquired one, using them became perilous, as the Dark Lord could limit what users saw, effectively controlling their information flow, even if he could not overwhelm or twist them directly via the mind-to-mind contact. While Aragorn does manage to wrest one free of Sauron's control, allowing him to save the day at a key moment, overall they are a powerful tool for the forces of evil.


I think it's just because he had no illusions as to the good and bad uses it would bring. I've used Palantir Foundry heavily at work, and it is good for remotely viewing events and communicating mind-to-mind to executives with pretty dashboards. Foundry definitely has nicer optics than their Gotham platform used by US law enforcement, since e.g. it helps Airbus identify issues in their plane fleets before they occur.

Plus from talking to the Palantir engineers, the CEO and Thiel are both weirdo nerds, so it's fitting.


If you’re going to build a powerful tool for governments don’t you want a constant reminder of how even tools for good can be corrupted by evil?


I put the probability of them having that mentality when choosing the name at literally zero.


It's not even the specific name, but simply using any name from Tolkien.

Tolkien didn't like the consequences of technological progress. While engineers were not inherently evil in his works, they were at a high risk of becoming evil, or at least instruments of evil. Think of characters like Sauron and Saruman, or Fëanor and Celebrimbor. If you give a Tolkienian name to your tech company, you are implying that you are one of those guys.


If you realize that Peter Thiel intends to use Palantir exactly for that purpose (controlling information flow, and as a surveillance system), it makes complete sense.


Having worked on three- and four-star staffs while I was in uniform, I can assure you that they are decidedly NOT that. The folks who make it to that level are sharp as hell and usually do not suffer fools gladly.

Your glaring anti-military prejudice aside, the "spend" you refer to is Congress's job, not that of senior officers.


The Beatles had nothing to do with the production of Yellow Submarine, apart from money, until the live-action cameo at the very end.


Ok I didn't believe you at first, but holy crap you're right, they didn't even do the voices.

Not going to lie, part of my childhood just died a little :(. "Help!" is them, though. I watched that too.


It's still a great film.


How is that film just available for free like that?


Almost any released movie of any note, and a great many of no note, are available for free.


Presumably because no one's made them take it down.


I used to use the WeWork on the bottom floor (technically a partial basement). This is definitely the vibe: huge open floor, no windows. WeWork did an OK job trying to light it, but I could never shake the feeling that it felt like a scene from Fallout.

My two cents: If this building ever becomes popular again, it'll be because of the location and not because of the building itself. It's reasonably close to "Van Mission" (the rebranding of that part of Market Street for high-rise residential), a BART stop, and Hayes Valley. It's probably one good Twitter-esque city tax subsidy away from being fully occupied in 5 years.


This is correct. For the most part, you are taking the money from your profit centers to invest in new initiatives. Could be geos, could also be verticals, etc.

This is not at all unusual for a company where the top line is growing. As a point of comparison, Amazon was founded in 1994 and not profitable until 2003.


The difference with Amazon is that the barrier to entry for creating a worldwide distribution business is far, far greater than for creating a ride-hailing app. Amazon was losing money to invest in valuable real estate, buildings, and other big-ticket capital items that retain worth, whereas Uber is basically giving away subsidized rides hoping that people will pay more in the future.


I spoke with an Uber driver about this recently, and one of Uber's barriers to entry is their phenomenal data on roads. As an example, they know, for every apartment complex, where you'd drive around inside to drop off food, as oftentimes you can't just go to the front. Or which parts of a road have no-stopping zones, along with their schedules. Bus stops too. And the app will guide you to a pickup spot where the Uber driver is actually allowed to stop.

Drivers are interested in those features because it makes them more efficient. And having a critical mass of drivers is what makes it possible to get a ride in a few minutes. There are other upstarts, but they don't have many drivers, and your potential user market doesn't scale linearly with drivers because nobody wants to wait 30 minutes to get a ride (even with crazy discounts).


I am an Uber driver. I'm sure they have the things you describe in some places. Where I am, the app regularly tells me to drive off the side of a bridge because it believes there is an at-grade intersection in a highway. Despite my repeated reports of the issue, it persists. All the Uber maps in my area are at least 1-2 years out of date and are nearly useless for navigation. I pretty much ignore them a lot of the time.


This was one of my experiences with Uber versus some other ride-hailing apps. The Uber app seemed to understand the airport: it would guide me toward a pickup zone and help me relay to the driver which zone I'd be at, while, at least the last time I used them, other ride-hailing apps tended to just show wherever I happened to be at the time.


Amazon and Uber are definitely different, but in 2003 Amazon's market cap was about $21B, or ~$36B in today's dollars. Uber is actually bigger than that in value today (~$144.11B).

Amazon was far from a dominant player in 2003, and AWS wasn't launched publicly until 2006.

From a product standpoint, as others have stated, Uber is a real-time services marketplace vs Amazon, which is more about physical goods (again, excluding AWS, which is technically a service). Most of their value is the groundwork that keeps the marketplace balanced, which is a tricky marketing and econometrics problem. One need look no further than Lyft to see how hard it is to keep the "5 minutes away or less" guarantee.

Also, to those who think the app is a trivial technical achievement, I would recommend reading some of the blog posts that go into the crazy technical challenges they hit [1]. In some cases, in order to make the app work in all geos, they ran up against practical limits on binary size at Apple. Not to mention that geo / waypoint data is a genuine "big data" problem and not easily reproduced by just any company.

[1] https://blog.pragmaticengineer.com/uber-app-rewrite-yolo/


Their barrier is definitely not their app.

However, I would not want to own a lot of Uber shares with the way FSD is progressing since v12.


So if MS-DOS 4 was released in 1986, and it is now 2024, that's a 37-year gap between release and open source.

That means Windows XP should be open sourced by ... 2038. Not as far away as it seems. I'll add it to my calendar.


I doubt Microsoft would ever open-source any NT Windows versions because the current ones are based on the same code, just with added touchscreen nonsense, adware, and overt contempt for the user.

We may see Windows 9x open-sourced. But then again, it's a stretch because the Win32 API is still in wide use today. Releasing the sources for 32-bit Windows versions even this old may have an adverse effect on Microsoft's market domination.

But maybe ReactOS will reach beta by 2038. Does this count as an open-source version of Windows XP? :D

If you really wish to look at XP sources and don't care much about the legal aspect of it, you can do so right now. They were leaked.


> Releasing the sources for 32-bit Windows versions even this old may have an adverse effect on Microsoft's market domination.

I disagree that releasing Windows 9x source code would have any impact on MS market domination.

> I doubt Microsoft would ever open-source any NT Windows versions because the current ones are based on the same code

Nowadays releasing something NT-based like XP may seem crazy. But in 15 years it will be so far away from future Windows, that it won't be that crazy.


> But in 15 years it will be so far away from future Windows, that it won't be that crazy.

It's not like the NT kernel will be going away from current Microsoft products anytime soon.


NT sources leaked, same for 2000. There is also leaked DOS 6 beta. The only thing releasing stuff this old brings is nerd goodwill.


All open-source projects that deal with reimplementing parts of Windows, particularly Wine and ReactOS, consider those leaked sources radioactive and would not accept any patches if there's even the slightest suspicion that the patch author gleaned anything from those sources. Those same sources officially released under an open-source license would change that.


I wouldn’t assume Microsoft execs view increased capabilities to run windows programs in Linux as a bad thing, when they think about the matter at all. They would certainly prefer that such a capability be developed by someone else, so they don’t have to support it.


> I doubt Microsoft would ever open-source any NT Windows versions because the current ones are based on the same code, just with added touchscreen nonsense, adware, and overt contempt for the user.

Initiatives like MinWin and OneCore, secure kernel, device guard,... caused lots of rewrites and moving code around.


2038, you say? If your calendar is based on Unix epoch time, then ensure that you have upgraded to 64-bit timestamps before then.
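For the curious, the rollover moment can be checked in a few lines of Python (a quick sketch of the 32-bit signed time_t limit, nothing specific to any particular calendar app):

```python
import datetime
import struct

# Largest value a signed 32-bit Unix timestamp can hold.
MAX_32BIT = 2**31 - 1

# The corresponding UTC date: when 32-bit time_t runs out.
rollover = datetime.datetime.fromtimestamp(MAX_32BIT, tz=datetime.timezone.utc)
print(rollover)  # 2038-01-19 03:14:07+00:00

# One second later no longer fits in a signed 32-bit field.
try:
    struct.pack("<i", MAX_32BIT + 1)
    print("still fits")
except struct.error:
    print("overflow: time to upgrade to 64-bit timestamps")
```

So a calendar entry late in 2038 would indeed land on the wrong side of the rollover if it were stored in a 32-bit signed field.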


ReactOS will still be buggy AF by then I'm sure. I had hoped they'd at least have it to Windows 2000 alpha levels by now.


A lot of XP components are still in use in modern Windows, whereas DOS was completely replaced around the time Windows XP came around.


Around the time Windows 2000 came around.

Up to Windows 3.11 it was a GUI on top of DOS. Windows 95, 98, Me used DOS to boot and it was still possible to stop the booting process at a DOS prompt (although in Me this was no longer official). Finally Windows 2000 had nothing to do with it as it is NT based.


Windows 2000 was part of the professional NT line, though, and was the companion of Me for the millennium releases. As far as I know, 2000 wasn't marketed to home users. I think what the comment you replied to is saying is that the transition away from DOS wasn't completed for both professional and home markets until XP, which unified everything under NT for all markets.


Around the year 2000, I was studying computer science at a university. Most of their PCs ran on Windows 3.1. I was using it at home. But one day, Microsoft sent me an offer: I could purchase the student release of Windows 2000 workstation for a mere $25.00. I went for it, and found it better than the Windows NT naysayers at school said. I don't know why I was contacted. Probably because of other Microsoft programs I'd bought at the student bookstore.


Windows 2000 was a pretty great OS. Used to enjoy using a Litestep shell instead of explorer. While it wasn't great for a lot of games, many did run fine. I liked it a lot better than OS/2 that I ran previously.

I generally ran 2-4x the amount of RAM as most did. Still do. Pretty sure this made a lot of the difference.


Hey, Litestep, what a blast from the past :)

I ran it until it wouldn't run sensibly anymore in Windows 10. I then ditched Windows for Linux soon after. I can recommend KDE Plasma if you want something that's sorta configurable enough, like Litestep was.


I remember running both litestep and windowblinds. I can't remember which one I liked better.


WindowBlinds is a window decoration customizer - LiteStep does nothing of the sort :) LiteStep completely replaces explorer.exe as the shell host, and you can then customize what functions you want to have in your UI. The windows themselves would stay looking the same.


Windows 2000 Pro was what I used at home for a long time and it was great. NT 3 and 4 were absolutely terrible, which might explain your NT naysayers at school. I never once had to reapply a service pack in Win2k.


Still remember the first time I touched Windows NT 4. Half an hour into work experience: I opened up a printer dialogue and set a setting that hard-crashed the PC; then slowly every other PC in the building followed, as soon as they tried to print (i.e. just as they had _finished_ whatever they were working on, but often just before they _saved_ it).


I liked NT4. The only reason I upgraded to 2000 was for a newer version of directx (6.0 I think?).


This is accurate; the 2000 line targeted business, and if you remember having a consumer computer with 2000 Pro, it didn't support a lot of hardware.


Can confirm. I upgraded my 98 box to 2000 and never did get some of my hardware working. When I told people I was using 2000 everybody assumed I had stolen it from work. I didn't. My friend stole it from work and shared it with me ;-)


A license key of 11111-1111111 worked, if I remember correctly. :-)


The nice part of that pain came when XP was released: Win 2000 drivers mostly all happily loaded into Win XP!


Drivers were kind of a mess from what I remember in 2000, especially on the graphics card side of things. The HW vendors needed more time to switch over.


Tangent, but Windows NT had a POSIX subsystem for a while.


Kind of, for a very long while. You then had a descendant, SFU, from some service pack of NT4 through XP / Server 2003, and then a further one, SUA, until Windows 8 / Server 2012, with some code flowing between various companies along the way. I think SFU still used the POSIX NT subsystem core. Probably SUA as well, although I'm less sure. Not really the case for WSL1, though (although the core NT kernel was probably more ready to support it, thanks to its history).


I knew about SFU, Services For Unix. What was SUA?


Subsystem for Unix-based Applications also known as "Interix"


From NT 3.1 until Windows 8.0. Windows 8.1 removed it, and Windows 10 offered WSL1 as its replacement.


Windows 9x and ME still used bits of DOS beyond bootstrapping. They used config.sys to load drivers.


Windows 2000.

Also NT4, NT3.51, NT3.5, NT3.1...


Even ancient Windows includes many 3rd-party libraries. I would not expect any Win 9x or NT 3.51+ version of Windows to be open sourced in its entirety. I hope I'm wrong.


Yeah, just the font stuff was such a mess. I’m hopeful someone will power through those problems.


I'm more interested in them open-sourcing something from the 3.x/9x line.

NT seems to have been far more studied, and of course there were the infamous leaks of those along the way.


We need to wait for the NSA backdoors to expire first ;)


I have a lot of tech / healthcare couples in my friend group and this is definitely the arrangement. Especially true since healthcare careers have very different timelines than tech.

In one instance, one partner is a clinician and absolutely has to be on site 5 days a week, not counting on-call. The other works 100% remote, but the company is global, so depending on the week they may be on Europe time or Australia time.

This is definitely a work arrangement I couldn't have imagined being common 10 years ago. This shift will likely be one of the defining economic changes between the 10's and the 20's.


> Engineers, saving your program time and money out of the sheer laziness of not wanting to make a new XML format for an instrumentation project. This is how progress is made in the world, I guess.

I've worked in healthcare, fintech, and ads, and this is one thing I've done in all three fields. I swear I've written or debugged XML parsers in 20 different languages at this point, just so I didn't have to get consensus on a new format.
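To give a flavor of why reusing XML beats inventing a format, here's the kind of tiny parser I mean, in Python with only the stdlib (the instrumentation tag and field names are made up for illustration):

```python
import xml.etree.ElementTree as ET

# A hypothetical instrumentation payload; element and attribute names are invented.
doc = """
<readings>
  <reading sensor="temp" unit="C">21.5</reading>
  <reading sensor="pressure" unit="kPa">101.3</reading>
</readings>
"""

# Parse into a {sensor: value} dict; no custom format, no new spec to agree on.
root = ET.fromstring(doc)
values = {r.get("sensor"): float(r.text) for r in root.iter("reading")}
print(values)  # {'temp': 21.5, 'pressure': 101.3}
```

Every mainstream language ships a parser like this, which is exactly why the path of least resistance wins.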


We made our XMLs with, horror of horrors, a Visual Basic script that ran in Excel and digested several input documents to generate a map template that we could then tweak by hand and turn into an XML through another VB script.


Honestly, makes sense. This is how much of finance runs their models.


We weren’t allowed to have any other real programming tools, and the telemetry “maps” we were trying to make were/are major/minor frame oriented. This maps nicely to a grid of data: a spreadsheet.

IRIG 106, Chapter 4 PCM telemetry covers what we were doing in this process, along with Chapter 9.
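For a rough flavor of that grid-to-XML step (element and column names here are hypothetical, and this is nothing like the real IRIG 106 tooling), the spreadsheet-digesting pipeline can be sketched in Python:

```python
import csv
import io
import xml.etree.ElementTree as ET

# A pretend minor-frame map as it might come out of a spreadsheet export.
# Real PCM telemetry maps carry far more per-word metadata than this.
grid = io.StringIO("word,parameter\n0,SYNC\n1,ALTITUDE\n2,AIRSPEED\n")

# One XML element per frame word: the grid rows become the map template.
root = ET.Element("minorFrame")
for row in csv.DictReader(grid):
    ET.SubElement(root, "measurand", word=row["word"], name=row["parameter"])

print(ET.tostring(root, encoding="unicode"))
```

The hand-tweaking step in the original workflow would happen between generating this template and emitting the final XML.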


I feel your pain. I've written entire applications in Visual Basic in Excel onboard the CVN before. It was the only programming language I could get access to.


Wait, what? Why weren't you allowed to have real programming tools? I mean, I see a lot of FTEs using Excel, but I thought it was a matter of familiarity. I've also encountered some who are using Matlab, SAS, Python, R, etc.


US DoD heavily controls what can and can't be installed on their stuff. To make matters worse each branch and organization has their own approach to how they control what gets installed.

Sometimes it's just the path of least resistance to use what you already have.

I remember putting in a request to install Python. It took me 6 months to get a response of no. I had the opportunity to appeal with more information on the use case, but I just did it in VB for Excel at that point.


While that's true, FTEs (Flight Test Engineers) tend to have more leeway. As I said above, I've seen Army S6 give approvals for all sorts of programming environments. And the AFTCs seem to be a bit more lenient than that even when it comes to deploying in SCI environments.

Granted, I've also seen a piece of software denied because it has USB in the name (even though it had nothing to do with USB), so YMMV.


Oh boy, were you stuck with NMCI devices? At NSWCCD we at least had an RDT&E network we could use...


If you're interested in a visual of how different welds interact with the metal, I watched this video on SpaceX's Starship welding that was helpful [1].

Something I try to remember is that most inorganic materials gain their properties from a precise lattice structure. This often doesn't come across when looking straight at the chemistry equations. Welding both disrupts and rebuilds this lattice structure, which is why it requires so much skill to do at volume.

[1] https://www.youtube.com/watch?v=CP8Hbr2jL_c


I jumped to 3:16 and was very confused for a moment.


Only tangentially related, but I highly recommend watching Veritasium's YouTube video on electricity if you're curious as to how Maxwell's fields create the current / amp abstractions in EE [1].

It's a common misconception that the electrons or the current transfer the energy. In reality, it's the electric field that exists between the wires that is doing the heavy lifting; the electrons in the wires are just controlling the field.

This always confused me, and I was very irritated, when I first learned electromagnetics, by how rote all the initial material is. I wish more work were put in earlier to relate everything back to Maxwell's equations, to make it make sense.

[1] https://www.youtube.com/watch?v=bHIhgxav9LY
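The "energy travels in the field" point does fall straight out of Maxwell's equations, via the Poynting vector (standard notation; nothing here is specific to the video):

```latex
% Energy flux density of the electromagnetic field (Poynting vector):
% energy flows where E and B coexist, i.e. in the space around the wires.
\mathbf{S} = \frac{1}{\mu_0}\,\mathbf{E} \times \mathbf{B}

% Poynting's theorem: the field energy density u decreases by whatever
% flows out (\nabla \cdot \mathbf{S}) plus whatever is delivered to the
% charges (\mathbf{J} \cdot \mathbf{E}, e.g. resistive heating in the wire).
-\frac{\partial u}{\partial t} = \nabla \cdot \mathbf{S} + \mathbf{J} \cdot \mathbf{E}
```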


I always explain it to people like waves in the sea. The bobbing up and down isn't the water molecules travelling; they are simply going up and down as the wave moves through the water. People seem to accept this analogy, as, even if the water thing is new information to them, it's easier to visualise.


That's not what the parent poster and Derek "Veritasium" Muller are harping on.

It's not about the misconception about "AC is vibrating so how can electrons be delivering their energy from the power plant to the light bulb far away?"

They are talking about how the electric field is outside the wires almost entirely.

Their argument is that in the water wave analogy, the wave wouldn't be in the water at all, because it's "actually" transmitted via an invisible field in the space above the water, which pushes back on the water farther away.

Most respected electricity/physics YouTubers disagree with Veritasium's emphasis on this perspective, by the way. They think he conflated the first misconception I mentioned with the second idea, which is about how you model electric circuits.


It's a practical simplification, not a misconception.

It's the same argument as "Einstein corrected the misconception that Newtonian mechanics is how bodies interact, and it's irritating how rote mechanical engineering of a car is."


FWIW, I don't think the author is talking about executives. I think he's talking about people on the level of ultra high-net-worth individual (UHNWI). People with roughly $30m or more in investable assets.

People like that prefer to be focused on exactly whatever they're building, and babysitting the money is rarely that thing. They often approach their private wealth manager with a broad plan of goals, and the PWM goes out and executes them with light direction. There are lawyers and accountants in the loop as well, all working on paper to manage the cash pile. It's a long-term, multi-generational relationship that often involves multiple parties, one of which is often some or many governments.

In addition, the PWM in this position prefers to get the client directly on the phone when possible. This is both to manage the relationship, and to sell novel finance products, often ones that aren't available on the open market.

Until we get AGI, it's just not quite there to be outsourced to AI. People at that level really prefer having a person be responsible.


For those who don't know, Josh Brown runs Ritholtz Wealth Management and hosts The Compound Podcast [1]. Probably one of the most insightful finance podcasts that covers a wide range of topics.

Highly recommend subscribing.

[1] https://www.youtube.com/@TheCompoundNews

