Hacker News | Instantix's comments

On the other hand, we have proof that an Airbus can hit another plane while landing and still get everyone out safely.


It's honestly astonishing how well the A350 held up for evacuation, which supposedly took 10 minutes. This is the first hull loss, and it actually improves the safety record.


Allegedly, some of it can be attributed to the extensive use of carbon fiber in the A350's fuselage and wings.


What can be attributed to the carbon fibre?


The plane faring better in the collision than expected and, allegedly, not burning through quickly and buying the passengers more time to evacuate.


My naive assumption would have been that carbon fibre would burn more easily than aluminium.


Well, yes, the plane did burn down almost completely. The praise I heard wasn't that it wasn't flammable, but that it (again, allegedly - I'm not a materials scientist) insulated the interior from the fire outside for long enough.


I agree for an ideal world, but the problem is often the wall of reality, i.e. TIME. Levelling the playing field takes time, and that's not doable when you have to deliver now, not in a month. So playing at a level everyone can sustain over the long run is certainly a better and more robust approach.


Right, but managers have the responsibility to validate that everyone not only has the right level of urgency but has it about the right thing. What naturally happens is that small deviations escalate into misalignment and, later, lack of trust. Managers should catch the small deviations and gently (a social skill) help teammates channel their energy and efforts toward taking the mission to success. And whatever the major constraint is for a given team, every business needs to de-risk its engineering investments, and nothing does that better than a great team culture and great vibes among teammates. If managers aren't the protectors of that, I see them as incompetent, failures, or frauds.


And perhaps the code was bad simply because it used a functional approach that nobody had mastered. QED


Yes, that's a likely cause - a newbie to functional programming shoehorning a functional approach into a language actively fighting against it.


Are you sure those "smartest developers" would still be smart if they had to maintain someone else's abstract code? It's easy to memorize layers of abstraction when they follow your own logic and you built them step by step.


Still happens. I work in a codebase that is definitely on the upper end of complexity for the industry. ~18 months ago we hired a very smart developer who had one of the more rapid onboardings to our codebase I've seen; he was very productive in under a month. It turned out he was too productive, and people started finding new layers of abstraction being added to the codebase. A few of the more senior folks had to come in and shut him down.


> shut him down.

Does this mean firing him?


No, it means helping him understand how certain types of contributions he makes are perceived by engineering leadership and giving him access to mentorship that would help him better channel his efforts in ways that would be seen as more productive and something to be rewarded.

If, over time, he continues to introduce more problems than he solves then he might be managed out but I'm hopeful that won't happen.


Bingo. It's different when you grow up with it.


Thanks for this post. Having a VCS at the time, I was aware of this cartridge's existence but never got a chance to have one. Today I can finally have a taste of it.

What was great about all those Atari cartridges was the illustrations. I wonder who the artist(s) were. Always inspiring sketches, and the guy drawn in this manual is not out of place. A real piece of art.


The artist for the image on the BASIC Programming cart was Rick Guidice. He did a lot of art for NASA in the same style. [1]


A big reason video artwork from that period is so interesting is that the game graphics were limited, forcing a lot of artistic interpretation. Semi related, I recently found this, which seems like a futuristic reimagining of the original Atari style. [2]

Atari was a real pioneer in this space. They got their start doing arcade games for the midway, which necessitates having eye-popping attraction art to draw in players and collect their quarters. They quickly developed their own style, which was pretty much copied all over the industry.

There's a great book called Art of Atari which I highly recommend if you are interested in the history of this stuff. [3]

[1] https://www.rickguidice.com/nasaart/nasaarti.html

[2] https://arcadeblogger.com/2022/12/04/what-if-ataris-pong-was...

[3] https://www.amazon.com/Art-Atari-Tim-Lapetino/dp/1524101036


Sometimes frameworks are a real help; sometimes using one just makes things bloated and hard-links the future to someone who has knowledge of the framework.


I would much rather “hard link” the future to a publicly available documented framework than one that a single person who thought their problem was a special snowflake wrote.


Which is why I've been paid good money to both maintain and bring up to speed old RoR applications that were so out of date you had to manually patch the C libraries just to get them working.

This attitude is common: treating these frameworks as if they weren't themselves dependencies to be managed and protected against.


I didn't know that transputers were made for the Amiga. Some years ago I did some research on the Atari one, and the conclusion was: inefficiency. The serial links connecting the transputers were a bottleneck, and the more transputers you added, the more task-distribution complexity you added, so the performance curve flattened.

I'm curious to know whether multi-core processors were designed with this lesson in mind.


The Atari Transputer system was unfortunately hampered by a number of design factors. Transputers did have a bottleneck with inter-chip communication, but generally you programmed to the transputer architecture, rather than a general-purpose computer architecture. This meant you broke up your data into chunks, let each transputer do its own thing, and then regrouped at the end, perhaps using a separate transputer to reassemble the data, much like you would with a modern GPU.

You could treat your functions either like a pipeline or like a matrix, but you had to create the algorithm to work with that architecture. Part of my job was showing companies who were deploying transputer-based systems how to port their work to this architecture. That said, the transputer-to-transputer communications bus was pretty optimal, for the time, compared to how most multiprocessor computers handled their work.
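The scatter/process/regroup pattern described above can be sketched in modern terms (this is illustrative Python with `multiprocessing`, not actual occam or transputer code; the squaring workload and chunk sizes are made up for the example):

```python
from multiprocessing import Pool

def process_chunk(chunk):
    # Each worker (a transputer, in the original architecture) works on
    # its own slice of the data independently, with no shared memory.
    return [x * x for x in chunk]

def chunked(data, n):
    # Split the input into n roughly equal chunks for distribution.
    k, m = divmod(len(data), n)
    return [data[i * k + min(i, m):(i + 1) * k + min(i + 1, m)]
            for i in range(n)]

if __name__ == "__main__":
    data = list(range(16))
    with Pool(4) as pool:
        # Scatter the chunks, let each worker run to completion, then
        # regroup the results, much as a dedicated transputer would
        # reassemble the data at the end of the pipeline.
        results = pool.map(process_chunk, chunked(data, 4))
    flat = [x for chunk in results for x in chunk]
    print(flat == [x * x for x in data])
```

The key point is that all coordination happens only at the scatter and regroup steps; between them, each worker runs without talking to the others, which is what kept the slow inter-chip links from dominating.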


The Spectrum was, first of all, a very beautiful object, thanks to Rick Dickinson. And, as a side note, he proved with the Spectrum Next that he still had all his talent.

Frankly, to have this object at home in the '80s was classy! A work of art.

And for me that's what defines the Spectrum. Its main games were not the most beautiful but had a very distinctive and enjoyable touch. A tiny machine, but with a lovely and particular spirit. Such a nice time in my memory.

