The relationship between mindset and getting old (nautil.us)
347 points by dnetesn on June 11, 2017 | 158 comments



I wonder how much the effects vary between different professions.

I'm in my 40s. Incredibly old by HN standards. And yet, I feel no nostalgia for the "good ol' times." I mean, don't get me wrong, I'm sure there are a lot of things that set me apart from newer generations -- I don't get Snapchat at all ;) -- but I don't see myself being happier by being put in a house set up to look and feel like the 90s/80s.

Is it maybe because we as programmers tend to be less prone to being stuck in the past? Just wondering.


As a 40s hacker/entrepreneur, what I miss most from the 90s is the feeling of having a vast unexplored frontier with endless possibilities ahead of us, even for the little guys. These days, with the web and mobile revolutions maturing, it feels like the 5 or so giant tech monopolies have locked up most of the future potential. But maybe that feeling is part of having an older mentality.


Computing has become so banal. We used to be working on important problems. "The best minds of my generation are thinking about how to make people click ads. That sucks." - Jeff Hammerbacher, formerly at Facebook.

(There are more people working on important problems than in the 1980s. Computer science research used to be about a hundred people each at MIT, CMU, and Stanford, with a few smaller groups elsewhere, plus internal efforts at IBM and Bell Labs. The whole field was tiny. Now, it's larger, but overshadowed by the massive level of activity associated with ads.)


>Computing has become so banal. We used to be working on important problems.

1. Blockchains/smart contracts,

2. Garbled circuits/Snarks/MPC (Multi-Party Computation)

3. IO/VBB Program obfuscation,

4. FHE (Fully Homomorphic Encryption),

5. Machine Learning/Vision,

6. Global/Solar-scale performant and secure routing protocols,

7. TEE (Trusted Execution Environments),

8. Advanced P2P systems like IPFS,

9. Bioinformatics.

...


I don't understand what you are trying to say with your answer, as the person said "we used to be working on important problems" and you responded with a list of random technologies. Technologies can sometimes be "problems", and sadly often are :/, but that means that they were failed solutions.

A list of hard problems we could be tackling:

1. The world is going to run out of fossil fuels.

2. We are destroying the human ecosystem through global warming.

3. There exists a very large amount of inequality between the upper and lower classes in our society, and the gap is only increasing.

4. We have more and more humans of whom society demands "work" to get "pay" in order to survive, even as we come up with ways of replacing more and more "jobs" with "automation".

5. There are many subsets of our population, divided along axes such as race and sexuality, which are discriminated against by others in both direct and indirect systematic ways.

6. We have a limited number of antibiotics that are generally safe for widespread usage, and pathogens are adapting.

7. For numerous and potentially diverse reasons, an increasingly large fraction of our society is being turned off of science and has stopped believing in basic things like the benefits of even our oldest and most trusted vaccinations.

8. Humans continue to die from diseases like cancer.

9. Governments and companies have begun to usher in a dystopian era of surveillance under the guise of protecting us from terrorists and spam and serving us advertisements.

A couple of these problems can be addressed with the technologies you listed, but even in the core of some of these communities that want to address problems 5, 7, and 9 you honestly just end up finding a lot of people who are exacerbating problems 2, 3, 4, and (annoyingly) 9.

I despise the cloud :(. It was just so much harder for people to abuse the crap out of us when the concept of a computer was something that, even if it could connect to other computers to get information, was not something that fundamentally relied on other computers and which stored all of its information on other computers and could be remotely controlled by other computers. We are to the point where arguing that I "own" the device on which I am typing this message almost doesn't make sense: I am borrowing it from Apple and I can only hope that they don't screw me too hard :(.


Many of those are problems, not technologies. Saying Blockchains is just a technology is like saying Operating Systems or Compilers are just a technology. They are an active area of computer science research.

Many of the problems you listed are likely only solvable by large scale social movements. Solutions to important technology problems change the landscape over which progress is made but they are not social movements themselves.

>It was just so much harder for people to abuse the crap out of us when the concept of a computer was something that, even if it could connect to other computers to get information, was not something that fundamentally relied on other computers and which stored all of its information on other computers and could be remotely controlled by other computers.

The cloud is the new word for mainframe. The PC pushed things away from mainframes, then the internet/cloud pushed back. Blockchains are an interesting space between these extremes.


I am pretty damned certain that the "important problems" being referenced here, and the ones we thought we would be able to affect in the heyday of computing, were not "we need to learn how to make a faster compiler" but "we are going to change the world". You seem to have conflated "interesting" with "important".

If anything, we agree on one point: that computers failed to solve those problems, and where we thought they would--Twitter being a great example--they quite arguably made the problem much much worse.

That is why I will argue computers feel so much more depressing today: we have been slowly coming to the realization--not just in the past few years but since at least the 50s (if not the turn of the last century)--that the entire concept of a utopian technofuture is probably a fantasy and dystopias now seem so close that we barely find the idea compelling to talk about anymore.


So why don't you start to tackle them? Problems get solved when real actual people do hard work, not when some ephemeral "they" decides it is time.


Meanwhile people have to eat.


Well, we can eat and enjoy a lot of things we enjoy because people who also "had to eat" and also "had families and mortgages" did their part, and even sacrificed their lives, for making things better.


Just nitpicking a bit about something I'm passionate about: Becoming a multi-planetary species should at least be on that list, if not first item.


To clarify for anyone who isn't convinced:

Not dying should be first on that list, for obvious reasons, and becoming multi-planetary reduces the chance of all humans dying exponentially with each planet colonized.
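To make the "exponentially" claim concrete, here is a back-of-the-envelope sketch (my own illustration; `extinction_probability` is an invented name, and it assumes each planet fails independently with the same probability, which is the strong part of the argument):

```python
# If a civilization-ending catastrophe wipes out any single planet
# with probability p, and planets fail independently, then the chance
# that ALL n planets are lost is p**n: it shrinks exponentially in n.
def extinction_probability(p: float, n: int) -> float:
    """p: chance one planet is lost; n: number of self-sufficient planets."""
    return p ** n

# With p = 0.01, a second planet drops total-extinction odds from
# 1 in 100 to 1 in 10,000.
for n in (1, 2, 3):
    print(n, extinction_probability(0.01, n))
```

Of course, independence is exactly what's debatable: a gamma-ray burst, a pathogen carried between colonies, or a hostile AI need not respect planetary boundaries.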


To satisfy your imagination, let's coerce everyone to believe we can be a multi-planetary species, never mind that we only know of one planet that supports us without extensive engineering effort.

Are we even the same species with any of the same concerns by the time we get to Alpha Centauri planet?

These notions still coming from the mouths of my generation are getting a bit bonkers to me as I age.

Let's work towards a goal we'll never be able to validate actually happens, as we'll be dead before we get close. Let's build a system that coerces people towards that goal.

Let's chew up more and more of this planet researching and building towards technologies and fucking over the next gen of humans here.

Because a generation that grew up watching Star Trek wished it would happen real bad!


What you are misunderstanding:

1. The earth will still be habitable in the short run, even with worst-case global warming x10. It will be a lot different, and probably not better, but we can still survive here.

2. For 99% of human history, people lived without the liberties or human rights that became a theme of the anti-autocratic philosophies and cultures which grew out of The Enlightenment and made the modern world possible. Alan Turing would have been tortured to death in earlier eras. These last few hundred years may very well be an anomaly, and since we know humans exhibit a tendency towards immoral, autocratic leadership, we should take advantage of the fact that we live in the best 0.1% of time in history to develop such technology. The short run may be the only foreseeable window in which to become inter-planetary.

3. Competition is what propels human progress

4. Global warming bureaucracies have the same problems the War on Poverty, the War on Drugs, and Prohibition had. All these problems could be solved by your standard intelligent young adult, but they aren't, because of bureaucratic inertia and politics. They are problems of people and a human culture of greed.

5. I don't watch Star Trek


Software is the new means of production.

While management is cluelessly guiding what software is developed, we'll never put the requisite effort into those problems.

They want return on investment, not idealism.


It's hard to be excited about that list when so much of it seems like technology that will be used against me. For example, by businesses to further reduce the freedoms we have when using proprietary software.


I'd look at research efforts that have been focusing on revising the heart & original spirit of computing. YC's HARC especially - harc.ycr.org

Bret's talk about their group's vision for computing - https://vimeo.com/115154289 - really helps people who think computing is "done" and all the interesting stuff is figured out.


Most people here are simply users of these technologies, not creators. The actual programming is still banal.


Like with all the sciences, the role of the individual has been diminished. Interesting technology may be seeded by some individual genius but it takes a huge number of people coordinating to build something like a self-driving car, a breakthrough AI, the LHC. And it takes a lot of really smart people spending most of their time doing relatively banal tasks to pull these off. I think the bar for innovation is just so much higher than it used to be, there is so much knowledge required across so many fields that you either end up a generalist, who rarely gets a chance to dive deep into any one thing, or a specialist who is forever stuck in their one area of expertise.


How many people are working on these problems?

The sad reality is that most software engineers want to work on these problems, but instead have to work on pumping out CRUD apps and proprietary APIs of questionable social value to pay the bills.


Who is working on better routing protocols? Displacing BGP seems like a monumental task so it seems like bgpsec is about as good as we are going to get. Would love to be proved wrong though.


>Who is working on better routing protocols?

Scion is a clean slate internet architecture research design (including routing)[0].

>It seems like bgpsec is about as good as we are going to get.

I was an author on a paper [1] that reduced some of the downsides of the RPKI (the PKI BGPSEC relies on).

There is also interesting work on getting most of the security from BGPSEC without complete BGPSEC deployment [2].

In a different direction, as we build the internet of the inner solar system we will need protocols with different properties than those we needed for terrestrial networks.

[0]: https://www.scion-architecture.net/

[1]: "From the Consent of the Routed: Improving the Transparency of the RPKI" http://cs-people.bu.edu/heilman/sigRPKI_full.pdf

[2]: "Jumpstarting BGP Security with Path-End Validation" http://dl.acm.org/citation.cfm?id=2934883&dl=ACM&coll=DL&CFI...


I've seen scion before but IMO "clean slate" protocols are DOA when it comes to displacing BGP. Just look at the mailing list. A protocol with an integration model of forming an overlay has no integration model. ETHZ and Co have been marketing that at all kinds of academic conferences and it's seen approximately 0 uptake outside of organizations that volunteer to run it as an experiment.

Sorry about the rant, but I get the impression that the authors of replacement protocols like these are more interested in becoming academically famous for being the inventor of the Internet (i.e. the next Vint Cerf) than in proposing solutions for existing systems.

I would be interested in new protocols for super high latency networks (i.e. The solar system model).

The RPKI work and path end validation work is interesting and I haven't seen those papers before (been disconnected from publications recently). Thanks for the links.


Bioinformatics is very often pretty basic counting and basic statistics applied to moderately large datasets and no concept of best practices.


yes...and important problems that we have barely begun to solve


The funny thing is Hammerbacher worked on trading problems which have tons of super fun CS problems even if it is a bit nihilistic making rich guys richer ... then founded Cloudera, which, I dunno, seems pretty "beige" as far as CS goes. Didn't realize he did a stint at FB. That could suck the life out of anyone.

Physics has the same problem. Upwards of 20k people in the APS... most working on very obscure problems spraying "shittonium on silicon 111" or making no progress in various theories of everything.

Still plenty of important and interesting problems in CS, some of which I've been lucky enough to work on.

Blockchain stuff is pretty useful and interesting, though it could all fall apart with some breakthrough in crypto.

Machine learning is mostly used for stupid stuff like getting people to click on ads, but there's lots of interesting use cases for it which haven't been explored yet.

There's also a lot of work to be done on the core tools used in ML; while everyone babbles about deep learning, boosting, PGMs and topological data analysis still have a lot of low hanging fruit IMO.

One that people don't think about enough: non-standard computing architectures. Quantum computing hasn't produced anything of note yet, but it's hardly the only potential area of research here. Simply using stuff like old school Harvard architectures has tremendous implications for security (no more buffer overflows, yo), but nobody bothers thinking these things through and implementing.


> most working on very obscure problems spraying "shittonium on silicon 111" or making no progress in various theories of everything.

That's the problem with foundational research: It always looks obscure and impractical until suddenly there's a huge breakthrough out of nowhere. Same dynamic as with startups, where 99% will never make a significant mark of any kind, while 1% change the world forever. And you cannot know in advance which startup (or which foundational research) falls into which category.


Surface science (shittonium on silicon 111) has been promising to explain catalysis for 40 years now... it's still important to do, makes the chips run faster, but it's usually not considered real foundational.


>Blockchain stuff is pretty useful and interesting, though it could all fall apart with some breakthrough in crypto.

At this point it seems unlikely that a breakthrough in crypto could kill blockchains without killing nearly all of modern cryptography. For instance, we can build blockchains which are secure even against quantum computers. We can build blockchains that make no number-theoretic assumptions, using only secure hash functions. Any technique which can break all techniques we have of inventing hard-to-solve problems (i.e. cryptography) would have a massive impact on technology.
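As an illustration of the hash-only point, here is a minimal sketch (my own toy example, not any production design; `block_hash`, `append_block`, and `verify` are invented names):

```python
import hashlib
import json

# A chain whose integrity rests solely on a collision-resistant hash
# function: no discrete logs, no factoring, no number theory at all.
def block_hash(block: dict) -> str:
    # Canonical serialization so the hash is deterministic.
    return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

def append_block(chain: list, data: str) -> None:
    prev = block_hash(chain[-1]) if chain else "0" * 64
    chain.append({"prev": prev, "data": data})

def verify(chain: list) -> bool:
    # Every block must commit to the hash of its predecessor.
    return all(chain[i]["prev"] == block_hash(chain[i - 1])
               for i in range(1, len(chain)))

chain = []
append_block(chain, "genesis")
append_block(chain, "alice pays bob")
assert verify(chain)
chain[0]["data"] = "tampered"  # editing any block breaks every later link
assert not verify(chain)
```

Breaking this construction means breaking SHA-256 itself, which is why a crypto result that kills such chains would ripple through nearly everything else.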

A breakthrough in crypto that would destroy blockchains is much more likely to be a technology which is significantly better than a blockchain for what we want to do with blockchains. That would also be an exciting result.


I was thinking something along the lines that hashing proofs aren't very rigorous. I mean, it all looks pretty good, and I have no idea how to break this stuff (not my department), but some weird topology guy could wake up one day and discover that hashes aren't as good as they looked.


Nonsense, we've been working on payroll registers for decades. You just remember the fun stuff.


That's possibly false: the best minds are likely working at places like Genentech (no relation to my world) and other places where building truly useful things requires deep knowledge of multiple technical domains.


SpaceX?


SpaceX is doing more or less the things that have already been done 50 years ago by truly the best minds of that generation, except they're cheaper and run as a private enterprise. They're standing way up there on the shoulders of those giants, and we should recognize that.


Making things cheap is just as important as inventing things. Nobody would care about computers if they still cost $10m and took up a whole room.


We're not talking that kind of price differential. A factor of 5x at most, with worse specific impulse than the Russian engines from the 90s. I get it, Elon is the second coming of Steve Jobs for many, and it is admirable that he's pursuing the fields that are hard, but he's not the second coming of Sergei Korolev or Wernher von Braun, or even the countless others who made the space age possible.


Well, most folk here still would. Point valid, nonetheless.


Yet nobody else had landed and reused the first stage of a rocket. They are doing truly novel things.


Didn't NASA do that in 1996?


If I understand correctly, the Space Shuttle's SRBs were recovered, refurbished, and reused starting in 1982?


Yes, though they didn't make controlled landings, they used parachutes.


Yes, all those rockets landing their first stages back in the 60's, that was quite boring and old. Oh, wait, they didn't do that at all.

> They're standing way up there on the shoulders of those giants, and we should recognize that.

We do. And they do too.

But they are innovating, and bringing down the cost of putting stuff in orbit is very important progress, no point in belittling that.


It wasn't because it was impossible. Shit, those guys landed on the Moon and Venus with nothing but slide rules. It was because government is not spending its own money, so it can afford five times the cost.


Landing on the moon is a bit different than landing here on earth.

If you wanted to say that SpaceX had done 'nothing new' then you're about a year late to the party, they are definitely innovating.



You only have to look at WiReD Magazine from those times (ca. 1993-2000) to appreciate how different the zeitgeist was during the personal computing revolution. Everything was changing and anything was possible.

Personally I think it's not because we were working on great new tech products then. I think it's because we knew we were at the forefront on a tsunami of a tech sea change of historic proportions that was obviously reinventing much of everyday life in ways we didn't yet understand, from the global public commons to the local grocery store. The question on everyone's lips was, where will it end? What's happening? Where is the world going next? And can I play?

Then from 2005 to maybe 2012 we saw an apparent renewal of tectonic forces as mobile smartphones surged forth and all of us became incessantly and globally interconnected.

But now, in today's hindsight, we've had some time to get perspective, and we're disappointed. Where was that brave new world we were promised? We've gained tweets, social and news feeds, e-books and e-tail, and handheld GPS. But we've never been more aware of how meaninglessly trivial tech's ultimate tsunami turned out to be -- small talk -- or how thoroughly manipulated we've let ourselves be by corporations whose highest calling is to sell us ever more trumped-up commodities and ever greater dependency on their latest bit-of-the-nonce drug fix.

O where have you gone, Gene Roddenberry?


Same here. The 90s were me working my butt off to get a Ph.D. which I didn't end up using, and watching my pals who worked at X, Y, Z cashing out. Some very good times were had for sure, but I don't feel nostalgic about times when I was eating ramen, working 36-hour shifts at the synchrotron, and driving a $200 car.

I do feel a bit nostalgic about the 80s; the music, the big hair, the feelings of imminent nuclear doom, and the eventual fall of the Iron Curtain: a very unique feel to that era. Granted I was a teenager for most of it, so maybe that's why.

As for age: the main thing I've noticed is my hairline and my joints aren't what they used to be. Exercise, eating right, and occasional fasting seem to stave off most of the physiological effects of aging. I'm pretty sure working in a creative field (and being around a lot of younger people) helps a lot with the psychological aspects.


The feelings of imminent nuclear doom were the best. Really made the pop culture special.


(Another 40's hacker/entrepreneur here)

Something I try to continuously tell myself is that "It's still early days" because while it doesn't feel like it - I think it is.

Look at the massive shifts we've seen in social media use (Friendster > MySpace > Facebook + Twitter > Snap) - at near any point in their usage you could have reasonably said: "I don't see how anyone could overthrow one of these - they're too big"

There is so much future still to invent.


IMO, each generation faces a different landscape of software, hardware infrastructure, and end devices. The existing social network and communication solutions can't meet the delta between this generation's needs and the previous one's, hence the big unmet needs of the emerging generation. However, solution builders have to be close enough to this generation to have the sense/instinct for that delta, so they can build products that actually cover it.

Just my observation, and probably not described accurately and clearly enough here. But I hope you get what I'm trying to say.


Search went through many shifts, but hasn't since Google. When a new technology matures, someone eventually figures out how to dominate it, and does.

The difference with computing was ongoing Moore's law (forever young), which is now slowing/ending. Worse, though chips still shrink, the cheapest node remains 28nm. Cheapness fuels revolution. Peak silicon has passed.

We could get a new technology (as fracking did for peak oil), as Kurzweil suggests for the singularity.


>When a new technology matures

I don't believe search is mature. We haven't seen a lot of innovations in web search for a while and what we know as web search is probably the best Google can do. It's probably not the best possible web search though. Just the best from Google.

The reason nobody is there to challenge Google is that no one has come up with a way to fund a web search engine other than through ads. Challenging Google probably also means challenging the whole ad industry. It's a rough climb.


Also 40s hacker/entrepreneur here. Your complaint is like a 40-year old in the 90s complaining that IBM, Microsoft, Intel and the like have locked up most of the future potential of computing.

The internet titans of today will be replaced or be forced to change as the next wave of innovation takes over.

There may be a lull right now, but either AI or augmented reality looks like a good candidate for the next cycle.


I see your point, but I also think that in the 90s the computing world was more open and accessible. I guess that happens to most maturing industries. The market players become bigger and start dominating. Around 1900 a guy in a garage had a reasonable shot at building a car (or an airplane) and starting a business. Not possible anymore. You need millions to even get started.


I was a teenager in the 90s, and the consensus among most of the adults I knew working in tech was that it was over. Microsoft had won. Programming was a commodity, all the tech jobs would be outsourced to India, and you were much better off getting into finance.

Remember that towards the end of the 90s, Netscape - the one promising new startup that threatened to break the Windows monopoly - had just had its ass handed to it by IE, with some spectacularly underhanded tactics. All the dot-coms had required hundreds of millions in funding and were rapidly blowing through them, with no sign of a profitable business model.

Hope didn't really return to techies until the mid-2000s, when a bunch of events (Google's IPO, acquisitions of Flickr & del.icio.us, development of Rails/Django/jQuery/MySQL, and founding of Facebook, Reddit, and YCombinator) made people realize "Hey, maybe it's not the end of history after all. People have actually been doing pretty cool stuff all this time when we weren't looking."

2015 seems like it was the new 2000. That'd put us around 2002-2003 right now, complete with the huge focus on P2P, security, and government overreach, and with a couple years to go before we realize that not everybody gave up.


Interesting perspective. In the 90s I worked on handheld devices like the Apple Newton, and that was a lot of fun; there was a lot of new and exciting stuff happening in that area. It seemed a much smaller world where you could keep track of what was going on.


Yeah, there was tons of interesting stuff happening in R&D in the 90s. The problem was that Microsoft held a complete stranglehold on distribution, which meant that for the average consumer (i.e. teenage me, not in Silicon Valley, and everyone I knew), it might as well not exist. I remember, when I first started to read academic papers and surf programming sites in college around 2003, thinking "Holy shit, there's all this cool stuff that was invented over a decade ago, and I can't use any of it unless I switch to UNIX."

What happened in the mid-2000s was that open-source software (and later Apple) finally broke Microsoft's hold on the route to the consumer, and suddenly small teams of people could deliver software that billions of people could use. There's a similar situation in R&D now; I remember that when I left Google 3 years ago, there was a whole lot of really cool stuff being developed that has yet to see the light of day. No corporation can hold innovation back forever though, because of Joy's Law: "No matter who you are, the majority of smart people do not work for you."


But that wave of innovation was fueled by Moore's Law... now deceased.

Mature industries, absent disruptive technology changes, tend to consolidate and be dominated.


I have a somewhat contrarian prediction for Moore's Law. I think the next wave or two of innovation will obliterate the current progress of Moore's Law.

Yes, Moore's law may seem to be slowing down right now, but all you need to do is look at the size, energy consumption, and ability of a human brain to realize that nature has blindly created a much more advanced "technology" than what we currently have.

Next, look at flying. Birds are nature's most advanced flying machines. We've been able to expand on that to the point where we have jets and even reusable rockets that can go a thousand times faster than the fastest bird.

Right now, computing is at the "man puts big flaps on his arms and jumps off a high ledge" point. Once we figure out the "first airplane", a.k.a. the Wright brothers moment, Moore's law will explode.


It means it's time to start looking for the next revolution.

Before the web, there was the PC revolution. And...microcomputers before that, I think?

It's true, computing technologies mature and there are fewer opportunities for real breakthroughs. But then new technologies arrive and the cycle starts over again.


That's how biotech, in particular neural technologies, feels to me right now. In reality, it's still such early days there that it's probably closer to computers in the late 70s than the web in the 90s, but I think the spirit is similar.


I would say the opposite. You used to have to do everything yourself, but now you can leverage the infrastructure of amazon/google/microsoft/... to do amazing things by yourself. The potential of ai is just starting to get unlocked, and we're going to see lone wolf developers do incredible things in the coming years.


40s hacker here too. I felt that way until I discovered VR. I know it's not popular on HN and many are skeptical, but man, I am loving it again. So done with stupid screen apps :)


That's how I feel about AR and HoloLens; just picked up a dev kit. Hit me up if you want to collaborate on a holo app :)


I think we have endless possibilities but it seems nobody has time, the necessary resources, wisdom, and/or courage to try new paths that would compete with those 5 giants.


Those frontiers still exist. They've just moved to other technologies now. 3D printing, VR/AR, robotics, blockchains, IoT, etc.


I think the thing that really did it for me was seeing Windows 93 on the front page today. So much possibility and creativity lost.


As a counter point, perhaps, if that were true we should expect to see much less VC money being directed at tech startups.


Well, we're already seeing VCs diversifying into other sectors. Also, I'm inclined to see the huge amount of VC money as an indication for an overall shortage of investment opportunities.


Before the web and mobile, people thought programming was done, e.g. locked up on the desktop by Microsoft. There's a vast unexplored space in AI/autonomous/AR/VR/sensor networks/IoT/smart things, and I'm sure I'm missing something.


I'm 40 and I feel this way now about cryptocurrencies


On the contrary, I see at least social as being open for disruption. FB did displace MySpace, after all.

Google, Apple, and Microsoft, though, seem pretty safe.


I think Apple jumped the shark quite a while ago. I would love to ditch my Macs, but the problem is that as much as Macs suck, everything else sucks more. If someone built something that Just Worked the way Macs did back in the Snow Leopard days I would replace all of my Macs immediately. And I think this is possible starting with Linux as a base. Linux has come a long, long way since 1993.

I just bought a couple of Raspberry Pis and I get the same exciting this-is-the-start-of-something-really-cool vibe from them as I did from my first Apple II back in 1980.


I jumped ship in snow leopard days when I saw what they were doing with their shared object formats (basically making my life difficult for no reason).

I've used Mint since then; never looked back. Mint on an old X220 thinkpad is better than anything mac has done since then. Requires a bit of overhead on setup to get power management right, but I paid that price years ago and can endlessly clone HD images.

Bonus is, most of my work product ends up on the EC2 anyway.


My problem is that I need to figure out a solution that works not just for me but also for my very-non-techie spouse. Mint might work for me, but I'm pretty sure it won't for her.


Mint+KDE is pretty good. Sadly the akonadi junk needs to be sabotaged.


I think iOS 11 is an example of Apple realizing that they need to take a step back and focus on the "it just works" aspect of their mission. There are dozens of small fixes that I've wanted for years that they've finally fixed, all in one release. Haven't had the chance to look at High Sierra, but I'm hoping it's the same there.


All of their products used to behave much more synergistically. There was a real feel when you bought an iPod and hooked it up to your iMac that the two popped together. Now, an iPad barely ever needs to even touch a Mac or even PC. The technology is at the point where it ought to be possible to offload display/UI tasks to an iPad - but you can't in an integrated way. They really need to tie their whole product lines together again (through synergistic features).


Not iPad as much as iPhone, but do you use Handoff? I do all the time, between work and home MBP, my phone and watch. For iMessage, web pages, emails, phone calls, etc.

I feel like the only thing we've actually lost was the requirement to use iTunes to set up a new iDevice, and good riddance to that. An iPod really was a peripheral whereas their modern successors are fully-fledged mobile computers.

If you mean use an iPad as an external display, that's a rather niche use. Is there a reason that the 3rd party app Duet Display doesn't meet your needs?


I'm 33, young-ish but I still have some nostalgia for the early days of the internet. I miss the lightness and innocence, when viruses had silly graphics and at worst screwed up one computer. When the biggest internet companies were still frivolous and playful and inconsequential.

Today tech has gotten heavy. Hackers work for organized crime and espionage, ruin lives, and cost the world countless billions. Tor and Bitcoin would have felt badass back in the day. Now they are used to dodge totalitarian regimes and to run black markets that are spreading untraceable opiates throughout middle America. And the iconic internet companies now shape the world, for better or worse.

Tech has arrived and I wouldn't roll back the clock, but I am sad that there's so little lightness left.


I'm considerably older, and I too have no nostalgia for the past, but I certainly have a broader view on many tech topics than my younger colleagues. A couple of days ago I was explaining to a couple how at one time the line printer was the primary output for users, no monitor of any kind. This made separate characters for line feed and carriage return important, in that you could program with them to create better (but still very primitive) graphics.
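As a concrete illustration (my own sketch, not from the comment above, and the function name is made up): because carriage return and line feed were separate control characters, you could send a CR alone to rewind the print head without advancing the paper, and a second pass would strike over the first — that's how underlining and crude shading were done.

```python
# Sketch of line-printer overstriking. On a line printer, CR ('\r') rewound
# the print head to column 0 WITHOUT advancing the paper, so characters sent
# after it struck over the previous pass on the same physical line.

def overstrike(first_pass, second_pass):
    """Build the character stream sent to the printer: pass one, CR, pass two."""
    return first_pass + "\r" + second_pass

# Underlining: strike the text, rewind, strike underscores over it.
line = overstrike("HELLO", "_____")
print(repr(line))  # the printer receives 'HELLO\r_____' and prints an underlined HELLO
```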

But today is certainly the golden age of software engineering. I'm still excited about many things, and learned about a significant new area of interest to me just a couple weeks ago.


I'm turning 30 this year. Not old at all, but just old enough to not be young.

I don't feel nostalgia, what I do feel is envy for the younger. Especially those younger than me that I feel are more successful than I. Not in the "Ugh, fuckers" way, but in the "God damn it if I knew then what I know now or at least listened to the people who did know, I could be that ultra successful young hotshot! I had all the opportunities and I didn't take them for various reasons. Grargh!"

Not sure if that's what feeling old feels like but I certainly feel a level of "Poop, there's def less time ahead of me for all the shit I want(ed) to do than there used to be"


>>Not in the "Ugh, fuckers" way, but in the "God damn it if I knew then what I know now or at least listened to the people who did know, I could be that ultra successful young hotshot! I had all the opportunities and I didn't take them for various reasons. Grargh!"

Eh, the types of success we envy are mostly a matter of luck. Being born to a family with means and connections, being good-looking, being at the right place at the right time, etc.

Obviously if you actually knew back then what you know today, you would have bought lots of bitcoin and Tesla stock. But in terms of "soft" knowledge, I'd say you would probably end up in a similar place.


Maybe you're not old enough?

When you're 80 you may miss the heyday of your 40s when you were at the peak of your career (and possibly family life).

There are probably a lot of 30-somethings for which their teenage high-school years were the best years of their lives (I know several, I'm 34 myself and prefer my present to my past), so the right age for nostalgia is probably a very individual thing.


I think for people who don't stagnate and continue to progress throughout their lives, nostalgia will be limited. People who peaked in High School (for example) and stagnated in life afterwards will probably tend to look back on their past years with fondness.

I see this with people in the town that I grew up in who couldn't get out. After High School, they largely languished in the same town and haven't really done much with their lives since. I see them talking about past years and High School with far more regularity than the more successful people I know. The successful people on the other hand - they bring up the future more than they bring up the past. That's probably because they have something to look forward to on the trajectory they're on - it's a mental state, a mindset.

Now this is a very simplistic version of my observations but it's what I've seen so far. I'd love for people to bring in some kind of science to this theme that I've observed. My observations could certainly be isolated and wrong but from conversations I've had with other friends from similar smaller towns, they've reported observing similar things.


Your observations strike me as all too accurate due to my own anecdotal experience. I've tasted both sides of the coin. After leaving a phase of torpor that lasted years and started taking life into my own hands, my mindset completely changed to the point where my past self feels almost alien. The effect nostalgia had on my daily life essentially went from a driving force to a non-factor.


I would still add the disclaimer that you may simply not be old enough, after recent experiences with my elderly and currently very sick father.

When you get older your body wears out, even if you take care of it. You may not notice it as an active 40 year old but you definitely will as a 70 year old.

Other things like culture and values change around you and you can't always keep up, causing the feeling that "the world is going to hell". Maybe your field was doing great 20-30 years ago and is now on a downward trajectory (again you can't always plan for that decades in advance).

It doesn't have to be that way but getting old at some point does suck for most people, some earlier than others.


You're right, of course. The idea of slowly degenerating and becoming irrelevant like this has filled me with a sense of dread for a long time, although I've largely accepted the inevitable now.

I would say I'm attempting to get at least one decade of vaguely defined "intense" living. If I can get that it would already be more than what a lot of people get!


As a developer who is almost 60, and the father of two budding developers, I can't imagine a better time to EVER have lived.

The technological landscape is amazing, new things in tech, biotech, materials, and just human understanding of ourselves.

I don't feel old because I am excited about everyday and what adventures it will bring. I can still beat the twenty year olds in speed chess, though my twitch reactions are not fast enough to keep up in the modern FPS's. The trick is to challenge yourself everyday in mental, physical, and spiritual ways. Having great kids helps with that!


> "I can't imagine a better time to EVER have lived."

I dunno. The other day, my wife and our 5 children and I were all having a relaxing evening together as a family in the living room, and something my 4 year old daughter said accidentally set off Siri who said "sorry I didn't catch that" and my daughter said "never mind" and we all had a good laugh. Then my daughter said "I love you, Siri." My heart sank. I'm probably not giving enough context to explain it, but she seemed to genuinely think Siri was a real (and neglected) person in the family.


And in a few years Siri will probably be a sibling intelligence. I do believe our new computer overlords will be kind to us ;-)


Same here, same age range. I miss nothing about the 80s or 90s except not investing in the right skills or stocks earlier. Today is full of far more potential and choice, imo.


I'm about that age as well. I had a pretty rough entry into adulthood thanks to my school peers. Being a "geek" was tough in those decades. I harbor no nostalgia for that hot mess.

So there's one more reason a lot of older folks on here might not be quite as stuck on the past as most.

Watching my son sling 'duinos and rock NodeJS on his raspis and not get bullied for it is sweet. The golden age is now.


Yes! It's true for my kids too, although I'm guessing mine are a tad younger.

It's interesting to try to put a finger on when computers became "cool". I suspect it's only with the rise of the smartphone, as even video games were seen as a tad nerdy in the mid-2000s.


I'm only nostalgic for retro-technology because it's what got me into the field. I might boot up a copy of SIMH and run a DEC PDP-11 image of RSTS/E or the like just to see it again and wonder at the fact I have control over a simulation of a machine I lusted over and learned so much from. That usually lasts about 1/2 an hour and then I'm on to learning something new. The novelty and innovation is what makes this industry so much fun. I can't imagine ever growing tired of it.

In terms of "good old times" I just lament the fact kids today don't have the unstructured freedom to explore the physical world like we did. People today (from my perspective) have a very different perception of risk than older folks.


I agree with your last point--it's a real shame my kid will never have the experience of getting on her bicycle at 7AM, exploring a 10-20 mile radius all day without a cell phone or GPS, and being trusted to return by sundown. I'd have no objection to allowing it, but every nosy ninny in the area would call 911 at the mere sight of an unaccompanied kid roaming the neighborhood.


I implore you to try the Microsoft Hololens. It changed my life, AR is the future!


No, it's you. I'm 43 and while I don't see myself as particularly nostalgic either, I have plenty of colleagues who are. I have no idea how we compare to general population.


I'm 55 and not nostalgic at all. Weirdly, I can hardly stand to listen to music of my teenage years because it puts me in a pretty sour mood -- and I was quite successful and happy as a teenager and in my early 20's.

Just yesterday I competed in the open division of a regional racquetball tournament, and in a month and a half I'll compete in the open division at a regional powerlifting meet. This is not a humble-brag as much as a contrast to the vast majority of the folks my age with whom I hang out.

Moreover, I'm heavily invested in growing intellectually which seems a contrast to many of my peers.

These folks have largely, though not all of them, adopted the "I'm too old" mentality to try to compete or grow in athletics or in intellectual pursuits.

The point is that this article, I believe, tends to comport with reality: many people begin to accept an easing of their personal standards and drive for growth because they feel they're "too old" to continue to push and compete.


At my first job, my boss, who was in his seventies at the time, told me that once people get to a certain age, societal expectations of sharpness decrease rapidly. According to him, this led to many people eagerly taking up the mindset you describe.

Maybe your peers never really had the personal standards you thought they had in the first place, and simply stopped trying once the peer pressure eased up and they got a good excuse to lay around?

This type of pressure is often met with scorn as something we are too good for. In my case however, it really helped me become a better version of myself over time.


I'm 55. I'm nostalgic, but also forward seeking. There were certainly things about the 70's and 90's that were better than now. Not sure I'd want to go back though. I'd rather change the future...


The best thing about 15 years ago was that my body was 15 years younger. People missing their youth probably has more to do with that than with missing any particular cultural artifact of the time.


For me it's: I miss my youth and health, and I'd love to be able to go back just so I could have the time and energy to work on the things I'd like to work on now.

I am still making things, it's just more difficult now.

But video games are a hell of a lot better now than they were back then. Legend of Zelda: Breath of the Wild? The Witness? Life is Strange? Persona 5? So, so good. And there's nothing stopping me from playing the old games when I feel like it (except having the time).

Also I wouldn't mind having another replay of my high school and college years with the confidence that I have now. A lot of my experiences with women would have gone a lot more positively if my attitude and approach had been what they are now (well, what they've been in the past five years; I'm in a solid relationship now). I wouldn't want to replay the sitting around a classroom or schoolwork part of it though.


Your outlook will change in your 50s for sure, things will get punishingly clearer then.


I suspect the reason they felt better and more vital is that the change of mindset and environment altered their biochemistry.

How we feel and what we think of ourselves affects our levels of Testosterone, Cortisol, Serotonin, etc. Even a 5 minute conversation can give you a T boost of 30%+ ... or believing that you're perceived as high status alters your Serotonin. Those hormones in turn make you more vital.

So who knows what was the reason... maybe more social interaction with strangers? Or simply putting their mind into a different, better place?

http://www.ulm.edu/~palmer/TheBiochemistryofStatusandtheFunc...

http://www.cep.ucsb.edu/topics/courtship/roney%20et%20al_200...


> Even a 5 minute conversation can give you a T boost of 30%

How is it different from taking drugs, really?


The boost in testosterone may be one of many naturally occurring reactions that interact with each other. Conversely, taking a testosterone injection is isolated, so it may have different effects on your body.


Not sure I understand what you mean. What kind of drugs?


What is the difference between taking artificial mood altering-drugs versus engaging in behaviours that result in the production of "natural" mood-altering drugs?


By this literal definition, the entirety of our lives could be described as simply "taking drugs". We chase the high by doing various things.

The difference between taking drugs and doing these things is that the reward mechanism is the product of many years of evolution, whereas drugs can sometimes produce harmful side effects due to not being fully fine tuned to our own bodies' expectations.

The distinction would go away entirely if drugs were more sophisticated. But in that case it would pretty much be the end of life as we've hitherto experienced it.


I don't think you're quite correct.

I think it's more that experiences are "full-spectrum" - they occur across multiple modes - while (most) drugs are single-spectrum. Which makes sense; most drugs are only trying to affect one thing - that's the goal.

If I jump out of a plane, there's a lot more going on than the adrenaline spike. If I take an EpiPen, there's not much more going on than the adrenaline spike.


I think you're right. Which is actually better news since now the "solution" boils down to "Be mindful of your environment." That is, try to avoid stress, bullshit people, bad food, bad stimuli (i.e., yesterday's TV is not today's mass+social media), etc.


Well, there's way more to it... even our body posture affects our biochemistry and well being... there's this great video by Amy Cuddy:

https://www.ted.com/talks/amy_cuddy_your_body_language_shape...

Today, because of electronics we're forced into low power body postures that increase cortisol and lower testosterone levels, evidence suggests.

Then there's nutrition, sleep etc.

A couple days ago there was a study posted here showing that people who exercise intensely are about 9 years biologically younger. We don't exactly know how exercise makes you younger but we do know it boosts your testosterone.

Btw. a great book on the topic: The Winner Effect by Ian Robertson (founding Director of Trinity College Institute of Neuroscience), or Presence by Amy Cuddy.


Cuddy's work on "power poses" has failed to replicate in larger and more rigorous studies [1] [2], and one of the earlier researchers of such effects considers them to have been an artifact of poor study design and analysis rather than a real effect [3].

[1] http://journals.sagepub.com/doi/10.1177/0956797614553946

[2] http://journals.sagepub.com/doi/10.1177/1948550616652209

[3] http://faculty.haas.berkeley.edu/dana_carney/pdf_My%20positi...


More bad science? Say it isn't so :) A couple weeks ago the BBC did an article on the fact that most "scientific" studies can't be reproduced by others. Makes ya wonder...


Wasn't it the social sciences that couldn't be replicated?

Natural science wouldn't be able to advance unless studies could be replicated and subsequently improved on.


Not sure. Might be. But the gist was, lots of stuff gets published and is flawed. Maybe it's just me but too often it seems that correlation gets passed off as cause.


Great to know, thank you!


Yes. The point is, mitigate the "risks" of modern life and the unintended consequences (if you will) lessen.

The problem is, we want to do everything all wrong, and then wonder why we're getting the results we're getting, or not.

p.s. Thanks for the book recs, but if you had to pick just one, which one?


The Winner Effect... it's an excellent book.


It would be interesting to see how much of this is truly biological and how much of it is due to societal and situational conditioning.

There were lots of things I could do in my 20s (e.g. refuse to use gasoline-powered city transportation, refuse to patronize places that used disposable cutlery, refuse to use non-free software, etc.) that I can't do when I'm in my 30s because people around me would think I'm a stubborn idiot, jeopardizing my career at a point where I have not yet established myself. It's very easy to tell a colleague, advisor, anyone at school that you're going to bike to the destination or take electric-powered transit [because you don't believe in a fossil fuel future]. It's very difficult to say the same thing to an investor, co-founder, employee, customer, or whoever is offering you a ride in their car, without feeling like an ass. I'm basically forced to be "normal" during work times and fit into the mould of society. I can only be myself on evenings and weekends.

I can only imagine how much more "being normal" I need to do if I had kids, pets, tenants, or whatever. I don't have any of those at the moment. The other night I was pondering over potential improvements to our music and mathematical notation systems while staring at the Milky Way. (I didn't come to anything conclusive, but I love thinking outside the boxes that society defines for us.)

10 years ago, I could truly be myself 24 hours a day. I was basically learning all kinds of things about the world by doing that. Now, I only get about 5 hours a day to be myself. The rest of the time, I need to conform. The lack of "me" time itself may contribute to some degree of mental rot/aging, apart from the biological component.


All the examples you mention are on the "futile" side of actions to take though. Like, if someone is driving a fossil car anyway, sharing a ride is very eco-friendly. There is very little extra CO2 emitted by you getting into the car.

However, you can buy an electric car yourself (and share rides with it). No one would think you an ass for that. For those with a house, no one thinks you are an ass for installing solar panels on the roof. Same with not taking a plane on your next vacation; at least no one at work should get upset by that (and friends/family can hopefully more easily understand it).

On the disposable plastics side, going for the vegetarian option with disposable cutlery is orders of magnitude more eco-friendly than meat with reusable cutlery. And vegetarianism is somewhat socially accepted. While on the topic of symbolic actions, framing it as a "50% meat reduction," which has an easier chance of spreading socially, may do more good in this world than being strict about it -- better to have some influence and have 4 people reduce 50% than 1 person reduce 100%. Again, the option that rubs people less socially may be the rational optimum on the whole.

I.e., "conforming" doesn't mean you have to hide your concern for the environment; it just means it is better to focus on rational actions that actually make a difference (and that others can easily understand), rather than symbolic statements that have little actual effect. The latter is what makes one feel like an ass.


Oh sure. There are merits to all the things you are saying. There are also arguments against it (I don't believe in the "The car is going to be driving anyway, therefore you should use it" argument -- that's almost like saying "Trump is going to be elected, might as well vote for him anyway". You should vote for what you want.) But that's beside the point. And I am vegetarian by the way ;)

I'm not trying to argue the merits of one choice or another. I'm only saying that it's harder to make "wild and crazy" choices and follow through with them when you're stuck in this thing called society and aren't "the boss" in some capacity. You are part of various teams, which demand conformity in various levels. But to me it's precisely the "wild and crazy" lifestyle that taught me a LOT in my 20s.

Forget environmental topics for a second as it's too easy to get into a debate. Another example is that I questioned why we have a 7-day week. I worked a 10-day week for some time -- 7 days of work and a 3-day weekend. 3-day weekends enabled lots of fun trips that aren't possible in 2-day weekends, and 7 days of work is a nice productive sprint. It worked great when I could be "me". But now I need to go back to societal norms. With responsibilities come aging.


I don't mean to downplay what you are doing here but I have an observation: What you are doing will not change how things work because people will not change their ways.

Sure, if everyone just did like you, we'd be in heaven right now. But my observations across first-, second-, and third-world countries have shown me that people are practically the same and will go the easy route. The easy route being whatever choice is given to them by their masters (parents, boss, government, etc...). If the choice is "good enough" they'll not complain and protest, and will continue to eat whatever "stuff" you feed them.

So does this mean you should give up? No, the opposite. You could, instead of trying small restrictions to your lifestyle, make big changes. Examples include Tesla. If you care about a safe and clean life you should try to work at some of these companies and push them forward.


>It's very difficult to say the same thing to an investor, co-founder, employee, customer, or whoever is offering you a ride in their car, without feeling like an ass.

I would argue that being honest and explaining your true feelings and position on a certain subject to a person that you wish to hold a close relationship with is the best route imho.

They will respect you more for being honest, even if they disagree with you.

It's really not that weird to explain to somebody that you'd rather not trash the planet if you can avoid it when you think about it.

>I need to conform.

I sent a Ron Paul "It's Happening" GIF to my (rather staunch) clients the other day and they absolutely loved it. Which made me chuckle.


> It's really not that weird to explain to somebody that you'd rather not trash the planet if you can avoid it when you think about it.

I've tried. It's really, really, really hard. And the last thing you want is to make them feel like they have trashed the planet and you haven't. You'd rather they feel good about themselves (even if that means slightly trashing the planet yourself or whatever) so that they become your {customer, investor, business partner, employee, ...} and it usually ends up being that I have to sacrifice being myself in return for creating a business relationship as an investment for future success.

In school I would ditch organizations that regularly ordered food in disposable containers, and join others that I thought were less wasteful. I would actively choose to interact less with people who were too insistent on giving gasoline-based car rides instead of just respecting my wishes to use human-powered or electric transportation where available. I can't do that in industry without losing my reputation. Unless I'm Elon Musk, which I'm not.

I don't use daylight savings time (because I think it's stupid) and I don't use timezones (I do everything in UTC). I support metricization of the US, and use Celsius, kilometers, and kilograms when talking to people in my own time to cast my vote with my mouth. I type Dvorak, because I support things that I think are better. I constantly experiment with new frameworks of thought (e.g. a world where countries do not exist, and governments exert power over overlapping open sets of a topological space) and try to think and live inside these alternate worlds just to see what happens, what kind of contradictions come up, and so forth. In high school I read many books, including Shakespeare plays, with the book turned upside down; I was about twice as slow, but I found myself using my brain differently, and I do learn a lot from doing experiments on myself like this. I can do these things with friends. I can't say that to a customer. Or an investor. Or an employee. When facing a customer, miles and pounds, and QWERTY it is. Pacific Time with daylight savings, and countries it is. Time is money. And thus, I become more and more "normal", and thus begins the aging and rotting of (at least) the creative half of the brain.

C'est la vie.


It's an effect probably at least as real as ESP: https://slate.com/health-and-science/2017/06/daryl-bem-prove...

Which is to say, I'm dubious as hell of this result: for something this click-baity, at this point in the history of psychology research, I'mma need some serious replication before I give it an ounce of belief.


In my 50s. Not exactly pickled in nostalgia.

I think the computing party is just getting started. Non-trivial domestic AI will be here within a couple of years, personal robotics 5-10 years after that.

The current ad mania sucks, but it's going to have to evolve or die.

I don't miss much of the past. Pocket phone computers, tablets, GPS, video calling, massive data storage, and the potential of renewables and distributed energy grids are all awesome. Like.

Even social has its moments.

The real problems are cultural and political. There's been some movement there, but not nearly enough. The system has nearly enough energy to go through a phase change soon, and that's when things will get really interesting.


AI; "nearly enough energy to go through a phase change soon"

I like the intriguing idea, that once we figure out AI, we could do some very powerful things even on today's low-end phones (10's of GFLOPS).

OTOH, the recent "deep learning" breakthroughs have come from throwing resources at NNs. And it might be that we create AI long before we understand it and are able to do it efficiently.


I wonder if nostalgia is a human mind's hack to slow down aging.


I look forward to living to 100. But, 80 would be even better if only I could regain a 12-year-old's sense of the passage of time.


You can emulate a poor man's version of this by continuously trying new and challenging things.


According to my AncestryDNA test results, I don't have the centennial gene, so I doubt I'll make it past 90.


Kinda like a placebo effect, yes. It would be interesting to take a group of slightly younger test subjects and see what happens to them when they live with older people in the present.

Moi? The body and mind are both subject to: use it or lose it. We also, as humans, tend to assimilate into the norm around us, be it smoking, obesity, and now I guess perhaps youth.

Finally, I have to wonder about the effects of essentially being on holiday. In addition, perhaps the group discussions energized them? That is instead of waiting to die, they had more reason to live? In any case, interesting.


http://www.sens.org -- after reading the article, still by far the most scientific and fully developed approach that I have seen.


Age may mean certain things about DNA methylation, but it doesn't mean you can't continue inventing, challenging yourself, and taking chances -- e.g., the 94-year-old co-inventor of lithium batteries recently co-invented a solid-state (solid-glass electrolyte) battery.

http://www.canadianmanufacturing.com/technology/94-year-old-...


Has anyone reproduced the results from this study of 49 people back in 1979?


Whatever your past, it is irrelevant now and the future is the only way forward, so smile and enjoy your ride together with the people you love (mid-40s here and still pushing, ehehe).


In a way neither the past nor the future exist. There is only the present.


While I certainly agree that western people live too much in their heads, ruminating on the past or worrying about the future, the past and the future certainly do 'exist', in that the effects of actions in the past become the cause of effects in the present or future.


Physical activity is the likely mediating mechanism between acting younger and gaining modest benefits by some measures. Since the development of lightweight accelerometers, studies of physical activity have demonstrated strong correlations between even modest activity of the housework/gardening variety and health in old age. There is a mountain of further research demonstrating the benefits of increased moderate exercise and lesser forms of activity in older people.

But ultimately the end is the same. You can't reliably exercise your way to 90, even. The majority of people who are exceptionally fit die before reaching that milepost in the environment of the last 90 years of medical technology. The future of health and longevity in later life will be increasingly determined by medical technology, and nothing else. Aging is damage, and that damage can be repaired given suitable biotechnologies to do so.

DNA methylation patterns correlating strongly with age are a very promising tool when it comes to assessing treatments for the processes of aging. Companies offer various implementations now - see Osiris Green for a cheaper example, to pick one. In the SENS view of aging as accumulated molecular damage, epigenetic changes are a reaction to that damage; a secondary or later process in aging. We'll find out over the next few years how the rejuvenation therapy of senescent cell clearance does against this measure, now that things are moving along there.

But you shouldn't think it impossible to construct useful metrics of biological age more simply. There are a number of excellent papers from the past few years in which researchers assemble weighted algorithms using bloodwork, grip strength, and other simple tests as a basis into something that nears the level of discrimination of the epigenetic clock.

When it comes to a biomarker of aging, there are lots of promising candidates. Researchers will spend a lot of time arguing before they come to any sort of pseudo-standard for that task. Industry (today meaning the companies developing senolytic therapies for the clinic) will overtake them and, I'd wager, adopt one of the epigenetic clocks because it basically works well enough to get along with, and can be cheap in some forms.


Not to be "that guy", but I will credit strength training with helping my father make it to age 92. What we found was that it increased his balance and stability. Of course muscles use the skeleton as levers, and there are studies showing that training will slow the reduction in bone density as you age. So Dad's training likely prevented him from breaking bones when he did take a tumble.

This fellow is involved with training those over 50.

https://www.youtube.com/watch?v=5DDGOXkpZxI


A man is as old as the woman he feels (G. Marx)


The experiment that kicks off the piece looks pretty replication-crisis worthy.


It's official, my boss really is feeding off my youth.


Some of the article was okay (although you can cherry-pick a lot to achieve a conclusion) but Langer's study in particular seemed very dubious. They "looked younger"? Stop - that's just way too objective for me!


>They "looked younger"? Stop - that's just way too objective for me!

What part of this kind of process:

they showed before and after pictures of them to third parties who didn't know them --nor did they know the order the pictures were taken-- and the majority identified the latter as younger looking

seems "too objective"?


Seems an odd takeaway after "memory, vision, hearing, and even physical strength had improved"


Considering the subject of the article, it's interesting that that was the most noteworthy takeaway for you.


> getting old

'aging' is the word you are searching for


Although they technically mean the same thing, the feelings they evoke are quite different


what is the difference?


One syllable.


I don't know, I think they mean the same thing, but "getting old" has the connotation that it's towards the end of your life. That is, a baby is constantly aging into a slightly older baby, but it's not getting old yet. A thirty-year-old ages into a thirty-one-year-old, but unless you're a teenager you wouldn't generally say they're getting old yet. But someone who's retiring and moving to Florida? They're getting old.

Of course you might also say "you're getting old!" to a fourteen-year-old cousin that you haven't seen in a few years. It depends on context. But I don't think they literally contain the same information except for syllables.

So I think jldugger's pedantry is misguided in this case.


getting old is binary, aging is continuous.


Constant change does harm. That's my takeaway.


Curious how you got that out of the article? I didn't see anything in it that seemed to suggest as such.


They seem to have been stuck in a year. I only briefly skimmed it though.



