The man who brought us the lithium-ion battery has an idea for a new one (qz.com)
229 points by mocy on Feb 5, 2015 | 86 comments


I remember looking at his work when I was a grad student. Some of it was amazing. For example, I remember once he set up an apparatus to grow crystals using chemical vapor transport. To know when it was finished, he sealed wires into the quartz tube (as an open circuit) and connected them to a battery and a light bulb. When the crystal (which was conducting) grew large enough to bridge the leads, the light bulb would turn on. It was elegant and simple. (Usually this type of synthesis happens inside a furnace where you can't see the progress and just have to guess when to take it out.) We still refer to the "Goodenough-Kanamori" rules of thumb when we're trying to guess heuristically how an oxide will order magnetically. He's had a truly impressive career...


Hopefully his career will continue and he'll come up with another breakthrough battery.


tl;dr the path he has chosen involves one of the toughest problems in battery science, which is how to make an anode out of pure lithium or sodium metal. If it can be done, the resulting battery would have 60% more energy than current lithium-ion cells. That would instantly catapult electric cars into a new head-to-head race with combustion. Over the years, numerous scientists have tried and failed—it was lithium metal, for instance, that kept setting Stan Whittingham’s lab on fire at Exxon in the 1970s.

It's a very interesting profile/historical review of this great scientist and his work, but if you were curious about the actual headline, the lede was buried almost at the end.


Yeah, the headline was almost an afterthought. I guess he isn't really talking and they needed something to give people a reason to click and read. Gave me enough reason to read about him though, and I definitely don't regret it.


What I'm worried about is the waste of lithium and other battery/accumulator metals. In theory, people should turn in electronic devices and batteries at recycling facilities, but a large part of the population ignores the rules and throws their gadgets into the trash when they break...

I wonder if someday we will find a way to "separate" trash at the atomic level (i.e. put arbitrary stuff in on one side, get raw atoms out on the other)...


One solution is to collect a payment when you buy a new gadget, then you get it back when you return it for recycling.

At the moment, recycling technology can already grind stuff into small particles and float it in salt solutions of different densities to separate it. That doesn't go down to the atomic level, of course...

I think it's a fundamental outcome of the laws of entropy that separating always takes a lot more energy than mixing, so if you've mixed it, you've lost the game already.


We do this in Finland for bottles and cans. Giving people a financial incentive to recycle has two effects:

- Those who want to keep their money take the containers back themselves.
- Those who don't care enough leave them somewhere, and there are plenty of "recyclers" happy to pick up your cans and bottles and collect the deposit for them.


Yeah, we do that in most US states too. At our dump (disposal area) there is a station to donate your bottles and cans to local charities and schools.


> One solution is to collect a payment when you buy a new gadget, then you get it back when you return it for recycling.

That's an excellent idea.


This is a good idea - and what they already do in the auto industry for parts that can be re-machined and sold again.


NO! I own the battery. I can do what I like with it. I'm not paying for this. Ugh.


You would still own the battery and you can still do what you like with it.


Today's landfills are tomorrow's mines.


And archaeological dig sites.


If I recall correctly, there are companies in some areas already doing this: landfill mining and reclamation (LFMR).


razster, you seem to be hell-banned.


If batteries like this really are used in cars, cars will be, by far, the largest source of spent batteries. And it's pretty tough to just throw away an 800-lb battery. It's the same reason that old car engines are recycled, not trashed.

Right now Nissan will replace the battery in your Leaf. But they _require_ the old one since its value is built into the cost of the replacement.


Where I live, at least, I'd have to make a special trip between 9 and 5 on a weekday to a battery recycling facility. And when I'm free on a weekday, I'm not spending that time schlepping batteries across town. If batteries were included in curbside recycling pickup, you'd get the same high rates you do with paper, plastic, glass, and metal.


There is a Penn and Teller skit where they go to people's homes and ask about curbside recycling. They ask the homeowners if they'd be willing to have an x-colour bin for y, and they continue on and on until the homeowners have 15 bins in front of their house. Although most of the homeowners agree to have that many bins, I think their point was that we can't have a rainbow of bins in front of our homes.


Then take them with the plastic/metal/glass, which needs to be sorted anyway, and sort out the batteries when you do the rest. IIRC the first 90% of the sorting is done by density anyway, and I'm sure that technique would be pretty effective at separating batteries (very dense) from milk cartons, tin cans, and takeout containers (less so).


My family in The Netherlands has 3 bins.

Paper, plastic and other waste, and greens (compost).

They also get charged per pickup, so it is in your best interest to separate as much out as possible/flatten it, and make as little waste as possible in the first place.


In NL, this differs per municipality, btw. Where I live, garbage is not separated at the curb; I have heard that running separate transport chains (garbage pickup, etc.) would cost more energy than separating it at the dump site does.

Also, regarding batteries, almost all supermarkets (which are generally within walking distance in NL) have a disposal bin for batteries.


Ottawa has four: Paper, plastic and glass, compost, and other waste. We also have days for yard clippings, christmas trees, and old mattresses, as well as "give away" days for when something isn't garbage.


In Denmark you just put a clear plastic bag of batteries on top of your bin. You really don't accumulate enough batteries to justify a whole bin for each house.


In the United States, when you buy a new lead-acid car battery, they will charge you $10-$15 extra unless you turn in your old one (this is referred to as a "core charge"). The infrastructure to recycle these types of batteries already exists at parts stores; I'm sure it could be extended to li-ion ones as well, if they made their way into cars on a large scale.


It is a big issue, but these facilities are becoming more common. One trick that works for me: put used batteries inside an empty bottle; when the bottle is full, take it to the facility.


Unless you live in San Francisco, municipal battery recycling programs consist of transporting batteries to the landfill for you.


Whatever happened to the trash-to-syngas plant supposedly operating in New Jersey? How would that handle lithium batteries?


I assume they already separate out valuable materials like aluminum.


Well, that would require a lot of energy!


Not necessarily. A lot of our trash consists of complex molecules. If those molecules store more chemical energy than the simpler elements/compounds we get out, then, in theory, the process could even be energy positive.


In theory it may not be that much. There's no reason you can't reuse the same few joules to split several molecules, capturing the energy again for the next batch.

One probably can't get a fully reversible process, but simply adding up the energy needed to break down all the trash can overestimate by several orders of magnitude.


Achieving nuclear fusion should provide us with the energy.


Yeah, that's what I was about to say: fusion, solar, and other centralized energy tech (if we ever deploy them widely) should let us trade more energy for more of the materials we need, and then, as other commenters have said, the trash piles of the world start to look like just a bunch of raw materials.


> But Goodenough is equally dismissive of such tinkering and its measly 7% or 8% a year in added efficiency.

7% extra efficiency per year means electric cars will have twice the capacity in 10 years...and then double up again in 10 more years...and so on. That's a far greater rate of improvement than for gasoline-powered cars, even if we're impatient and want our $10,000, 500-mile-on-a-charge EVs now.
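A quick sanity check of that doubling claim, using nothing but compound growth (my own arithmetic, not from the article):

  # 7-8% per year compounded over a decade
  for rate in (0.07, 0.08):
      print(rate, round((1 + rate) ** 10, 2))  # ~1.97x and ~2.16x after 10 years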

> But the path he has chosen involves one of the toughest problems in battery science, which is how to make an anode out of pure lithium or sodium metal. If it can be done, the resulting battery would have 60% more energy than current lithium-ion cells.

I don't know how "real" its technology is, but SolidEnergy promises a 50 percent increase in energy density using an "ultra-thin metal anode". The company promises commercialization for phones in 2016 and for EVs in 2017.

http://www.solidenergysystems.com/technology.html


> I don't know how "real" its technology is

He's got an amazing track record.

> but SolidEnergy promises a 50 percent increase in energy density using an "ultra-thin metal anode". The company promises commercialization for phones in 2016 and for EVs in 2017.

The problem with battery tech is that it is a bit like solar: every year there are tens of announcements like that, usually not from parties that have had such breakthroughs in the past. As the article notes, since better battery tech would be such an incredible breakthrough, there is no shortage of people who would use that promise to their benefit absent any actual science.

So for battery and solar breakthroughs my personal hurdle is 'show me'; until then I will happily wait on the sidelines (not as if it would matter in any way if I didn't). But I'll give this particular scientist a break and say that I will be much less surprised if he turns out to be the one to nail the next substantial increase in battery capacity. I hope the result will be as safe to use as Li-ion or LiPo. We're getting to the point where the chemistry of explosives and the chemistry of batteries are quite comparable, and that's one reason why this is such a hard problem. Making a better battery that is also stable is the hard part. And LiPo is pushing it there; those are not batteries I'd want on my person or in a spot where they could cause a lot of damage when they go.


> 7% extra efficiency per year means electric cars will have twice the capacity in 10 years...and then double up again in 10 more years...and so on.

An example of the dangers of extrapolating.

CPU speeds kept going up - until they didn't.

Battery tech is not going to keep going up 7% each year. Although perhaps his 60% improvement will arrive at exactly the right time to effectively be 7% better than the previous year.


We are actually not so far from hitting the ceiling with batteries. There are fundamental physical reasons why you can't put much more than 1 eV per atom into a battery. That equates roughly to 850 Wh/kg, if memory serves, and we are currently pushing Li-ion towards 300 Wh/kg. So we can only keep growing at 5% for another couple of decades or so, regardless of battery chemistry.

What people often do to hide this fact is talk about volumetric energy density, Wh/L, where growth can continue for longer, at the expense of making batteries heavier.
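For the curious, here's a back-of-envelope version of that 1 eV/atom ceiling. The ~30 g/mol average atomic mass (averaged over every atom in the cell, not just the active lithium) is my assumption, not the poster's:

  eV = 1.602e-19          # joules per electron-volt
  N_A = 6.022e23          # atoms per mole
  avg_molar_mass = 0.030  # kg/mol, assumed average over all atoms in the cell
  wh_per_kg = (N_A / avg_molar_mass) * eV / 3600
  print(round(wh_per_kg))  # ~890 Wh/kg, the same ballpark as the quoted 850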


Gasoline manages about 12 kWh/kg, and I'm sure I've heard people talking about batteries at least approaching that density, even if only on a theoretical basis.

Leaving aside how safe I'd feel with a supercap of that density under the hood, are you sure about the ~850 Wh/kg limit? I'd be very interested in seeing a reference.


The 1 eV per atom is really a rule of thumb, but it's easy to understand: at that value you are getting dangerously close to ionization energies (13.6 eV for hydrogen, 5.4 eV for lithium), so you can no longer have a chemical battery. Note also that this is the average over all atoms in the battery, so the atoms actually providing the electricity will be carrying around 1.5-2 eV.

See e.g. here: http://www.ohio.edu/people/piccard/radnotes/radioactive.html


Found this article discussing some limits http://chargedevs.com/features/three-of-a-kind-polyplus-reac...


CPU clock speeds stopped increasing because the market stopped focusing on clock speed (coincidentally for this discussion, the change was largely about reducing power demands). Transistor counts have kept plugging right along:

http://i.imgur.com/FvVrnn4.png


I think it's the other way around? That is, the market stopped focusing on clock speed because manufacturers hit a wall and clock speed could no longer be used as a differentiating factor.


Yes, the market would be more than happy to have 100 GHz CPUs by now; it would make scaling everything much easier.


I think it is misleading to say it was about reducing power demands. Rather, it was about needing to cool the damned things. If you had a 500 W TDP chip, you would need either a crazy huge cooling system or a much bigger surface area. Consumers want light and thin, so Bob's your uncle.


Even if consumers were fine with big, heavy things (and except maybe for phones, I don't think CPU weight/size is an issue for consumers), light travels roughly 1 ft/ns. You want fast, you have to be small.
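To put rough numbers on that (mine, not the poster's):

  # distance light covers in one clock period
  c = 3.0e8  # m/s in vacuum; signals in silicon are slower still
  for ghz in (3, 100):
      period = 1.0 / (ghz * 1e9)
      print(ghz, "GHz:", round(c * period * 1000, 1), "mm per cycle")  # ~100 mm at 3 GHz, ~3 mm at 100 GHz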


Surprised by the negative reaction... More technically, increasing clock speed required far more electricity, and that extra energy had to be dissipated as heat through fans and heatsinks.


SolidEnergy looks like it has a solid team behind it as well.

http://www.solidenergysystems.com/people.html


Having done a little science myself: usually there are both small incremental improvements with small but predictable gains, and large leaps with big potential gains but very unpredictable outcomes. The former are optimizations around the current local optimum, and the latter try to escape it towards a better local optimum. For that reason the incremental gains tend to saturate over time, and I suspect that is what Goodenough is dismissing.


A note for those who go looking for the technical details: there aren't many here. According to the article, Dr. Goodenough is being tight-lipped about his work.


I expect a note in the margin of one of his journals, discovered posthumously, "What a simple and elegant anode, the margin is too small to contain it."

I get that he is angry, but I worry that his anger may rob us of what he could do.


At the same time, he knows the power of copious notes proving the origination of the idea, so hopefully it won't be that bad.


Given what happened before you really can't fault him.


How much would he have gotten in royalties anyway? I don't think there are any billionaires who got there just through patent royalties, right?


It's rare for individuals, because a patent costs ~$25k to file and maintain, and the process takes ~5 years. Patents are narrow and taken out very early in the inventive process, so they're prone to being irrelevant or silly. The really valuable patents, the fundamental ones, are particularly risky. All of which means that the mass insurance of corporate sponsorship is the only way for inventors to profit (marginally, and as a group).


The inventor of the B&D Workmate has earned a 3% royalty on 55 million sales. That's almost certainly over $100m. It's very rare to make a lot of money on a patent, but for something popular, let alone ubiquitous, it's entirely possible.
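Rough math behind that figure; the per-unit price is my own guess, not a reported number:

  # 3% royalty on 55 million units at an assumed average price of ~$65 each
  print(0.03 * 55e6 * 65)  # ~$107 million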


> Without it, we would not have smartphones, tablets or laptops, including the device you are reading at this very moment.

Yes we would. All these things existed without Li-ion. The power/space budget would have been more constrained, but it was perfectly achievable. What wouldn't work is senselessly burning cycles running managed code in a VM with heaps of battery-sucking DRAM. People today have no concept of how much computing power is wasted as excess heat because of modern software development practices.


> People today have no concept of how much computing power is wasted as excess heat because of modern software development practices.

Or of how many new ideas are created because generous power/space budgets give us the luxury of developing quickly with abstractions higher than hand-optimized references and pointers. You can take a hit on optimal energy efficiency if it results in a larger ecosystem with more possibilities.

The real challenge is to get both.


But high level languages do not require VMs. Time for a rant.

We were led down a collective rabbit hole by rabid VM and JIT enthusiasts. Notice the absence of anyone extolling the benefits of JIT compilation today. People used to claim that at some point Java would consistently outperform C due to the greater number of optimisations available to a JIT compiler. They've all gone silent because it's quite obviously a load of baloney.

The irony is that we had the answer all the time, but nobody wanted to believe it.

Back in the day, languages like Haskell, early C++, early Objective-C and Scheme compiled to C. Compiling to C was great! No need to spend man-centuries building an optimising cross-compiler to compete with GCC (and fail), no need to screw around with GIMPLE.

In 2015, compiling to a mid-level language is back in style. Except we don't call it C, we call it LLVM-IR. If you squint it's the exact same thing we were doing 15+ years ago.

LLVM-based languages that look like Rust and Swift will eventually dominate, because they are universal. You can use Rust for the lowest level embedded programming. You can use Swift for the highest level architecture astronautics.

What is depressing is that we could have built them 15 years ago but we were too busy fannying around with a dead-end technology.


Also code produced by LLVM-based languages can be linked with low-level modules written in pure C/C++, or even assembly, if necessary. So one may quickly prototype and then slowly rewrite the software for performance, module by module. No such thing is possible for JIT-based languages or platforms.


As an embedded engineer, you and parent have put into words something that I've known but haven't been able to articulate.

I think I've encountered the sentiment (on HN) before that, because CPU and memory speeds have grown so far out of whack and because increasing CPU speed runs into power limits, engineers will have to start worrying about optimization more and more. I suppose the difference is that before, we didn't have the boat-loads of memory (in a system) that we have now.

In other words, yay for LLVM IR!


But I think it is energy efficient already. The modern smartphone replaces the radio, TV, newspaper, store, school, and workplace; you don't need a clock, a Walkman, or a map any more. All of these consumed energy or cost money to buy, and they are heavy to carry around.


Rubbish! The CPU isn't the power hog that you think it is in smartphones, nor are the programming techniques. Things like the screen and the radio consume most of the power.


For most use cases, yes, but there are some CPU-intensive (usually graphics-intensive, so maybe GPU-intensive) processes that will eat up your phone's battery like crazy. I mostly find this in games: Ingress is the worst offender I've dealt with, but Kingdom Rush (which uses no GPS or radio) drained the battery much more quickly than just having my screen on for, e.g., reading.


I still fondly remember how long my original Newton MP100 lasted on 4 AAA cells... an order of magnitude longer than my iPhone 6 lasts on a charge.


And the original Macintosh Portable could get 12 hours on a charge… of its 2lb lead-acid battery.


We have devices nowadays that put battery economy high on their list of priorities. Kindles are an example. Their common features: they tend to be single-purpose devices, they tend to have crappy displays like e-ink or black-and-grey LCD, and to the extent they are general-purpose, they aren't effective or much fun.

So they would never have been tablets or laptops. They would have stayed as e-readers and PDAs and so forth.


> Their common features: they tend to be single-purpose devices, they tend to have crappy displays like e-ink or black-and-grey LCD, and to the extent they are general-purpose, they aren't effective or much fun.

I like the fact that Kindles have e-ink and are single-purpose. When I read on a backlit screen, it tires my eyes and it's easy to get distracted from your book by SMS, etc. E-ink is perfect for me, for distraction-free reading.


If it were possible to use a lot less power by programming that way, without any other tradeoffs, there would probably be a market for a device with a much longer battery life.


I agree, and hope new high-level but efficient languages like Rust could help here.


Kind of off-topic, and I'm sure I'm not the only one thinking this, but is it common to have clarity of mind at age 92? Is it mostly genetic or is it more like "keep challenging your mind and get enough sleep"? Anyone have good links?

At 29, I'm (probably prematurely) worried about cognitive decline. I only started really challenging myself last year.


(1.) The plasticity of the brain changes, which means that it takes more time to learn new things and change your mind. Many other abilities remain at a high level.

(2.) You may look up a chart that plots dementia against age. Prevalence of dementia increases at an accelerating rate as you get older.


I think one thing you can do is try to avoid a typical fallacy. My brother gave me a great example:

- When people get older (say around 30-40), they realize that they can no longer memorize stuff quite as well as in their twenties

- They start to write notes a lot more in order to make sure they don't forget stuff

- Now they get even worse at memorizing stuff, because they challenge their memory a lot less


There is some brain plasticity research that seems to indicate that the more you challenge your brain with new activities and problem sets, the more resilient it is against decline in old age.

Do you do a lot with your mind during the day? Take up wood working, knitting, or sewing as a hobby. Work with your hands.

Do a lot with your hands during the day? Do something that engages your mind more. Take up art, writing, etc.

Take dance lessons. If you are not a dancer already, learning to dance and then doing it has a lot of health benefits, social benefits (see below) and makes your mind work in different ways.

I highly recommend trying square dancing (as corny as it may sound). Square dancing is very intricate and requires careful attention to the called cues in order to maintain a steady stream of dance transitions. It is the fastest way to get into "flow" that I have ever experienced, and I have heard a number of other writers and technical types who rely on flow express the same sentiment.

Learn, teach, and play more board games. Modern board games have a vast array of different mechanics and strategies, offering lots of different problem spaces for your brain to tackle. There is also a good social aspect to them that is inter-generational (we have board game nights where ten-year-old kids regularly play with retirees), and some decline in cognitive ability is linked to the lack of social engagement people experience as they age.


Avoid vascular disease to avoid (most) dementias. Much of the rest appears to be down to zest for life. Source: me (MD)


> "Goodenough in his lab at the University of Austin."

There is no University of Austin, it's the University of Texas at Austin, or "UT Austin" - same as there is no "University of Berkeley", but instead, "Cal". ;)


University of Austin sounds like a Portlandia-style parody of hippies and alt-culture types putting on classes in old abandoned buildings and alleyways and living in fear of being found out by UT security.


You could always throw back to "Road Trip" with Tom Green:

  "Austin, not Boston"
;)


The QZ article gets much of its information from this March 2001 oral history interview:

http://authors.library.caltech.edu/5456/1/hrst.mit.edu/hrs/m...

His recollections are very detailed, and you can clearly see what a genius John Goodenough is. Many early computing and defense industry interviews have a similar feel to this one; it is wonderful to read and be inspired by material like this.


Wish society would incentivize more smart people to tackle these important problems rather than go into finance, management consulting, working at the next hot tech startup, etc.


How would this work in practice?

Maybe some form of corporate patronage, like Facebook and Stripe did in this story, and giving the inventor a generous slice of the returns from the invention (I can imagine several downsides to this too, of course):

https://news.ycombinator.com/item?id=9003791


The problem is that investments in the future and public goods will always be underfunded in a free-market capitalist economy, due mainly to the free-rider problem. Fixing it would have to involve the government (or a government-like entity) subsidizing these projects. Of course, care would have to be taken to make sure the money is properly allocated, the incentives are properly aligned, etc.


I cannot decide whether that SoundCloud laugh was completely distracting or whether it improved the storytelling.


Completely improved. That's what we want our Mad Scientists to sound like, dammit.


But.. the one we had was Goodenough.. yeahhhhh


tl;dr version:

> Although Goodenough will not spell out his precise new idea, he thinks he is on to something



