In my field, we always want to see what will happen when DNA is changed in a human pancreatic beta cell. We sort of have a protocol for producing things that look like human pancreatic beta cells from human stem cells, but we can't be confident they will behave like real beta cells for any particular DNA change, and we have examples of cases where they definitely do not behave the same.
Yeah, I currently have a VPS with various SSH port forwards allowing me to direct incoming connections of various types to my home computer, which is behind NAT. It's evil and horrible and nasty for various reasons, not least of which is that all your incoming connections look to your inner server like they come from the same IP address, preventing you from logging or filtering the source of any request. And if you forward incoming connections to your SMTP server, you need to make sure it doesn't treat them as local trusted connections that it can relay onwards, turning your setup into an open relay.
Seriously thinking about switching to a setup similar to the one in the article. I mean, my setup works for now, but it's un-pretty.
They're running trains. Trains use a lot of electricity, and they turn almost all of it into heat. You'd have to have as much chilling capacity as the current electricity demand of the entire tube line, which is quite a lot.
However, if the buildings above were to sink ground source heat pump loops into the warmed ground to heat the buildings in winter, this would basically be what you just suggested, and would be a win-win situation.
Huh? Modern modular air cooled chillers go up to 800 tons each and can remove multiple MW of heat load continuously pretty much 24/7.
500 of them could remove 1.4 GW of heat.
Of course there are many ways to improve efficiency, but even assuming the worst case it’s still technically feasible to remove many times more heat than the line generates.
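Rough sanity check on those chiller numbers, if anyone wants it. The only assumption beyond the figures above is the standard conversion of roughly 3.5 kW of heat removal per ton of refrigeration:

    # Sanity check on the chiller figures above; ~3.517 kW of heat removal
    # per ton of refrigeration is the standard conversion.
    tons_per_chiller = 800
    kw_per_ton = 3.517
    chillers = 500
    total_gw = chillers * tons_per_chiller * kw_per_ton / 1e6
    print(total_gw)  # ~1.4 GW of heat removal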
And where is the energy to power all of them coming from?
GP is saying that you can approximate the energy going into the system by looking at the electricity consumption of the trains, as all that energy is eventually going to end up as heat.
A heat pump can have a CoP topping out at 5, meaning 1 unit of energy moves 5 units of heat out. So a "net zero" cooling system would consume a minimum of 20% as much energy as the trains themselves. Realistically it's probably closer to a CoP of 3.5, so nearer 29%. For something like the Underground that's going to be a 5-10% increase in their operational costs at a minimum. Where does the funding for all that come from? And that's before we even look at the capital costs of heat pumps and the various ancillary equipment needed to run them.
As a point of reference, TfL underground trains have an average power consumption of 140MW continuous. Now only about 45% of the Underground is actually underground, but that's still 63MW in just the underground parts. At an optimistic CoP of 5, that means 12.6MW of additional power draw needed to cool the tunnels using your approach.
Wholesale electricity prices in the UK are something like 7p per kWh. At that price, pumping out all 140MW of the trains' heat at a CoP of 5 (28MW of cooling) would be an additional £17m of electricity over a year, just for cooling.
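For anyone who wants to check the arithmetic, here's the rough working. The CoP values, the 7p/kWh price and the 45% underground share are the figures quoted above, so treat them as ballpark assumptions rather than TfL's actual numbers:

    # Rough working for the figures above.
    train_power_mw = 140        # average continuous draw of the trains
    underground_share = 0.45
    price_gbp_per_kwh = 0.07    # wholesale
    hours_per_year = 8760

    for cop in (5.0, 3.5):
        tunnels_mw = train_power_mw * underground_share / cop   # cooling power, tunnels only
        whole_line_mw = train_power_mw / cop                     # cooling power, all train heat
        whole_line_cost = whole_line_mw * 1000 * hours_per_year * price_gbp_per_kwh
        print(cop, round(tunnels_mw, 1), round(whole_line_cost / 1e6, 1))
    # CoP 5.0: 12.6 MW for the tunnels, ~GBP 17.2m/yr to pump out all the trains' heat
    # CoP 3.5: 18.0 MW for the tunnels, ~GBP 24.5m/yr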
That's £17M of electricity, not including all the equipment, staff, installation costs etc.
To put this in perspective, Sadiq Khan already runs the system at a massive loss. They just about cover their operating costs at the moment, and rely entirely on grants from the rest of the country to do upgrades of any kind. So cooling efforts have to be very cost efficient. Also, the UK grid is very supply constrained. New nukes are being built, but there have been the expected massive cost overruns and problems, so any large new energy demands in the UK just aren't happening anytime soon. The country has actually been deindustrializing due to very high electricity costs.
Yes, doing things at scale does cost lots of money. That's often why doing the "obvious" is the wrong approach. When you're operating at that kind of scale, small savings are still substantial, and scale often makes more innovative, less obvious solutions a better pick. Especially given the cost of exploring those options is so cheap compared to the total cost of the project.
Also, the Underground is funded pretty much entirely by fares. Past UK governments have cut taxpayer subsidies for TfL's day-to-day operational costs to zero, and there's zero indication that's going to change any time soon.
I'm pointing out how shallow and unhelpful your comment is. I provided an analysis of why a seemingly obvious solution has serious problems; you made an unhelpful comment suggesting that commenting on the cost of a project means I'm confused about its political acceptability, rather than recognising that I'm trying to demonstrate why the obvious solution might have some problems. That has nothing to do with political acceptability; it's simply a comment on the poor value for money that specific solution represents.
Me too. I was hoping for an explanation of why software I've got used to, which works very well and isn't broken, keeps being removed from the next Debian version because it is "unmaintained".
Absolutely. A straw bale hanging from a bridge over a river is an anachronism. It doesn't normally belong there, and it's more likely to be noticed than a literal notice.
Thankfully the title is slightly clickbaity. The research isn't actually being deleted at source - it is just no longer being offered by these libraries.
I disagree. Extensive compilations of other scientific research are research themselves.
I can't help but feel like this is the start of a modern equivalent of what happened to the Institut für Sexualwissenschaft, now masked by procedure and law.
Although, since the GNSS satellites use directional antennae pointed at Earth, this experiment only picked up signals from satellites on the far side of Earth that were close enough to its edge for some of that directional signal to leak past the limb and reach the Moon. So the satellites that are nearly 4° away from the centre of Earth cannot be detected, because they are beaming their signal nowhere near the Moon, and the angular region in which satellites are actually detectable is much smaller than the 7.896° the constellation spans.
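Rough working behind those angles, if anyone wants to check. The orbital radius and Earth-Moon distance below are typical round values, not mission-specific figures:

    # Angular radius of Earth and of the GNSS orbital shell as seen from the Moon.
    import math

    moon_distance_km = 384_400      # mean Earth-Moon distance
    earth_radius_km = 6_371
    gnss_orbit_radius_km = 26_560   # GPS semi-major axis (~20,200 km altitude)

    def angular_radius_deg(radius_km, distance_km):
        return math.degrees(math.asin(radius_km / distance_km))

    print(angular_radius_deg(earth_radius_km, moon_distance_km))       # ~0.95 deg
    print(angular_radius_deg(gnss_orbit_radius_km, moon_distance_km))  # ~3.95 deg, so ~7.9 deg across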
Pumped hydro is a form of gravity battery. It doesn't have great energy density, but it has fantastic power density and responsiveness. That's where its strength lies. We have also probably already built most of the ones that could possibly be built.
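To put rough numbers on the energy-density point (the 500 m head below is just an illustrative figure, on the generous side for most sites):

    # One tonne of water over a 500 m head stores only ~1.4 kWh.
    g = 9.81          # m/s^2
    head_m = 500      # height difference between reservoirs (illustrative)
    mass_kg = 1000    # one tonne, roughly one cubic metre of water
    energy_kwh = mass_kg * g * head_m / 3.6e6
    print(energy_kwh)  # ~1.36 kWh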
There are 86 large pumped storage sites in the world. It does work, but you need the right geography.
You need two good reservoir sites at considerably different levels close to one another. That's somewhat hard to find.
When I was taught it at university, I remember the lecturer commenting on what sort of sick and twisted mind could come up with such a ridiculously convoluted notion.
Wheeler was also one of the inventors of the "closed subroutine", AKA the function, which had to be implemented via a hack because machines of the time had no ISA support for "return".
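The trick was essentially self-modifying code: the caller leaves its own return address in the accumulator, and the subroutine writes a jump back to that address over its own last instruction. Here's a toy sketch of the idea; the machine, mnemonics and program are invented for illustration and are not EDSAC's actual order code:

    # Toy sketch of the Wheeler jump on an invented accumulator machine.
    def run(program):
        mem = list(program)   # instructions live in writable memory
        acc, pc = 0, 0
        while True:
            op, arg = mem[pc]
            if op == "LOADRET":   # acc := address just past the following jump
                acc = pc + 2
                pc += 1
            elif op == "PLANT":   # overwrite mem[arg] with a jump to acc
                mem[arg] = ("JMP", acc)
                pc += 1
            elif op == "PRINT":
                print(arg)
                pc += 1
            elif op == "JMP":
                pc = arg
            elif op == "HALT":
                return

    SUB = 4  # where the subroutine starts

    program = [
        ("LOADRET", None),          # 0: acc := 2, the return address
        ("JMP", SUB),               # 1: "call" the subroutine
        ("PRINT", "back in main"),  # 2: execution resumes here
        ("HALT", None),             # 3
        ("PLANT", 6),               # 4: subroutine entry: plant the return jump
        ("PRINT", "in subroutine"), # 5: subroutine body
        ("JMP", 0),                 # 6: placeholder, overwritten by PLANT
    ]

    run(program)  # prints "in subroutine", then "back in main"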
And you can do tricks such as lucky imaging or active optics (depending on your budget) to further improve the resulting resolution. Lucky imaging is tricky on something as dim as Andromeda, but has been shown to be just about possible.
I haven't seen lucky imaging used on dim objects by anyone I know. I personally do not have a large enough aperture to collect enough light for that. But I've used it on bright planets before via AutoStakkert[1]: https://www.astrobin.com/full/06dzki/0/
Lucky imaging was always a tool for use on planets and the moon. Anything bright.
It's hard to do dim objects because there's less for the software to inspect in each frame to determine the luckiness and distortion, but you can maybe use fortuitous bright stars in the frame to index off. You also need to collect a huge number of images to get any sort of signal to noise ratio. This video is an example of the technique actually used on a dim object, though the results were fairly modest because of murky British skies.
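For anyone curious what that actually looks like in code, here's a bare-bones sketch of the select-align-stack idea. `frames` is assumed to be a list of already-loaded 2-D numpy arrays of identical shape, and real tools like AutoStakkert do far more (multi-point alignment, quality maps, drizzle):

    # Bare-bones lucky imaging: score frames for sharpness, keep the best few
    # percent, align them and average.
    import numpy as np

    def sharpness(frame):
        # Variance of a simple Laplacian as the "luckiness" score: sharper
        # frames have more high-frequency energy.
        lap = (-4 * frame
               + np.roll(frame, 1, 0) + np.roll(frame, -1, 0)
               + np.roll(frame, 1, 1) + np.roll(frame, -1, 1))
        return lap.var()

    def align_to(ref, frame):
        # Whole-pixel alignment via FFT cross-correlation; with a dim target
        # this is where a fortuitous bright star in the frame helps.
        corr = np.fft.ifft2(np.fft.fft2(ref) * np.conj(np.fft.fft2(frame)))
        dy, dx = np.unravel_index(np.argmax(np.abs(corr)), corr.shape)
        dy = dy - ref.shape[0] if dy > ref.shape[0] // 2 else dy
        dx = dx - ref.shape[1] if dx > ref.shape[1] // 2 else dx
        return np.roll(frame, (dy, dx), axis=(0, 1))

    def lucky_stack(frames, keep_fraction=0.05):
        best = sorted(frames, key=sharpness, reverse=True)
        best = best[:max(1, int(len(frames) * keep_fraction))]
        ref = best[0]
        return np.mean([align_to(ref, f) for f in best], axis=0)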