zidel's comments

The requirement is for the government to produce an impact statement (including carbon emissions in foreign countries and emissions from e.g. combustion) as part of the plan for development and operations that is presented to parliament for approval. This follows from the Supreme Court decision in 2020.

There is also a temporary injunction on further developments and decisions related to these three oil fields until the validity of the plan has been decided.

The District Court ruling was appealed by the government yesterday.

Edit:

Machine translated judgement: https://www.greenpeace.org/static/planet4-sweden-stateless/2...

Court documents (mostly in Norwegian, some witness presentations in English): https://www.greenpeace.org/norway/dokumenter-fra-oslo-tingre...


One of the richest countries is considering leaving oil, and by extension money, in the ground.

If only they could make a deal with Canada to swap these oil fields for leaving tar sands in the ground, that would be even better.

Lots of countries are going to have to make this selfless decision if we're going to get a handle on climate change. I don't want to guess what the odds of that are.

Technology is probably still our best hope.


It's bullshit. Us (Norway) not pumping oil doesn't change anything. The problem isn't pumping, the problem is burning. We need to reduce the demand, not the supply. Reducing supply from Norway just makes other suppliers scale up, and the end result is exactly the same. Until they run out and we pump up this oil anyway, because the world needs oil and nobody's addressing that part of the problem.


Your argument doesn't make any sense. If they pump it out of the ground, someone WILL burn it. If they don't, it can't be burned.

Also: less oil -> higher price -> less burning


You don't understand my argument. If they don't pump it, someone else will pump more to compensate and we'll end up pumping it up later anyway.

I don't think there are going to be transport ships docked for lack of fuel just because Norway doesn't develop some oil fields. And once supply some day drops to the point where the oil price skyrockets, we'll probably end up developing them then: they'll revisit the case, circumstances will have changed (or maybe just the politicians), and they'll suck it up anyway.

Or maybe US/Russia/whoever will come liberate it. I doubt that's happening any time soon, but it's definitely a real possibility in a future scenario where resources are dwindling.


> If you don't pump it, we'll end up pumping it up later anyway.

Well just give up now then.

Let's assume we can have enough sense to decide not to pump it and leave it that way. Then it's a win.


Yeah but pointless performative self sacrifice is kind of our thing.


> Does this also share back DHT information like a "server"?

No, it ignores all incoming requests. You don't need special software to help out: just run any normal BitTorrent client in the background (no need to download or share anything) and it will help. Just make sure you forward the right port if you are behind a NAT. Traffic will slowly increase over time and drop quickly when you go offline, so leaving it up for days at a time is better when possible.


A 2022 study experimentally found the limit to be 30.5 °C ± 1 °C at 100% humidity, and lower in dry environments.

https://journals.physiology.org/doi/full/10.1152/japplphysio...


> Car commuters miss commuting the least: 55% of this group do not miss it at all. Commuters by (e-)bicycle are the group who miss commuting the most, with 91% missing at least some aspects of commuting. As might be expected, the feeling of missing commuting also decreases with increasing commute duration (Fig. 5).

> The connection between work and commuting is evident in the relationship between missing commuting and the intention to work from home in the future (Fig. 7). Among those who do not miss the commute at all, 72% express desire to work more from home in the future. Among those who miss commuting a lot, 69% would like to go back to their previous work routine.

https://urbanstudies.uva.nl/content/blog-series/covid-19-pan...

This matches the numbers that have been reported in Norway as well, where the percentage that misses their commute is roughly 80% for pedestrians/cyclists, 50% for public transport and 30% for driving (https://www.nrk.no/norge/blir-palagt-hjemmekontor-_-mange-sa... in Norwegian)


Never thought about it this way, but my bicycle commute has always been 30-45 minutes and that's why I'm leaning 60-70% towards "I don't miss it". If it was only 20mins door to door I guess that would be different.


I'm surprised cars have it worse than public transport tbh. I guess it really depends on parking, etc.


For information specific to Norway (in Norwegian) there is a lot here:

https://trv.banenor.no/wiki/Forside

https://www.jernbanekompetanse.no/wiki/Forside



I think BitTorrent has all the pieces needed for a fully distributed version of your idea. My initial thought is that you could publish a magnet link that points to a mutable DHT item, which in turn points to a torrent that has a JSON file with some metadata and a list of infohashes the publisher cares about. The client could then scrape the "leaf" torrents from multiple lists to get the peer counts and use that for local prioritization of what to store. By reusing existing torrents you could then share resources with standard torrent clients that are unaware of your system.
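To make the idea concrete, here's a rough sketch of what the JSON metadata file inside such a "list" torrent might look like, and how a client could pull out the info hashes. Every field name here is made up for illustration; nothing in this format is standardized:

```python
import json

# Hypothetical list format; all field names are invented for this sketch.
publisher_list = {
    "name": "example archive list",
    "version": 1,
    "torrents": [
        # v1 info hashes are 20 bytes, written here as 40 hex chars
        {"infohash": "aa" * 20, "note": "dataset-2021"},
        {"infohash": "bb" * 20, "note": "dataset-2022"},
    ],
    # Nested lists: magnet links to other list torrents could go here
    "lists": [],
}

def parse_list(raw: bytes) -> list[str]:
    """Extract the info hashes a publisher cares about."""
    data = json.loads(raw)
    return [t["infohash"] for t in data["torrents"]]

encoded = json.dumps(publisher_list).encode()
print(parse_list(encoded))
```

The client would then scrape each of those info hashes for peer counts and prioritize accordingly, while the list torrent itself is just a normal torrent that any client can seed.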

The list idea could be extended to nested lists (e.g. stavros's list recommending the Internet Archive's list) for discoverability and composition.

If you go with v2 or hybrid torrents from the beginning you could deduplicate and cross seed files from different collections.

The lists could also be extended with torrents to exclude, possibly using some salt + rehash scheme to make the exclusion list hard to reverse into a plain list of content (e.g. CSAM) that you don't want to publish as-is.
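One way the salt + rehash idea might work (just a sketch, not a vetted scheme): the exclusion list publishes only H(salt ‖ infohash), so a client can check hashes it already has against the list, but the list itself doesn't enumerate the excluded content:

```python
import hashlib

def blind(salt: bytes, infohash: bytes) -> str:
    """Publish H(salt || infohash) instead of the raw info hash."""
    return hashlib.sha1(salt + infohash).hexdigest()

def is_excluded(salt: bytes, infohash: bytes, excluded: set[str]) -> bool:
    """A client re-derives the blinded hash for hashes it already knows."""
    return blind(salt, infohash) in excluded

salt = b"per-list-random-salt"   # published alongside the exclusion list
bad = bytes.fromhex("aa" * 20)
excluded = {blind(salt, bad)}

print(is_excluded(salt, bad, excluded))        # known-bad hash matches
print(is_excluded(salt, bytes(20), excluded))  # unrelated hash does not
```

Note this only raises the cost of reversal: anyone who already holds a candidate info hash can still test it against the list, which is exactly the property you want for exclusion.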

Feels like a neat project that could interoperate nicely with existing torrents.


Thanks, that's exactly the feedback I was looking for! This sounds like it would work, though I'd have to see if it would scale to thousands or millions of files. Still, great for a PoC, thank you!



I've played around a bit with DHT indexing recently, and a very simple Python program using libtorrent to send sample_infohashes requests (BEP 51) and download metadata (to get names/files) was enough to get me 1-2 .torrent files per second without any special effort or aggressive settings. The bottleneck (by about 10x) has been the info hash to .torrent metadata step, which is embarrassingly parallel, so speeding things up shouldn't be very hard.
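For anyone curious about the wire format: a BEP 51 response packs the sampled info hashes as one concatenated byte string in the `samples` field (libtorrent handles this for you, but a stdlib-only parser is tiny). A minimal sketch:

```python
def split_samples(samples: bytes) -> list[str]:
    """BEP 51 returns concatenated 20-byte v1 info hashes in the
    'samples' field of the response; split them into hex strings."""
    if len(samples) % 20 != 0:
        raise ValueError("samples length must be a multiple of 20")
    return [samples[i:i + 20].hex() for i in range(0, len(samples), 20)]

# Two fake sampled info hashes as they would appear on the wire
wire = bytes.fromhex("aa" * 20) + bytes.fromhex("bb" * 20)
print(split_samples(wire))
```

Each hex string from this step then becomes input to the (much slower) metadata-download step that fetches the actual .torrent.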

After running it sporadically for a few months I ended up with 1.4M torrent names and 30M info hashes, but I never put any work into estimating the size of the DHT, so I don't know what sort of coverage that represents.
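A common back-of-the-envelope way to estimate DHT size (not something from my crawl, just the standard density trick): if node IDs are uniform in the 160-bit space, the XOR distance from a random target to its k-th closest node is about k * 2^160 / N, so you can invert that from a single lookup's results:

```python
import random

def estimate_dht_size(target: int, closest_ids: list[int], k: int) -> float:
    """Estimate network size N from node density: the k-th closest
    node to a random target sits at XOR distance ~ k * 2**160 / N."""
    dists = sorted(i ^ target for i in closest_ids)
    return k * 2**160 / dists[k - 1]

# Simulate a DHT of one million uniformly random node IDs
random.seed(0)
n = 1_000_000
ids = [random.getrandbits(160) for _ in range(n)]
target = random.getrandbits(160)
closest = sorted(ids, key=lambda i: i ^ target)[:8]
print(round(estimate_dht_size(target, closest, 8)))  # same order of magnitude as n
```

A single lookup gives a noisy estimate (roughly 1/sqrt(k) relative error), so real measurements average over many random targets; with that, coverage of a crawl like the one above becomes easy to gauge.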


It looks like the decision includes scope 3 emissions, so in your example the refining would be included, as well as the emissions from burning the petrol.

