
Some countries (Poland?) have experimented with banning advertising in public spaces. Think billboards. This has led to very clean, good-looking cities. I don’t think it’s unreasonable to ban ads in other places too.

Here is the thread that likely sparked his resignation (if this is a resignation), and more specifically Linus’ response to Hellwig’s earlier comments regarding Rust and the Linux kernel.

https://lore.kernel.org/rust-for-linux/CAHk-=wgLbz1Bm8QhmJ4d...

(Edit to include the correct name)


Linus seems unusually reasonable and kind in that response.


It seems his leave of absence has borne fruit.

The old Linus would have been ticked off by even less, complete with name calling and personal attacks.


I agree – I'll need to look this up myself, but I'd love to know what his self-reflection on his annoyance involved. I admit that I still need to do the same to learn restraint during disagreements with new collaborators I haven't yet had the experience to build trust with. My anger has stopped me from working productively in informal standardization efforts within my org.


The only person who gets to have it both ways, it seems, is Linus. In the past he's railed against C++, saying it would be a cold day in hell before C++ code appeared in Linux.

To be fair, it's his tree. On the other hand, I am a little worried that Linus has been hypocritical in the past few years, but most of us get soft as we age, I guess.


What is both ways about it here? The argument against C++ was never about a multi-language codebase being unacceptable.


Linus seems to genuinely believe that Rust is superior to C++.

Having used both, I agree.


Aisler is pretty good for prototyping. Wurth also supplies PCBs (with fantastic quality).


Sometimes we need to give awesome tools to creative people and see what they come up with, even when we don't understand the implications ourselves.

I think millimeter-accurate GPS is one of those tools. It has the power to enable so many things – things we cannot imagine without using the tool itself.

40 cm vs 1 mm is the difference between landing a quadcopter smoothly or crashing it into the ground.

20 cm vs 1 mm is the difference between a robot navigating through a door or crashing into the wall.

20 cm vs 1 mm is the difference between mowing the lawn or cutting through your flower bed.

Unfortunately it doesn't look like we'll be getting millimeter-accurate GPS anytime soon. The Genesis satellite might be a prerequisite, though.


The government artificially limits civilian GPS resolution. Millimeter GPS would be a boon for Hellfire R9X-style explosive-free ballistics.


This hasn’t been true for 24 years. Selective Availability was discontinued in May of 2000.

https://en.m.wikipedia.org/wiki/Global_Positioning_System


Why reveal your entire hand?


I've been trying to figure out how end-to-end encrypted communication is supposed to work in these apps. From what I can gather you need two things: a central server, and public-key encryption. To start a conversation, your first task is getting the public key of your intended recipient. This is supplied by a central server that acts as a public-key repository and message relay/store. Then you can send your message by leaving it at the central server for later delivery to the recipient (encrypted with their public key). This is also the start of some form of key exchange, in the hopes of switching to symmetric encryption for future communication.

I see problems with this setup. The central server is responsible for relaying communication, since there is no direct link between those trying to communicate. It is also responsible for handing out public keys. It is literally a man in the middle. What is stopping the central server from lying about the public keys? What is stopping the server from decrypting everything?

Hopefully my understanding of this is wrong. It is certainly incomplete.
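
For what it's worth, here's a minimal sketch (TypeScript, Web Crypto) of the flow as I understand it. fetchPublicKey and relayToServer are hypothetical stand-ins for the central server's API; the usual answer to "what stops the server from lying" is that clients compare key fingerprints out of band (e.g. Signal's safety numbers):

    // Hypothetical server API; trust in the whole scheme hinges on these.
    declare function fetchPublicKey(user: string): Promise<CryptoKey>;
    declare function relayToServer(user: string, msg: ArrayBuffer): Promise<void>;

    // A key fingerprint both parties can compare out of band; if the
    // server handed out a fake key, the fingerprints won't match.
    async function fingerprint(key: CryptoKey): Promise<string> {
      const raw = await crypto.subtle.exportKey("spki", key);
      const hash = await crypto.subtle.digest("SHA-256", raw);
      return [...new Uint8Array(hash)]
        .map((b) => b.toString(16).padStart(2, "0"))
        .join(":");
    }

    async function sendMessage(recipient: string, text: string): Promise<void> {
      const pub = await fetchPublicKey(recipient); // the man-in-the-middle step
      console.log("Verify out of band:", await fingerprint(pub));
      const ciphertext = await crypto.subtle.encrypt(
        { name: "RSA-OAEP" },
        pub,
        new TextEncoder().encode(text)
      );
      await relayToServer(recipient, ciphertext);
    }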


Maybe you've been living under the same rock as me. I was under a similar impression until I decided to learn CMake last week.

I was blown away by its simplicity and power.

You can download almost any C/C++ library from GitHub, then add two lines to your CMake file. It will then compile and link with your project.
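
For the curious, those two-ish lines are CMake's FetchContent module. A sketch, where the library, URL, and tag are just examples, and which assumes the library's own CMakeLists exports a target:

    # Sketch using FetchContent (CMake 3.14+). The library, URL and tag
    # are examples; any CMake-based project with an exported target works.
    include(FetchContent)
    FetchContent_Declare(
      fmt
      GIT_REPOSITORY https://github.com/fmtlib/fmt.git
      GIT_TAG        10.2.1
    )
    FetchContent_MakeAvailable(fmt)
    target_link_libraries(my_app PRIVATE fmt::fmt)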

Blown away.

CMake can generate project files for Visual Studio, makefiles, and countless other formats. Better yet, you can open a CMake folder directly in Visual Studio. No need to generate a project :)

This comes as no surprise to anybody, I'm sure, but CMake is great.

It was tricky to find good information about it, though. It has been around forever, but the nicer features seem to have been added fairly recently.


STL works everywhere. It is trivial to parse, and it can contain both colors and units [0].

Is there anything stopping us from having multiple solids per file? If not, I don't see the reason for another format.

The mentioned benefit of having slicing settings in the file will not work. Slicing settings are not portable between machines. And not portable between different kinds of filament.

Can someone post the XKCD about additional standards? :)

[0]: https://en.m.wikipedia.org/wiki/STL_(file_format)
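
And "trivial to parse" is no exaggeration; here's a sketch of a binary STL reader (TypeScript, ignoring the nonstandard color extensions):

    // Binary STL layout: an 80-byte header, a uint32 triangle count,
    // then 50 bytes per triangle: 12 little-endian float32 values
    // (normal + 3 vertices) followed by a uint16 attribute field.
    function parseStl(buf: ArrayBuffer): number[][] {
      const view = new DataView(buf);
      const count = view.getUint32(80, true);
      const triangles: number[][] = [];
      for (let i = 0; i < count; i++) {
        const base = 84 + i * 50;
        const tri: number[] = [];
        for (let f = 0; f < 12; f++) {
          tri.push(view.getFloat32(base + f * 4, true));
        }
        triangles.push(tri); // [nx,ny,nz, x1,y1,z1, x2,y2,z2, x3,y3,z3]
      }
      return triangles;
    }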


> Is there anything stopping us from having multiple solids per file? If not, I don't see the reason for another format.

I think TFA offered a pretty compelling argument for why you should consider using 3MF for 3D printing, and I think you glossed over it:

> 3MF provides a clear definition of manifoldness — it’s impossible to create a 3MF file with non-manifold edges, and there is no ambiguity for models with self-intersections.

Even this is enough of a reason for me to prefer using a 3mf if available instead of having to fix holes in a godawful mesh editor. "STL works everywhere" is true only if you consider incidentally non-manifold STLs as an issue with the software that produced them and not the format itself.

I would like to add another technical detail that I don't think is included in the article -- 3MF uses curved triangular tessellations to encode geometry. This means more accurate representations of geometries and smaller file sizes even with high detail.


I would consider that an issue with the software. Non-manifold models, or self-intersecting models, are not suitable for 3D printing. This is an issue with the model rather than the file format.

If 3MF somehow makes self-intersection and non-manifoldness impossible, can't we run STL files through the same algorithm to end up with a "fixed" mesh?

What happens if you try to save a non-manifold model as 3MF? Will it magically fix the issue or will it fail to generate the file?


...yeah, I have to agree... XML is a bit trash, STL works, and I don't see how the file format itself can preclude invalid structures. And, as others have pointed out, the whole text-versus-binary difference means a massive slowdown and inefficiency...

The algorithms and the vector compression could likely be salvaged, but the whole positioning as a 'replacement' for STL seems like red-herring marketing, with lock-in to whatever stdlibs they are providing.

It would have been better as a standard binary format; there are a number of choices...



Does anyone know of a similar product? I'm particularly impressed by the built-in UPS and five-bay storage.


I've built a couple of NAS machines off Atom Mini-ITX boards, and in one case I supplied the mainboard and the two disks using a PicoPSU 12V adapter. Once the system accepts 12V, the UPS can be a car/motorcycle battery left inline as a buffer, so that in case of blackout there's no switching involved. Not as compact as the Helios64 UPS, but it works and is less risky than having two lithium cells near the drives. I didn't try this configuration, as my PicoPSU needed quite accurate 12V input, which the car batteries would exceed when fully charged; but their M3-ATX Automotive model accepts from 6 to 24 volts, making it ideal, as the name implies, for being supplied by a car battery.

Regarding the 5-bay storage, you can buy backplanes with pull-out bays for five 3.5" disks that would fit into three 5.25" bays. A search for "3 to 5 sata backplane" returns some products.


Doesn't exist AFAIK, unless something new was announced in the last couple of months that I missed - I researched this quite a lot a little while back.

Somewhat similar in scope, but without both of those:

https://wiretrustee.com/raspberrypi-cm4-sata-board/ (not launched yet)

https://shop.allnetchina.cn/collections/sata-hat/products/pe... (they also have an enclosure kit)

https://www.hardkernel.com/shop/odroid-hc4/ (just 2 drives; I have one as a sink for automated ZFS snapshots using zrepl; a friend is using one for syncthing+apple time machine sink)

If you want something within the coming ~2 years and none of the above fit the bill, I think you'll have to resort to x86 and an external UPS. I'd love to be proven wrong, though.


I have a somewhat similar project based around the Raspberry Pi. It's rather DIY compared to Helios64's project - 3D-printed case, hand-soldered electronics, Amazon sourcing - though the goals are similar. Specifically, my goals were to host 2x 3.5" drives and to be printable on a standard 200x200 mm 3D printer.

https://old.reddit.com/r/DataHoarder/comments/n277ip/raspber...


Yeah, that's a great idea for saving space, and it would be perfect if it automatically shut down when the battery gets low.


How is this different from the Origin header? Does the Origin header not tell the web server if the request originated from the same website? Is the Origin header flawed in some way?


Reading the documentation on MDN[1], it looks like it sends more data than just the Origin of the request. The metadata headers include whether the user initiated the request (e.g. navigation or click events?) and how the data is meant to be used (e.g. as audio data for <audio> or as a top-level document).

This spec seems really powerful, provided all browsers support it :)

[1]: https://developer.mozilla.org/en-US/docs/Web/HTTP/Headers#fe...


Firefox, Chrome, Edge and Opera support it (including mobile).

Internet Explorer is dead (ok, it's a zombie, but it was supper-seeded by Edge for most users).

Sadly, Safari does not yet support it.

The nice thing is that you can employ security enhancements based on this technique even if it's not supported by all your clients.

I.e. you can automatically reject requests if the headers are present and have a bad value, which adds protection against certain attacks for all users except the ones stuck on IE or Safari.
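
A rough sketch of that kind of check as Express-style middleware (TypeScript; the exact allow-list is up to your app, this just mirrors the common resource-isolation recipe):

    // Sketch: reject cross-site requests using the Sec-Fetch-Site header,
    // but let requests without the header through (old browsers omit it).
    import express from "express";

    const app = express();

    app.use((req, res, next) => {
      const site = req.get("Sec-Fetch-Site");
      if (site === undefined) return next(); // IE/old Safari: no header
      // Same-origin, same-site, and direct navigation ("none") are fine.
      if (["same-origin", "same-site", "none"].includes(site)) return next();
      res.status(403).send("cross-site request rejected");
    });

    app.listen(3000);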


Safari truly is IE in 2021


This is a story you often hear on HN, but I don't think it's correct. There were three correlated reasons for IE's bad reputation some years ago:

1. it was largely dominant, so people thought they could develop taking just that browser into consideration

2. because of the previous point, MS started to develop proprietary features (like ActiveX)

3. at a certain point its development was stopped for a long time

Safari certainly cannot match the first two reasons. But it cannot match the third either, because the development of standard web features is going on at a good pace (see <https://webkit.org/status/>).


Those might be the reasons why users disliked IE, but the reasons developers dislike IE were/are somewhat different:

1. It doesn't support many of the latest web standards

2. A large enough percentage of users use it that it can't be simply ignored

Both of those points apply to modern Safari. Less so to IE these days as #2 becomes less and less applicable; hence "Safari is the new IE".


Developers hated having to work around missing features in IE even when FF and Chrome took over the market. Safari is exactly the same, except you can't even update the rendering engine on iOS. Apple doesn't want web apps to eat away at App Store profit (notice how shitty and slow-moving the WebGL/WebGPU situation has been, mostly due to iOS Safari).


Counterpoints:

> Safari certainly cannot match the first two reasons.

1. Most users view websites on their phones. Safari is the only browser on the iPhone (there are other browser skins, but they're all forced to run on top of Safari). The market share of iOS devices is usually at least 50% in developed nations.

2. iOS has proprietary features; they're known as the App Store. If you want to develop certain things, you must use the App Store, because the browser is locked out of those features (even if all other browser vendors have them).

> But it cannot match the third either, because the development of standard web features is going on at good pace (see <https://webkit.org/status/>).

3. I probably don't need to go into this point, since it's common knowledge that Safari has always been the least compliant browser in terms of web standards. Their history of holding back features, or implementing features with critical flaws that make them useless, has been a recurring trend for the last decade. Just because they have checked a box in a table doesn't mean the feature is anything close to usable.


Right now you can reasonably develop a website that will work in Chrome and Firefox even without testing (not talking about any super modern features), but Safari is riddled with bugs you wouldn't expect. Recently I have encountered multiple bugs regarding SVG clipping in Safari. Safari 14 also broke localStorage and IndexedDB; it's almost funny how bad Safari is at just working.


My homepage is tiny and absolutely nothing fancy, but I've still managed to immediately run into at least two Chrome bugs.


Well, I found some Chrome bugs, but here's the thing: after I reported them, they were immediately responded to, fixed, and released in the next version. With Safari you have to wait a year for bug fixes to be released, if they even acknowledge you at all in their bug tracker; the only way to get the WebKit people's attention is to tag them on Twitter.


I don’t quite understand this argument. Can you give me a couple of examples of Safari holding back major parts of web design? Or is it more obscure stuff like some WebGL engine?

Because I use Safari specifically for privacy reasons, and it also used to never trigger my fans to full speed just to play videos, like Chrome does. I have also read that while Safari does tend to take longer, its implementations tend to be more polished. But this was more like a tweet, so take that anecdata with a grain of salt.


I often try to use a feature and it doesn't work properly on one browser. It's nearly always mobile Safari. This week, I've dealt with scroll-snap (which makes URL anchors work correctly with a sticky header) being supported, but only for some layouts (every other browser works).

Today I spent hours debugging why pages with a particular iframe embed would log you out of the parent site on Safari / iOS. Possibly because the same first-party resource was requested from both outside and inside the frame? Not sure yet.

If you attempt to use localStorage from a private tab in Safari, it reports that it's present and working, but raises an exception on any access (every other platform either does not expose localStorage in private tabs, or clears it afterwards).
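
The usual workaround is a write-then-remove probe in a try/catch, along these lines:

    // Sketch: merely checking that window.localStorage exists is not
    // enough on Safari private tabs, where any write throws. Probe it.
    function storageAvailable(): boolean {
      try {
        const probe = "__storage_probe__";
        window.localStorage.setItem(probe, probe);
        window.localStorage.removeItem(probe);
        return true;
      } catch {
        return false;
      }
    }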


The absence of push notifications makes it impossible to develop a lot of web apps.

Also, a more user-friendly way to install a desktop shortcut would help tremendously in making web apps more popular. Of course Apple is not interested in that, but it's still sad.



As a user, Safari is great. As a web developer, Safari is hell.


I did web development both in the bad IE days and recently, and IMO it wasn't as bad to develop for IE as it is for Safari today. Safari is broken in strange and random ways, is missing odd features, and is a moving target (and seems to break more with time). Developing for IE was extremely well documented (especially in later versions), and avoiding pitfalls was very easy, even for people new to creating web pages, using a few Google searches. Not so for Safari - unless you cut it completely off from all modern advances on the web. It just felt worse back then because IE was much more widespread.


Were you actually doing any serious web development in the mid-2000s? Because I have a feeling you wouldn't be saying this if you were.. :-)


I can see that came out wrong. I'll try to rephrase...


And even for the same reason: if the browser were too good, "no one" (very loosely defined here) would need to buy apps anymore. :(


Let's completely sidestep the whole debate that we always have. This is a safety feature, Safari will implement it, you can bet on it. It's merely going to be the last to do it.


And yet safari is the last major browser to implement it.


People who say this never had to develop for IE.


Yep, I don't know why my first thought was that malicious actors could just bypass this by using external HTTP clients (like curl) when in fact this spec is meant to augment CORS: browsers _will_ send these headers to the server and the server can choose to honour them or not (well, in the CORS case the browser will block the request if the response headers are incorrect).

It's defense in depth :D


> supper-seeded

I think you meant superseded (pronounced super-seeded).


Fascinating, TIL that superseded is correct and superceded is and has been wrong for four hundred years :)

https://www.merriam-webster.com/dictionary/supercede


It seems silly to me too, but re-reading https://developer.mozilla.org/en-US/docs/Web/HTTP/Headers/Or...

“There are some exceptions to the above rules; for example if a cross-origin GET or HEAD request is made in no-cors mode the Origin header will not be added.”


That's an interesting find, thanks. I was not aware of no-cors mode.

It seems though that a browser would not allow 'non-simple' headers in no-cors mode[0].

Authorization headers, for example, would not be allowed (if I'm reading correctly). So any API using that header would not be affected by this issue, right?

[0] https://developer.mozilla.org/en-US/docs/Web/API/Request/mod...


If all the parts of the site are in the same place, then checking the Origin header would probably do the same thing. This seems to be adding semantics for when the frontend is requesting data from a different backend, as well as for specific types of content, and whether the request was based on a user action.

The user-action part is very nice, if it can't be overridden with just JavaScript. For the other parts, I'm not sure what the browser is helping with that couldn't just be done with standard headers.


> Is the origin header flawed in some way?

tl;dr yes. It's not always sent.


This is true. Could we not disregard requests without an Origin header?

According to [0], we can force CORS behaviour by using a non-simple request in our web app, for example by setting the MIME type to JSON.

0: https://developer.mozilla.org/en-US/docs/Web/HTTP/CORS
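
A sketch of what that looks like client-side (the endpoint URL is a placeholder): the JSON content type makes the request non-simple, so the browser preflights it and always sends Origin.

    // Sketch: "Content-Type: application/json" is not a simple header
    // value, so the browser issues a CORS preflight and includes Origin.
    // The endpoint URL is a placeholder.
    const res = await fetch("https://api.example.com/things", {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify({ name: "example" }),
    });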


Be warned: EV code-signing certificates do not work as advertised. There is no instant reputation. My company tried this and it does not work.

I'm not sure what to do anymore. We are a small company with few customers. Slowly gaining reputation over time does not seem like a viable path.


Also a small company here. The EV cert worked for us. Are you having trouble with SmartScreen or AV products?


That is great news. Maybe there is a difference in provider? Or perhaps we weren't patient enough.

We have problems with SmartScreen.

Can you recommend a certificate provider?


We used DigiCert. There is a 50%-off link if you Google around for it.


Thanks!

