Personally I don't trust anything that I don't self-host. With Hetzner I chose bare metal instead of a VPS. I pay €40 for 4 TB (2x4 TB mirrored, to be precise) shared among a large number of services, including Minecraft servers for kids. The load is very low and the maintenance is a pleasure. Plus I love tinkering with and learning new things, so that side of me is fully satisfied. I've learned with Hetzner that every 3 years or so a hard drive dies, but they replace it within half an hour, so I just let the RAID rebuild itself and that's all.
Depends on what you're using it for. I set up a Matrix homeserver on a Hetzner box in Finland for primarily North American users. Latency hasn't been a perceptible problem for that.
It is a multi-faceted issue. In general, for really sensitive content, like nude photos, keys, etc., I'm very strict: it should always be stored in encrypted form and never, ever decrypted on a remote machine.
Other content that is private but not sensitive, like holiday photos, can be stored and decrypted remotely, and it's good practice to use encrypted partitions by default. I don't believe Hetzner would do any of the grey things that Google, Facebook and now Apple are doing (i.e. actively scanning your data for advertising and other purposes), but there is the practical problem of failed hard drives: it doesn't matter whether the drive breaks at your place or at a hosting provider, you end up with a broken drive you can't easily wipe your data from, while someone else might still be able to recover it. I estimate the probability is extremely low, but there is very little downside or effort required to encrypt data partitions nowadays, so I don't see why I shouldn't do it.
As for the interception of data in transit, nowadays everybody is using TLS for everything, so I don't think it's an issue.
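On the at-rest side, the "never decrypted on a remote machine" rule boils down to doing the crypto locally and only ever shipping ciphertext. Here's a minimal sketch of that idea using Python's cryptography library (Fernet); the file names are made up, and for whole partitions you'd normally reach for LUKS/dm-crypt rather than per-file encryption:

```python
# Sketch: encrypt locally so the remote box only ever stores ciphertext.
# Assumes the `cryptography` package; file names are hypothetical.
from cryptography.fernet import Fernet

key = Fernet.generate_key()        # keep this key on your own machine, never on the server
fernet = Fernet(key)

with open("passport-scan.jpg", "rb") as f:
    ciphertext = fernet.encrypt(f.read())

with open("passport-scan.jpg.enc", "wb") as f:
    f.write(ciphertext)            # only this .enc file gets rsynced/uploaded

# Decryption also happens locally, never on the remote host:
plaintext = fernet.decrypt(ciphertext)
```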
Previously I used various projects for these things, like Coquelicot as a WeTransfer alternative, but practically speaking Nextcloud is so easy to use (and relatively stable now) that I now use it for most things that need to be shared with others:
* it's intuitive to use, so people who haven't seen it before can quickly download what you send them without having to log in, be tracked, and so on
* password protection and expiry dates for shared links come in handy (a rough API sketch follows after this list)
* superfast search is a boon; I have so many files that it's probably the most important feature for me
* the photos app in Next/ownCloud underperforms if you have a large number of photos in a folder: the thumbnails seem to be regenerated each time I open the folder, which is probably a bug
* when you need to collaborate, it's super easy to add users, and they intuitively know how to do things because they're used to Dropbox etc.
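For the password-protected, expiring links mentioned above, here's a rough sketch of scripting them against Nextcloud's OCS files_sharing API; the server URL, credentials and path are placeholders, and field names can differ slightly between Nextcloud versions:

```python
# Sketch: create a public share link with a password and expiry date
# via Nextcloud's OCS files_sharing API. URL, user and app password are placeholders.
import requests

NEXTCLOUD = "https://cloud.example.com"
AUTH = ("alice", "app-password-here")

resp = requests.post(
    f"{NEXTCLOUD}/ocs/v2.php/apps/files_sharing/api/v1/shares",
    auth=AUTH,
    headers={"OCS-APIRequest": "true"},
    params={"format": "json"},
    data={
        "path": "/Photos/holiday-2021",
        "shareType": 3,               # 3 = public link
        "password": "correct horse",
        "expireDate": "2021-12-31",
    },
)
resp.raise_for_status()
print(resp.json()["ocs"]["data"]["url"])   # the link you hand to the recipient
```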
I remember meeting Kim Dotcom's lawyer a few years ago in SF. A very flamboyant person, one of those guys who makes a big impression on you at first sight.
While Kim Dotcom is an impressive hacker, I am not sure I would love to be the target of feds around the world, as he still is in the eyes of the US.
Fun story about MegaUpload.
I once lived in Argentina back in the MegaUpload days. At the time, piracy was the norm (not only in Argentina), the government didn't care, and people were selling pirated, burned DVDs on the streets. This was downtown, a heavily trafficked area.
Then MegaUpload started spreading like wildfire, and I remember that starting at 4 pm the internet would get awfully slow, as people got off work to download the latest movie or episode. Then came PopcornTime, and things got even worse. Can't find the stat, but I remember something along the lines of 60% of Buenos Aires traffic being MegaUpload's at peak time (4-10 pm), which caused a lot of controversy at the time.
Mega and Stingle are the only options I'm aware of that are "truly" private; every other major cloud storage provider will scan your files for CSAM. The best way forward for truly privacy-conscious people would be to roll their own NextCloud instance, because any public service that lets you store E2EE images will get hunted by the government for allowing CSAM once it reaches any significant user base. Case in point: even good ol' MegaUpload scanned for CSAM, because that's a bigger risk than getting sued by the MPAA.
Dumb question, but would an S3 bucket be scanned for CSAM?
Ehh, I don't know, it seems like a dumb concern / a losing battle. My thing is IP. I picked up this e-ink tablet, for example, and it syncs to their cloud service. You can turn that off, but still... Ahh. It just feels like going in circles: the ISP knows your content, so you get a VPN, but is your device actually secure, etc... Do you even have anything to hide anyway?
I'm probably just paranoid, ha. I question using Trello and putting "new IP" into it, which they say is encrypted at rest, so yeah. Gmail too; basically everything goes through that.
Anyway, I'm of average intelligence, not developing cold fusion or something in my spare time, so I don't really have IP anyway.
> Dumb question, but would an S3 bucket be scanned for CSAM?
Maybe, but if you encrypt end-to-end it can't detect anything anyway. (Please don't take this as an endorsement of child porn.)
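To make the S3 point concrete: if the encryption happens client-side, whatever scanning might happen server-side only ever sees ciphertext. A hedged sketch with boto3 plus the same Fernet pattern as the earlier sketch; the bucket name and file are made up, and the key obviously has to live somewhere other than the bucket:

```python
# Sketch: encrypt client-side, then upload only ciphertext to S3.
# Any server-side scanning only ever sees random-looking bytes.
# Bucket name and paths are hypothetical; assumes boto3 and cryptography are installed.
import boto3
from cryptography.fernet import Fernet

key = Fernet.generate_key()                    # store this somewhere that is not S3
fernet = Fernet(key)

with open("album.zip", "rb") as f:
    ciphertext = fernet.encrypt(f.read())

s3 = boto3.client("s3")
s3.put_object(Bucket="my-private-bucket", Key="album.zip.enc", Body=ciphertext)
```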
NextCloud has preliminary E2EE support built in, but not for photos - yet.
> I have always used Mega[0]. It's real end-to-end encryption. I would argue it also has a superior user experience compared to all the others.
I'm still waiting for it to suffer the same fate as MegaUpload. That reputation has tarnished the service. I don't care that everything's encrypted with Mega: they could still sneak in backdoors or skeleton keys. Plus it needs JavaScript to work, which is a privacy nightmare.
Thanks for mentioning this! It turns out that I have an account from 2014 that has 50 GB instead of the 20 GB offered now in the free tier. To those interested in very basic photo viewing: the Mega apps are more than capable of sorting by the actual date the photos were taken, which is all I was really interested in. The web interface doesn't seem to have this at a glance.
except if the key for the encrypted-at-rest drive is never used for encrypting/decrypting on the same CPU the drive is attached to (that is, the crypto is done remotely, on your own machine)
I have been using (the open source) Syncthing for both personal things and business for many years. You don't need to set up a central server; you just share folders directly with other clients.
If you do set up a server, however, you can centralize your data and easily run all the backup jobs from it.
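For the "run all the backup jobs from the central node" part, it can be as simple as a scheduled archive of the synced folders. A minimal stdlib-only sketch, with the Syncthing data path and backup target being hypothetical:

```python
# Sketch: nightly backup job on the central Syncthing node.
# Paths are hypothetical; run it from cron or a systemd timer.
import tarfile
import time
from pathlib import Path

SYNC_ROOT = Path("/srv/syncthing")          # where the shared folders land on the server
BACKUP_DIR = Path("/backup/syncthing")      # e.g. a separate disk or mount

BACKUP_DIR.mkdir(parents=True, exist_ok=True)
stamp = time.strftime("%Y-%m-%d")
archive = BACKUP_DIR / f"syncthing-{stamp}.tar.gz"

with tarfile.open(archive, "w:gz") as tar:
    for folder in SYNC_ROOT.iterdir():
        if folder.is_dir():
            tar.add(folder, arcname=folder.name)

print(f"wrote {archive}")
```

In practice you'd probably reach for something with deduplication (borg, restic, etc.), but the point stands: the central node sees everything, so one job covers it all.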
I've been self-hosting this for a few months; it's really good. I was disappointed at first when I realized it didn't have native apps, so uploading new photos was a sub-par experience. But then I found the app PhotoSync, which has Photoprism integration. Would be nice to have it all in one, but I'm happy.
I am also eagerly waiting for Photoprism to hit stability and add people recognition so I can mostly move away from iCloud. They have an issue for people recognition that has been worked on since 2018 :-)
You want an end-to-end encrypted cloud provider like ProtonDrive or Sync.com. You can also consider using rclone or Cryptomator to encrypt data before adding it to a consumer cloud sync solution. All come with the usual caveats (do you really trust Sync.com to withstand a state actor?) and user experience compromises.
Is there a noticeable performance difference or loss of functionality? I did some encrypted-at-rest text stuff before, and I remember that to do a basic SQL SELECT you have to decrypt everything (say, a column) first before you can search it. So there is added overhead... Maybe it doesn't matter with cloud processing or something.
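For what it's worth, the overhead you're describing looks roughly like this: with a client-side-encrypted column the database can't evaluate the WHERE clause, so every row comes back and gets decrypted before you can filter. A toy sketch with SQLite and Fernet (the table and column names are made up):

```python
# Sketch of the overhead being described: searching a client-side-encrypted
# column means decrypting every row before you can filter it.
# Table/column names are made up; assumes the `cryptography` package.
import sqlite3
from cryptography.fernet import Fernet

fernet = Fernet(Fernet.generate_key())

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE notes (id INTEGER PRIMARY KEY, body BLOB)")
for text in ["project ideas", "new IP sketch", "grocery list"]:
    db.execute("INSERT INTO notes (body) VALUES (?)", (fernet.encrypt(text.encode()),))

# No `WHERE body LIKE '%IP%'` is possible: the database only sees ciphertext,
# so every row has to come back to the client and be decrypted first.
matches = [
    row_id
    for row_id, blob in db.execute("SELECT id, body FROM notes")
    if "IP" in fernet.decrypt(blob).decode()
]
print(matches)
```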
I similarly use Tresorit (https://tresorit.com/) and share photo albums but these are general cloud storage as opposed to being optimized for viewing photos in a browser.
I gave Stingle a cursory try, but I was immediately put off by the fact that it was not even able to retrieve EXIF data from images and use that as a basis for sorting. The handful of images I uploaded as a test were all dated to the date I uploaded them, as opposed to when they were actually taken, and there was no way to change that. There was also no way to view the EXIF data.
Thanks for the link - I signed up for a free 2 GB Nextcloud with one of their partners. Experimenting with non-iCloud hosted notes now. It's a bumpy ride, lots of room for improvement.
Nothing comes close to iCloud Photos for me, because everyone else is using an iPhone, which makes iCloud shared photo albums the easiest way for older people to interact.
I've been looking strongly at Plex's photo support. The images are self-hosted, but the service acts like a DDNS, so you can easily get to it. I run our current Plex system off our NAS, but I have been thinking about moving it to a dedicated RasPi.
I like Mylio[0] - it's local-only but syncs P2P-style across the local network. Or obviously you can sync folders with Whatever Online Service you choose, Mega or whatever.
Yep. More precisely, Moments [0], which is great for photos and videos. But the Synology app ecosystem as a whole isn't bad at all. Just remember to buy a NAS with an x86 processor so you can also take advantage of Docker-based apps. I believe the DS220+ is a good device to start with if your budget is around $300.
I get that CSAM is real and is hurtful. So are many other things: slavery, murder, etc etc.
But is the real-world incidence of CSAM so high that every photo on every phone that is shared with another human needs to be scanned for it? It seems like there is no tradeoff too big to accept in the quest to eradicate CSAM.
Are the resources spent really proportional to the threat?
Apple implemented a system that sent information identifying which applications you were opening to their servers over an insecure protocol. That shows everyone how much Apple cares about privacy, or how capable it is of implementing it.
Emails revealed during the Apple vs Epic trial showed that App Store apps were infected; Apple then prepared an email to inform users, but Timmy decided that informing users about possible security or privacy problems is bad for PR. We need to keep reminding and informing people about the reality of Apple, not the myth.
Yes, I know you use Apple because Google's shit is probably worse, but Apple's shit is still shit, Apple's hypocrisy is still hypocrisy, and Apple's greed is still greed.
Google Photos has started metering usage: only 15 GB free, which will fill up within a year for most of us with recent phones and camera-auto-sync-to-Google-Photos enabled.
I would be classified as a pedo if my pictures were on iCloud... even though some of the naked-kid pictures are of me... I also have pictures of my kids playing in the bath.
Fortunately that’s not how it works. You would have to have a whole collection of known and identified CSAM to be classified as anything at all with the system Apple has announced.
No, their AI accepts some deviation. And we don't know how many false positives it takes to trigger a human review. And for privacy, of course, different people will evaluate different photos. And a human without context will flag a picture of a kid in the bath as child pornography. It is a very real possibility that OP gets made a suspect because of this.
Even if we accept that your image of your kids in the bath will match the hash of a known and identified CSAM picture (a real stretch), under this system the voucher payload does not contain the private key to decrypt the picture, so nobody will be able to evaluate a photo or flag it as anything. Humans have access to a “visual derivative” based on the perceptual mechanism used to create the hash, but not to the actual photo.
Other companies have CSAM scanning and reporting with much fewer safeguards and the “my kid in the bathtub” scenario hasn’t seemed to actually be a problem.
No? The scanner only checks for matches against images in NCMEC's database, so you can take as many pictures of your kids as you want; it won't trip the scanner.
Considering it's a perceptual hash match, some difference is accepted. Such confidence in the infallibility of these methods usually precedes the method being used to jail some poor innocent bastard. After all, if he wasn't guilty, it wouldn't have tripped the scanner.
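For anyone unfamiliar with what "some difference is accepted" means in practice, this is roughly how generic perceptual hashing behaves (this sketch uses the imagehash library's average hash, not Apple's NeuralHash, and the threshold is made up): two images whose hashes are within a few bits of each other are treated as the same picture, which is exactly where false positives come from.

```python
# Sketch: generic perceptual hashing (NOT Apple's NeuralHash).
# Two images whose hashes differ by only a few bits are treated as "the same".
# Image file names and the threshold are made up.
from PIL import Image
import imagehash

THRESHOLD = 6                          # max Hamming distance we'll call a match

h1 = imagehash.average_hash(Image.open("photo_a.jpg"))
h2 = imagehash.average_hash(Image.open("photo_b.jpg"))

distance = h1 - h2                     # imagehash overloads `-` as Hamming distance
print(f"distance={distance}, match={distance <= THRESHOLD}")
```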
You're not going to go to jail just for tripping a scanner, but a judge will authorize a search warrant based on it. If you trip the scanner, the police will pay you a visit with a search warrant in hand and confiscate all your electronic devices. Come to think of it, that's going to be interesting in the new world of work from home, because those devices are going to include your employer's. It'll be interesting to see how all that works out.
NCMEC's database is constantly growing; Facebook reported 20 million instances over the past year. Say John Q Pedophile has a collection, and his collection includes a bunch of pictures that have been reported elsewhere, mixed in with the original stuff he's producing. If you detect those images, you have probable cause to bring him into custody without completely infringing on someone else's ability to take pictures of their kids on the beach.
Dropbox does automatic photo backups on Android and iOS. It also allows you to share files and folders, though I'm not sure how it compares to Apple Photos for "boomers"
Checking for CSAM isn't the issue. The issue is that Apple's system design easily extends to checking all sorts of other things - political dissidents, journalists, etc.
Not only that, it commits a felony when it transmits found CSAM to a party other than NCMEC (i.e. Apple itself).
It does. However people have a greater expectation of privacy on a device they own than on a cloud-based solution where they have voluntarily uploaded content.
> For the conspiracy to work, it'd need Apple, NCMEC and DOJ working together to pull it off voluntarily and it to never leak. If that's your threat model, OK, but that's a huge conspiracy with enormous risk to all participants.
My threat model right now is to trust no one except (1) people I know personally, (2) people they know and trust personally, (3) people who have proved their reputation for integrity publicly, and (4) well-designed systems built by either (1), (2), or (3).
I know someone at Apple who knows their head of privacy... so #2 may be in question, given the design of this system and its capability of further compromising the privacy of millions of Chinese citizens at the Chinese government's whim (or that of any other strongly authoritarian government).
A few years ago, the hysterical nerd privacy crowd was clutching pearls and waving hands because Condoleezza Rice was going to turn your Dropbox files over to the NSA.
“IANAL” is used to avoid liability for offering legal advice in settings where it’s not permitted. It’s not used to hand wave away that you don’t know what you’re talking about.
The very same law you’re citing describes, in detail, the good faith diligence process that requires the service provider to verify the suspected material before transmission to NCMEC. But no, some random blog and you, you two have a handle on legal analysis that the most litigiously sensitive entity on Earth must have missed while designing one of the most litigiously sensitive systems ever fielded by humans. How’d they miss felony criminal liability, right? It’s just too easy to overlook while designing a system whose sole purpose is to gather legally actionable evidence against other people.
As someone who’s built these systems for over a decade, it’s remarkable how one Apple press release can make everyone so hopelessly uninformed and confident that they know the score. Nobody used the acronym CSAM until a week ago except for people like me, those of us haunted by (actual) nightmares of this shit while HN distantly pontificates on the apparent sacrilege of MD5ing your photos of a family vacation to Barbados to see if you happen to be sharing images of children being raped.
Nobody commenting on this has ever seen child pornography. I’d take that to the bank. Did you know the organized outfits design well-polished sites like PornHub, complete with React and a design palette? 35 thumbnails of different seven year olds right on the front page, filterable by sexual situation. Filterable by how many adults are involved. With a comment section, even!, and God help you if you even begin to imagine what is said there. You’re right, though, let’s think about your privacy and the criminal liability for Apple for taking action on something that clearly doesn’t matter to anyone except those stuck with dealing with it.
Get real. Sometimes the lack of perspective among otherwise smart people really worries me, and this conversation about Apple’s motives for the last week or so has worried me the most yet.
> “IANAL” is used to avoid liability for offering legal advice in settings where it’s not permitted. It’s not used to hand wave away that you don’t know what you’re talking about.
It's exactly and only the latter, actually. Consider how useful an obscure-to-normies Web-forum acronym consisting mostly of "anal" is going to be at deflecting liability—should such even be possible—if it comes up in court. Not a bit, right? So how could it be intended for that? If that were the purpose, people would write out the words.
People should really stop using this “conspiracy theory” as a reason for Apple to not scan for CSAM in a privacy-preserving fashion. There are way too many “hot takes” that don’t take into account any legal ramifications of their “what if” scenarios.
I appreciate Apple’s design here, and I think that there’s an overreaction to it.
This is probably the best design we have so far for something that everyone else is already doing, and I give Apple credit for going to greater lengths to preserve privacy.
But the “just trust us, we only want to do good things and we will be ruined if you ever catch us doing bad things” rationale doesn't help. In fact, it sends me right back into protest every time I see it posted.
There will not be legal ramifications for “what if” scenarios. Not enough to prevent abuse.
Especially if these would be the same (weak) legal ramifications that prevent people from being wrongfully accused of murder, arrested for peaceful protest, or having their bank accounts frozen on baseless suspicion of fraud or terrorism.
From the same government that has treated legitimate political beliefs and entire religions as terrorism.
If the core defense is that I should just trust that NCMEC exists for a single purpose, will never be manipulated or expand outside that purpose, and is completely uninterested in carrying out any other agenda, then that defense has already lost.
Because that exact scenario has already occurred with other government agencies.
And suggesting that NCMEC is somehow at such a disadvantage in power that Apple has a real choice to say “no” (and that Apple will do so at even the slightest hint of impropriety), and that this alone would bring NCMEC and all of the good work they do crashing down?
I bought this reasoning fully with the Patriot Act and warrantless wiretaps. I had no doubt these things were being used to do good and prevent deaths, and I’m sure they have.
But I have also since seen enough to know that the short-term good was paid for by the long-term ruining of innocent lives, and by racial profiling that continues to this day.
I’m not interested in supporting that again.
I'm good with “Apple is trying to steer this in a more privacy-respecting direction” and “this introduces a new avenue in which NCMEC gets more checks and balances by having third parties double-check their work”.
I’m saying this as someone who is torn about this issue.
But the “separate government organization that will absolutely not bend to pressure and will suffer legal consequences if they do, JUST TRUST US KTHX” reasoning already has such a strong precedent of being proven false that it only works against your case.
And the manual review would catch other types of photographs.
And the law clearly states that that doesn't apply if an immediate effort is made to involve law enforcement. Besides, Apple is not transmitting it to Apple; the user is.
And in the same way a human resembles a butterfly. Didn't you learn from the case where algorithms matched white noise to copyrighted material? There are many iPhones and many images on each iPhone; there will be false matches. Add on top of that that the database of hashes is secret, that it can be updated in secret, that the algorithm is secret, and that the "threshold" is also secret, and you have a lot of suspicion from people.
No, not in the way a human resembles a butterfly. But for the sake of argument, fine: some pictures of you look exactly like pictures of a butterfly, enough to trigger the flagging threshold. These false positives would easily be caught during the review and nothing would happen.
I do not trust the review people. They are not even Apple employees; they are cheap contractors who hire cheap labor. These people are treated like crap, so I can see them making mistakes. It's not like you've never heard of one Apple reviewer deciding X and, when another reviewer checked, deciding !X.