A lot of Siri requests are processed locally, to be fair. And iCloud has encryption now.
both backdoored
How is the local stuff backdoored?
closed source operating system. on mobile data there is no way to reliably view or block requests to Apple servers (e.g. with a firewall or Wireshark). and that’s even assuming they don’t have some sneaky hardware-level spyware like Intel ME…
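fwiw, on a wi-fi network you control you *can* watch this stuff, which is exactly what you lose on mobile data. a rough sketch of the idea (assumes the phone’s traffic is routed through a linux box you run with scapy installed; the interface name and domain list are just placeholders):

```python
# rough sketch: log DNS lookups for Apple domains from a gateway box that the
# phone's wi-fi traffic passes through. interface name and domain list are
# illustrative placeholders; requires root and scapy.
from scapy.all import sniff, DNSQR, IP

APPLE_HINTS = (b"apple.com", b"icloud.com", b"mzstatic.com")

def log_query(pkt):
    if pkt.haslayer(DNSQR) and pkt.haslayer(IP):
        qname = pkt[DNSQR].qname
        if any(hint in qname for hint in APPLE_HINTS):
            print(pkt[IP].src, "->", qname.decode(errors="replace"))

# the BPF filter keeps the capture to DNS traffic only
sniff(iface="eth0", filter="udp port 53", prn=log_query, store=False)
```

and even then you only see where traffic goes, not what’s inside it.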
Jailbreaks exist and allow stuff like wireshark to run on iOS.
Lol.
https://support.apple.com/guide/security/advanced-data-protection-for-icloud-sec973254c5f/web
e2ee is almost meaningless on a closed source app or system… what’s stopping them from sending a copy of your files unencrypted?
Well, for starters, it would most likely show up in the network traffic if they were doing that. And no one doing security analysis on iOS has ever mentioned it, AFAIK. And since Apple bases about 90% of their marketing on protecting your privacy, getting caught would be very bad for them as a company.
I mean, what’s stopping someone from poisoning a library in open source? That has provably happened.
Which is not to ding open source, which I quite like too. Just saying you’re running certain risks no matter what you choose, and for a phone OS, if you just want it to work and not think about it, I personally feel Apple is still a decent risk.
LOL “We have access to all your data and use it to target ads and any goddamn thing else we want to do, but we don’t sell it to third parties, we just take the third parties’ wishes and shove them down customers’ throats ourselves! It’s not much better, but it is better.”
you think Apple wouldn’t abuse customer data just because of its brand image? that’s awfully trusting of a company which has been proven to scan ‘private’ iCloud images. most of their customers either 1) don’t care, 2) will believe it’s somehow justified, or 3) will forget soon enough
the great thing about open source is that people can audit it. and for a big project like Android (AOSP, GrapheneOS, etc – as a parallel), people will. any new commits will be analyzed by maintainers, and even an outsider can skim exactly what changed (rough sketch at the end of this post). of course slipping something malicious in isn’t impossible, but it’s a lot less likely than with anything closed source, where developers are forbidden to disclose any details to the public.
but if you’re willing to use Siri and iCloud despite the privacy concerns, that is fine; every solution is a compromise.
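the sketch mentioned above: a minimal way for anyone to skim what changed between two releases of an open-source tree they’ve cloned (the repo path, tag names, and paths of interest are hypothetical placeholders):

```python
# minimal sketch of "anyone can audit it": show what changed between two
# releases of a cloned open-source tree. repo path, tags, and the paths of
# interest are hypothetical placeholders.
import subprocess

REPO = "/path/to/aosp-or-grapheneos-checkout"
OLD, NEW = "v1.0", "v1.1"

# file-level summary of every change between the two tags
subprocess.run(["git", "-C", REPO, "diff", "--stat", f"{OLD}..{NEW}"], check=True)

# full diff restricted to the code you care about, e.g. networking and crypto
subprocess.run(["git", "-C", REPO, "diff", f"{OLD}..{NEW}", "--", "net/", "crypto/"],
               check=True)
```

nothing fancy, but with closed source you don’t even get this much.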
*blink blink*
Scan private iCloud images? What part of the E2E did you miss? Also, if this is the plan I think you’re talking about for CSAM, they actually abandoned that, even though it was a pretty decent plan…
so because they say that they won’t scan your images, you just trust them? the fact that Apple had planned to is evidence enough that they could, and possibly do. again, there is no way to prove that they don’t.
do you understand what I’m saying when I say “e2ee is almost meaningless on a closed source app”? you are taking their word on whether they know your private key, or whether they even encrypt your data at all. to encrypt a file properly, use a local open-source program (GPG) before ever letting Apple touch it (rough sketch at the end of this post).
btw, have you heard of the case where a person’s picture was flagged as CSAM when it was sent to their kid’s doctor during lockdown? these filters are not perfect, and can ruin someone’s reputation. any pedophile with even a glint of common sense would avoid proprietary spyware (iCloud) anyway, or at the very least encrypt manually.
again, your privacy is being eroded in the name of “saving the children”.
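the sketch mentioned above: a rough example of encrypting a file locally with gpg before it ever reaches a synced folder (the file names are just illustrative; gpg prompts for the passphrase):

```python
# rough sketch: encrypt locally with gpg before a file ever touches iCloud
# (or any other synced folder). file names are hypothetical placeholders.
import subprocess

plaintext = "taxes-2023.pdf"
ciphertext = plaintext + ".gpg"

# symmetric AES-256 encryption; gpg prompts for a passphrase
subprocess.run(
    ["gpg", "--symmetric", "--cipher-algo", "AES256",
     "--output", ciphertext, plaintext],
    check=True,
)

# only the .gpg file gets copied into the synced folder; decrypt later with:
#   gpg --output taxes-2023.pdf --decrypt taxes-2023.pdf.gpg
```

the point being that the key material never leaves your machine, so it doesn’t matter much what the sync service does with the blob.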
Everything you’ve said aside from the CSAM-scan doctor case has absolutely nothing to back it up so far. (And for the record, I absolutely agree CSAM scanners can be wrong; a human needs to be involved at some level, which they were in the system Apple devised.) At any rate, I guess this convo is over, as we obviously inhabit very different worlds.