Ogg wrote to All <=-
What say you about Apple's plan for your phones?
Oli wrote to Avon <=-
I know why I never 'owned' an iPhone. Apple is a control freak. But
people (especially in the US) embrace surveillance and don't care that much about privacy. Even more so when it's to 'protect' their children.
What say you about Apple's plan for your phones?
As I always say, whenever there's any sort of a backdoor
crackers will find it and exploit it. Not good.
Ogg wrote to N1uro <=-
I see this as a golden opportunity for the unscrupulous to
sabotage other people (politicians, dissidents, competitors,
etc) by injecting suspect photos onto victim's phones. And
when this "false positive" is detected, the victim's account/
phone is suspended until they appeal. Even that process seems
backwards. Why should "the victim" appeal when it hasn't even
been determined that the photos were genuinely acquired in the
first place?
The system assumes that if such photos exist on your phone,
then you are immediately guilty. You can be sure that enemies
will try to sabotage one another this way.
The analysis system claims that it will assign a hash for the
porn photos. I can't imagine what kind of human it takes to
even be involved in this project at Apple. They would have to
witness thousands of photos to confirm results.
Who would supply the photos? I would suspect that the CIA or
NSA are heavily involved in the project.
What would stop a 3rd-party app "injecting" hashes onto
unsuspecting phones?
Anyway.. I see a lot that can go wrong with the implementation.
Anyway.. I see a lot that can go wrong with the implementation.
As do I... not to mention countries where this idea is illegal in nature. Just not a good idea at all.
Exploitation of children goes beyond sexual events... it also includes such things as what Apple is claiming. Sad state of society we're in.
It's a worldwide database of known child pictures. From what I
read in the article they don't actually /have/ the pictures, they
have some math doodad that can detect the pictures.
I read about it on engadget when they reported the new feature.
Shawn
My guess is they have hash #s on previously existing pics they have captured from pizzagate computers and such. They probably keep the hash #s in the database and run the hash on the new pics they find and compare against the existing hash #s. Just a guess.
Thanks
- Gamecube Buddy
telnet --<{bbs.hive32.com:23333}>--
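The guess above - comparing exact hashes of known files - can be sketched in a few lines. The catch with a plain cryptographic hash is that changing even one byte of the image (recompression, resizing, a stripped metadata tag) produces a completely different digest, so an exact-match scanner would be easy to defeat. A minimal illustration; the "image" bytes here are made up:

```python
import hashlib

# Hypothetical known-image bytes and a blacklist of their SHA-256 digests.
known_image = b"\x89PNG...fake image data for illustration..."
blacklist = {hashlib.sha256(known_image).hexdigest()}

def is_blacklisted(data: bytes) -> bool:
    """Exact-match check: only flags byte-identical files."""
    return hashlib.sha256(data).hexdigest() in blacklist

print(is_blacklisted(known_image))      # True: identical bytes match

# Flip a single byte (e.g. a re-encoding artifact) and the match is lost.
tweaked = bytearray(known_image)
tweaked[-1] ^= 0x01
print(is_blacklisted(bytes(tweaked)))   # False: the digest changed completely
```

That brittleness is presumably why a scanner built on raw file hashes would be, as the reply below puts it, very weak.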
[...] they probably keep the hash #s in the database and
run the hash on the new pics they find and compare against
the existing hash #s. Just a guess.
If they are serious about pinpointing known bad pictures, I
suppose what they do is to use high level scanning for
analyzing properties of the picture and generate a hash out
of those properties, then compare such hash to a database
of known blacklisted pictures.
However, if that is the case, you increase the chances of
getting false positives. I suppose the idea is that they
intend a human operator to review any matching or
suspicious picture, but that is still not pretty, because
it means Apple is setting their systems up to have your
phone contents reviewed by humans.
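The property-based hashing described above is roughly what perceptual hashing does: derive a short fingerprint from coarse image features so that near-duplicates land close together, then flag anything within a small Hamming distance of a blacklisted fingerprint. A toy sketch using an "average hash" over an 8x8 grayscale grid - all names and thresholds here are illustrative, and Apple's reported system (NeuralHash) uses learned features rather than anything this simple:

```python
def average_hash(pixels):
    """64-bit fingerprint of an 8x8 grayscale image (list of 64 ints, 0-255):
    each bit records whether that pixel is brighter than the image mean."""
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

def hamming(a, b):
    """Number of differing bits between two fingerprints."""
    return bin(a ^ b).count("1")

# A blacklisted image and a slightly altered copy (mild brightness shift,
# as re-encoding might produce).
original = [10 * (i % 16) for i in range(64)]
near_copy = [min(255, p + 3) for p in original]
blacklist = {average_hash(original)}

def matches(pixels, max_dist=5):
    h = average_hash(pixels)
    return any(hamming(h, b) <= max_dist for b in blacklist)

print(matches(near_copy))   # True: the match survives small edits
```

The tolerance is exactly the trade-off raised above: a looser `max_dist` catches more re-encoded copies but also raises the odds of a false positive on an unrelated picture.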
Re: re: Apple will start scanning
By: gcubebuddy to Tiny on Mon Aug 09 2021 02:52 pm
If they use raw hashes for comparison, I think the scanner will be very weak.
I have a new slogan for Apple: "Purchase if Masochist."
As I always say, whenever there's any sort of a backdoor crackers will find it and exploit it.
All that processing or transfers to the mother ship is going to
bog down people's phones.
Adept wrote to Ogg <=-
My understanding is that they're only scanning photos uploaded to the automatic photo uploading service they have, and that is turned on by default.
So the additional data and processing are probably minimal.
Arelor wrote to N1uro <=-
Do you know what they say of blackhat conferences? They say that you should never show up with a smartphone, because somebody will exploit
it and fill it with gay porn for fun.
The problem with automated remote scanners is that somebody may decide
to plant an image that will trigger an alarm and get you screwed. Apple itself has root on the device, for starters, but there are lots of ways
to inject unwanted files in a phone.
Ogg wrote to Arelor <=-
All that processing or transfers to the mother ship is going to
bog down people's phones.
Utopian Galt wrote to Tiny <=-
People are afraid that China, Russia or even our own state actors (USA/CAN/UK etc) will use it against us.
Tiny wrote to Adept <=-
Adept wrote to Ogg <=-
I don't use the icloud feature for anything. I have it fully disabled. Not because I don't trust it, but because I don't have anything on
my phone that needs to be backed up. If I take a picture it's mostly
to remind me to do something, or I saw something funny.
Ogg wrote to Arelor <=-
All that processing or transfers to the mother ship is going to
bog down people's phones.
and data ingestion will be an unfunded mandate for the user.
"I signed up for the lowest 1gb data plan, and half of that is used to monitor my camera roll..."
... No ceremonies are necessary.
poindexter FORTRAN wrote to Tiny <=-
Yeah, I go through my photos directory looking for artistic photos to
put on my web site and have to sort through dozens of photos of serial numbers, BIOS configs, cable rack photos, and shipping labels.
Arelor wrote to poindexter FORTRAN <=-
People already pay for the bandwidth used to serve them more ads than content. I think that is not a problem for the consumer, since they are already used to third parties raping their data plans in the butt until the data plan cannot sit anymore.
Why am I thinking of the story arc in "Silicon Valley", when a proof-of-concept app to search for food by photo could only determine "hot dog" or "not hot dog", and he ended up selling it to snapchat to help filter out dick pics?
Then again, I grew up on an internet where people kept open SMTP servers as a community service and remember seeing the first SPAM on usenet. How far we've come.
poindexter wrote (2021-08-13):
Then again, I grew up on an internet where people kept open SMTP servers as a community service and remember seeing the first SPAM on usenet. How far we've come.
I wonder what the next level of inefficient / bloated / annoying / useless will look like.
A couple of articles, the first is mostly a reference to the second,
along with some additional insights:
https://www.reuters.com/technology/exclusive-apples-child-protection-features-spark-concern-within-its-own-ranks-2021-08-12/
"Apple employees have flooded an Apple internal Slack channel with more
than 800 messages on the plan announced a week ago, workers who asked
not to be identified told Reuters. Many expressed worries that the
feature could be exploited by repressive governments looking to find
other material for censorship or arrests, according to workers who saw
the days-long thread.
"Past security changes at Apple have also prompted concern among
employees, but the volume and duration of the new debate is surprising,
the workers said. Some posters worried that Apple is damaging its
leading reputation for protecting privacy."
https://www.osnews.com/story/133821/apples-child-protection-features-spark-concern-within-its-own-ranks/
"It's a complete 180 from Apple's behaviour and statements (in
western markets) of course employees are going to be worried. I've
been warning for years that Apple's position on privacy was nothing
more than a marketing ploy, and now Apple employees, too, get a taste of their own medicine that they've been selling in China and various
other totalitarian regimes."
Considering past reports regarding Apple and Google assisting the
Chinese government in their spying on their own people, I am surprised
that the Apple employees are just catching onto the idea that their tech "could be exploited by repressive governments" for nefarious means.
Maybe they just didn't believe the reports before now?
My understanding is that they're only scanning photos
uploaded to the automatic photo uploading service they
have, and that is turned on by default.
So the additional data and processing are probably minimal.
If that is the case, is it really any less bad? It still seems
unethical, especially when their algorithm could produce a false positive.
The only good apple is the apple my horses like.
Oli wrote to poindexter FORTRAN <=-
I wonder what the next level of inefficient / bloated / annoying /
useless will look like.
Arelor wrote to Blue White <=-
The only good apple is the apple my horses like.