i don’t know why you worded your comment like we are in disagreement haha
samsung is forcibly inserting themselves into the chain of custody. in a world where cell phone video is finally spotlighting existing police brutality, the idea that my evidence could get thrown out because of some MBA’s idea of what constitutes a “real picture” is nightmarish.
It’s not an MBA thing, it’s a technological progress thing.
We’ve gone from photos full of “ghosts” (double exposures, light leaks, poor processing), to photos that took some skill to modify while developing them, to celluloid that could be spliced but took a lot of effort to edit, to Photoshop and video editing software that allowed compositing all sorts of stuff… and we’re entering an age where everyone will be able to record some cell phone footage, then tell the AI to “remove the stop sign”, “remove the gun from the hand of the guy getting chased, then add one to the cop chasing them”, or “actually, turn them into dancing bears”, and the cell phone will happily oblige.
Right now, watermarking and footage certification legislation is being discussed, because there is an ice cube’s chance in hell that Samsung or any other phone manufacturer won’t add those AI editing features and market them to oblivion.
In this article, as a preemptive move, Samsung is claiming to “add a watermark” to modified photos, so you can tell them apart from “actual footage”… except it’s BS, because they’re only adding a metadata field, which anyone can easily strip away.
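To illustrate how flimsy a metadata-only “watermark” is: the flag lives in the file’s metadata, not in the pixels, so simply re-encoding the image without it (or running an off-the-shelf tool like `exiftool -all= photo.jpg`) discards it entirely. A minimal sketch in Python with Pillow, where the filenames are just placeholders:

```python
from PIL import Image

# Load the "AI edited" photo, keep only the pixel data, and save a fresh file.
# Any metadata-based flag (EXIF field, XMP tag, etc.) is simply left behind.
img = Image.open("ai_edited.jpg")              # placeholder path
pixels_only = Image.new(img.mode, img.size)
pixels_only.putdata(list(img.getdata()))
pixels_only.save("looks_untouched.jpg")        # no EXIF, no "AI edited" flag
```

A watermark baked into the pixels themselves would survive this; a metadata field does not.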
TL;DR: thanks to AI, your evidence will get thrown away unless it’s certified to originate from a genuine device and hasn’t been tampered with. Also expect a deluge of fake footage to pop up.
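For the curious, “certified to originate from a genuine device and not tampered with” boils down to the device signing the footage at capture time and a verifier checking that signature later. A rough, purely illustrative sketch (no relation to any actual Samsung mechanism) using an Ed25519 keypair:

```python
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
from cryptography.exceptions import InvalidSignature

# Pretend this key lives in the phone's secure hardware and never leaves it.
device_key = Ed25519PrivateKey.generate()
public_key = device_key.public_key()

footage = b"...raw video bytes..."          # stand-in for a recorded clip
signature = device_key.sign(footage)        # produced at capture time

# Later, a court or platform checks the bytes are exactly what the device produced.
try:
    public_key.verify(signature, footage)   # raises if even one byte changed
    print("footage verified as original")
except InvalidSignature:
    print("footage was modified after capture")
```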
“it’s not an MBA thing, it’s a technological progress thing”
proceeds to describe how MBAs (Samsung marketers and business leaders) are doing this with technology
again with the acting like i disagree with you? lol
Non-MBAs are already using the same technology for deep fake porn, including fake revenge porn, or to blackmail and bully their school classmates.
You seem to blame it on businesses, like Samsung, which is what I disagree with. All those MBAs are just desperately trying (and failing) to anticipate regulations demanded by average people, regulations that will be way stricter than even you might want.