(ORDO NEWS) — A California AI artist discovered that medical photos her doctor took of her in 2013 appeared among the links in the LAION-5B dataset. That was only the beginning: it turned out there are many such pictures.
The artist found the photos on a site called Have I Been Trained, which allows artists to see if their work is in the LAION-5B artificial intelligence dataset.
However, instead of doing a text search on the site, the woman uploaded a recent photo of herself using the site’s reverse image search feature.
She was surprised to find two pictures of her face that she had authorized only her doctor to use.
What is known
Lapin suffers from a genetic condition that required her to undergo multiple surgeries on her face, mouth and jaw in 2013.
“These photos are from my last procedure with this surgeon,” the woman says.
The surgeon who owned the medical photographs died of cancer in 2018, Lapin said, and she suspects they somehow left his office after that. The artist calls this situation “the digital equivalent of receiving stolen property.”
“Someone stole the images from my deceased doctor’s files, and they ended up somewhere on the Internet and were then added to this dataset,” the victim adds.
Ars conducted its own image-search experiments and located both of Lapin’s photographs, along with thousands of similar images, all of which likely have the same dubious ethical and legal status.
How it happened and what can be done about it
Artificial intelligence does not work randomly. For a system to perform the tasks assigned to it, its creators must first train the algorithm on a large dataset.
Such systems are therefore usually “fed” huge archives of information – texts, photographs, video, audio. However, according to LAION, in their case things work a little differently.
LAION describes itself as a non-profit organization with members around the world, “committed to making large-scale machine learning models, datasets, and related code available to the general public.”
Its data can be used in various projects, from face recognition to computer vision and image synthesis. For example, the technology can generate images from text descriptions.
However, the dataset itself is only a set of URLs pointing to images on the web; LAION does not host the images themselves.
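To make this concrete, here is a minimal illustrative sketch (not LAION’s actual code or schema) of what a URL-index dataset looks like: each record holds a link and a caption, while the image bytes stay on the original hosting site. The field names and example URLs are hypothetical.

```python
# Illustrative sketch of a LAION-style record: the dataset stores a link
# and a caption, NOT the image itself. Field names here are assumptions.
from dataclasses import dataclass

@dataclass
class DatasetRow:
    url: str      # link to an image hosted elsewhere on the web
    caption: str  # text scraped alongside the image

# Hypothetical example rows; LAION-5B contains billions of such entries.
rows = [
    DatasetRow(url="https://example.com/cat.jpg", caption="a cat"),
    DatasetRow(url="https://example.com/dog.jpg", caption="a dog"),
]

# A training pipeline would fetch the images at these URLs itself,
# which is why removing a picture means removing it from the hosting site.
for row in rows:
    print(row.url, "->", row.caption)
```

This structure is also why LAION can argue it does not “post” any images: deleting a row from the index does not touch the copy that remains on the original website.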
Under these conditions, responsibility for including any given image in the LAION set becomes a bizarre game of buck-passing. Lapin’s friend posted an open question on the #safety-and-privacy channel of the LAION Discord server, asking how her image could be removed.
LAION engineer Romain Beaumont replied: “The best way to remove an image from the internet is to ask the website to stop posting it. We don’t post any of those images.”
In other words, the developers explain that they do not store these images anywhere; the dataset merely references pictures collected from across the Internet.
This means that the medical photos of Lapin and thousands of other people remain publicly accessible on some website.