The European Commission proposed to automatically scan private correspondence to protect children

The European Commission has proposed a regulation to combat online child sexual abuse that would require companies such as Meta and Apple to deploy systems for detecting child sexual abuse content on their platforms. Users' private correspondence would first be checked by an artificial intelligence system and then reviewed by an authorized employee.

The EC release states that in 2021 alone, 85 million photos and videos depicting the sexual abuse of minors were recorded worldwide. Compared with 2020, the number of confirmed cases of child sexual exploitation increased by 64%. These estimates are based on data submitted voluntarily by services specializing in erotic and pornographic content.

The EC believes the new law should apply to all online services offered in the EU: instant messengers, internet providers, app stores and hosting services.

The European Commission proposes to create a CSAM (Child Sexual Abuse Material) database. This is a collection of specific images gathered by child protection organizations; these photos, or similar ones, will be searched for on social networks and in digital storage services. CSAM detection technologies are usually based on hashing: each image in the database is assigned a unique digital fingerprint, and a collection of such hashes is uploaded to the platform or device. Photos that users send to each other in conversations or upload to the cloud also receive their own hash. If an image containing child abuse material appears in messages, the artificial intelligence flags it and compares its hash with those already uploaded. If the hashes match, the image is reviewed by a company employee.
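To make the mechanism concrete, here is a minimal sketch in Python of how hash-based matching works in principle. It uses a simple "average hash" purely as a stand-in for the proprietary perceptual hashes (such as Microsoft's PhotoDNA) used in practice; the file names, the match threshold and the known-hash set are hypothetical.

```python
# Illustrative sketch only: a toy "average hash" standing in for the
# proprietary perceptual hashes (e.g. PhotoDNA) that real scanners use.
# File names, threshold and the known-hash set are hypothetical.
from PIL import Image

def average_hash(path, size=8):
    """Downscale to a size x size grayscale image and threshold each pixel
    against the mean, producing a 64-bit fingerprint."""
    img = Image.open(path).convert("L").resize((size, size))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    bits = "".join("1" if p > mean else "0" for p in pixels)
    return int(bits, 2)

def hamming_distance(h1, h2):
    """Number of bits in which two fingerprints differ."""
    return bin(h1 ^ h2).count("1")

# Fingerprints distributed from the central database (hypothetical).
known_hashes = {average_hash("known_flagged_image.jpg")}

# Every image a user sends or uploads gets hashed and compared.
candidate = average_hash("uploaded_image.jpg")
if any(hamming_distance(candidate, known) <= 5 for known in known_hashes):
    print("Possible match - forward to human review")
```

Because perceptual hashes are deliberately tolerant of small changes such as resizing or re-compression, a distance threshold rather than exact equality is used, and this tolerance is also where false positives can creep in.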

Companies will then have to report the discovered images to a specially created EU Center. National authorities may require content to be removed if the platforms do not take the files down quickly themselves, or require providers to block access to photos and videos hosted outside the EU.

If a company breaks the rules, it will face a fine of up to 6% of its global annual turnover.

However, critics say the law would effectively deprive millions of people of their privacy and security. They also point to a high probability of false positives.

Kaspersky Lab experts note: “The claims of law enforcement agencies about the difficulties in catching criminals and collecting evidence due to the widespread introduction of encryption are understandable. Concerns about massive digital surveillance are also clear. The problem is that there is not yet a social contract that balances security and privacy.”

The EC promises to prevent misuse of the technology and says it will encourage only the least privacy-infringing options; most likely, several hash matches ("hits") will be required before content is flagged for review. The law would cover already known CSAM, newly created CSAM, and messages containing grooming. Detecting grooming, however, involves parsing text messages, which looks like an even less transparent and precise procedure than searching for CSAM images, as the sketch below suggests.
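For comparison, here is a very rough sketch of what pattern-based text scanning could look like; the patterns below are hypothetical and the example is only meant to show how easily benign messages can be flagged.

```python
# Purely illustrative: a naive pattern-based text scanner (hypothetical
# patterns, not any real system), showing how easily benign messages match.
import re

SUSPICIOUS_PATTERNS = [
    r"\bhow old are you\b",
    r"\bdon'?t tell your parents\b",
    r"\bsend (me )?a (photo|pic)\b",
]

def looks_suspicious(message: str) -> bool:
    text = message.lower()
    return any(re.search(pattern, text) for pattern in SUSPICIOUS_PATTERNS)

print(looks_suspicious("How old are you again? I need it for the form."))  # True: a false positive
print(looks_suspicious("See you at practice tomorrow."))                   # False
```

Real proposals would rely on machine-learning classifiers rather than keyword lists, but the underlying concern is the same: the scanner has to read the content of every message.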

“The frightening thing is that once you use the machines that read your text messages for whatever purpose, the rest of the restrictions simply disappear,” says cryptography professor Matthew Green.

Earlier in 2021

In July 2021, the European Parliament approved a law allowing companies to scan emails for child sexual abuse material without fear of violating the GDPR.

In early August, Financial Times journalists learned that Apple intended to build a backdoor into its iPhones that would allow smartphones to be checked for child abuse photos. The system was to be called neuralMatch. The company was sharply criticized by cybersecurity experts, who said it risked total surveillance of device users. In September, it was reported that Apple had shelved its plans to scan user photos.

Meanwhile, the UK government welcomed Apple’s initiative and said it wants to be able to inspect encrypted messages even when end-to-end encryption is used.

Michael Zippo
2022/05/16

https://linkedin.com/in/michael-zippo-9136441b1
[email protected]

Sources:
tutanota.com, Financial Times
