
The Telegram CEO has proposed solutions, but critics say they will do little to prevent the distribution of illegal deepfake material.

Proposal to delete some functions draws criticism as a half-hearted measure
“Malicious users will simply develop new methods even if specific functions are blocked”

Pavel Durov, CEO of Telegram. AP News

Pavel Durov, the CEO of Telegram, who is under investigation without detention for allegedly conspiring in the distribution of illegal material on Telegram, including child sexual exploitation material, has proposed deleting some of the functions that have drawn controversy. Critics, however, call this a half-hearted fix that will do little to curb the widespread digital crime on the platform.
On the 6th, Durov posted on his X (formerly Twitter) account that “the ‘People Nearby’ function and the photo/video upload function within the anonymous blogging service ‘Telegraph’ are no longer available for use” on Telegram. He claimed, “99.999% of Telegram users have nothing to do with crime, but the 0.001% involved in illegal activities have tarnished the platform’s image by misusing these functions, posing a threat to the interests of nearly 1 billion users.”

Telegram logo. Reuters

While it is true that the two functions Telegram decided to delete have been misused for various crimes, they have little connection to recent digital crimes such as the spread of deepfake content. The ‘People Nearby’ function lets users open chat rooms with other nearby Telegram users based on their smartphone’s location data.
Concerns have been raised that the function increases the risk of stalking by sharing location data with unintended recipients, and that it exposes users to indiscriminate financial-fraud messages sent by automated bots. The anonymous blogging service ‘Telegraph’ lets anyone post articles anonymously and share links, and it has mainly been exploited for phishing scams that harvest personal information through fake websites. Neither function, however, is directly tied to the production and distribution of illegal deepfake material.
Experts argue that blocking individual functions will not eradicate digital crime. When a specific function is blocked, malicious users can easily bypass the restriction by devising new methods or by moving to other platforms with fewer controls and repeating the same activity.
Tech-feminist activist Jo Kyung-sook, who has been reporting and exposing Telegram bots used to create deepfake exploitation material, explained with the example of a ‘bot-sharing bot’: “If Telegram blocks a specific bot, malicious users quickly create a similar one. They then use a ‘bot-sharing bot’ to circulate the new bot’s link and continue the same criminal activity.”
On the 3rd, a Telegram channel with 210,000 subscribers that Jo discovered and reported carried notification messages from a ‘bot-sharing bot’ with phrases such as “We’re always here. Come back with a new bot,” along with links to new bots in multiple languages, including Korean, Japanese, German, and French.
Jo stressed the urgency of instilling in malicious users the perception that “they will definitely be caught if they commit crimes.” Park Ji-hyun, a former emergency committee chair of the Democratic Party of Korea who, as part of the ‘Flame Brigade’ in 2019, first exposed the ‘nth room’ child and adolescent sexual exploitation crimes on Telegram, pointed out, “If Telegram is willing to cooperate with Korean authorities, it should proactively share specific information with law enforcement, such as the IP addresses offenders used and how they signed up, so that malicious users know they will be caught.”
Won Eun-ji, an activist with the ‘Flame Brigade,’ also said, “Rather than merely blocking specific functions, Telegram should monitor and take responsibility for the distribution of child and adolescent sexual exploitation material on its platform, as major IT companies such as Google do, by appointing an ethics supervisor and actively helping law enforcement understand what is happening on the platform.” She further urged technological measures such as automatically deleting images or videos that match known illegal content, or blocking their upload in advance.