With over 2.6 billion monthly active users, YouTube is a highly coveted platform for malicious actors to target. On Monday, cybersecurity researchers reported a startling 200%–300% surge in the number of YouTube videos containing links to malicious software designed to steal confidential financial information from victims' computers.
Malware of this kind, known as an "infostealer," is disseminated through malicious downloads, counterfeit websites, and YouTube tutorials. Once it infiltrates a system, it harvests sensitive information and sends it to the attackers' command-and-control (C2) server.
According to the research findings, YouTube receives a staggering influx of 5–10 new videos per hour that feature illicit software downloads and dangerous links. For context, as of June 2022, roughly 30,000 hours of new content are uploaded to the video platform every hour.
The videos employ deceptive techniques to lure users into downloading malware, which makes detection and removal difficult for the automated systems YouTube uses to fight such malicious links.
The research team found various types of stealer malware, including Vidar, RedLine, and Raccoon, in YouTube videos dating back to November 2022. These malicious software programs are capable of stealing sensitive information such as passwords, credit card details, bank account numbers, and other confidential data.
According to the report, the videos disguise themselves as instructional guides for downloading pirated copies of paid software, including Adobe Photoshop, Premiere Pro, Autodesk 3ds Max, AutoCAD, and other products that are normally available only to paying customers.
But that’s not all!
To make the videos appear credible and boost their reach, threat actors frequently post fake comments to manipulate the YouTube algorithm that works in the background to identify which videos have the potential to go viral.
The report also highlighted that these comments are intended to deceive users into thinking that the malware is legitimate. Additionally, threat actors are adopting a new trend of using AI-generated videos that feature personas that appear more familiar and trustworthy to further their objectives.
“In a concerning trend, these threat actors are now utilizing AI-generated videos to amplify their reach, and YouTube has become a convenient platform for their distribution,” said Pavan Karthick, a CloudSEK researcher.
YouTube Scammers Employ Smart Tactics
In the era of widespread high-speed mobile internet, the popularity of video platforms like YouTube has skyrocketed. Unfortunately, that popularity has also caught the attention of scammers.
These scams, however, are not confined to YouTube alone. LinkedIn is one such example.
With its rise in popularity, LinkedIn has also become a prime target for fraudsters. The platform's reputation as a reliable professional networking site makes it attractive to scammers who exploit unsuspecting users. They employ a range of tactics, including creating fraudulent profiles, sending phishing messages, and advertising job offers that are too good to be true.
OpenAI’s ChatGPT is another example of such misuse. Scammers are capitalizing on the recent success of this AI-powered chatbot by using its name to deceive unsuspecting internet users. Cybersecurity experts have recently discovered numerous fake ChatGPT websites that disseminate malware and pilfer sensitive information from internet users. These websites entice visitors to download ChatGPT as a local application for Windows, ultimately injecting the RedLine malware into their system. Scammers are also penetrating the Google Play Store with more than 50 counterfeit ChatGPT apps and utilizing phishing campaigns to steal valuable user data.
Advanced techniques, such as creating Skype profiles with photos of genuine recruiters from reputable companies, are being employed by scammers to conduct fake interviews, according to Deepen Desai, Zscaler’s Vice President of Security Research. Moreover, they are utilizing Artificial Intelligence (AI) to produce profile images that can easily dupe humans.
Artificial Intelligence (AI) has undeniably revolutionized the technological landscape, making our lives more convenient and comfortable. Nevertheless, there is growing apprehension that AI is becoming increasingly perilous for humans, and fraudsters are exploiting the technology to deceive individuals.
Safety Measures
Do not click on any links in the video description or in the video itself that you are not familiar with. Scammers often use fake links to redirect users to malicious websites that can steal their data.
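One simple habit is to check where a link actually points before clicking it. As a rough illustration (the domain list below is hypothetical and far from exhaustive), a link can be parsed and its hostname compared against the software vendor's official domain, which catches common lookalike tricks such as `adobe.com.free-download.example`:

```python
from urllib.parse import urlparse

# Hypothetical allowlist of official vendor domains -- illustrative only.
OFFICIAL_DOMAINS = {"adobe.com", "autodesk.com"}

def is_trusted_link(url: str) -> bool:
    """Return True only if the link's hostname is an official domain
    (or a subdomain of one). Anything else is treated as untrusted."""
    host = (urlparse(url).hostname or "").lower()
    return any(host == d or host.endswith("." + d) for d in OFFICIAL_DOMAINS)

print(is_trusted_link("https://www.adobe.com/products/photoshop.html"))       # True
print(is_trusted_link("http://adobe.com.free-download.example/setup.exe"))    # False
```

Note how the second URL merely *starts* with "adobe.com" — matching on the full hostname, rather than a substring, is what exposes the deception.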
Furthermore, ensure that the video is posted by a reputable source before clicking on it. Scammers often create fake channels to post videos that contain malware.
Always install antivirus software on your computer and keep it up to date. This can help detect and remove any malware that may be present on your system.
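Beyond antivirus software, when a vendor publishes a checksum for an installer, verifying the downloaded file against it is a cheap extra safeguard. A minimal sketch (the "installer" here is a stand-in temp file, since no real download is involved):

```python
import hashlib
import os
import tempfile

def sha256_of_file(path: str) -> str:
    """Hash the file in chunks so large installers aren't loaded into memory."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

def matches_published_checksum(path: str, published: str) -> bool:
    """Compare the file's SHA-256 digest to the checksum the vendor publishes."""
    return sha256_of_file(path) == published.strip().lower()

# Demo with a stand-in "installer" file:
with tempfile.NamedTemporaryFile(delete=False) as f:
    f.write(b"installer bytes")
    path = f.name

good = hashlib.sha256(b"installer bytes").hexdigest()
print(matches_published_checksum(path, good))      # True  -- file is intact
print(matches_published_checksum(path, "0" * 64))  # False -- tampered or wrong file
os.unlink(path)
```

A mismatch does not tell you *what* is wrong with the file, only that it is not the file the vendor published — which is exactly the signal you want before running an installer from a video description.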
Moreover, scammers may send phishing emails or messages that appear to be from YouTube or other reputable sources. Do not respond to these messages or click on any links in them.
Also, ensure that your operating system, web browser, and other software are all up to date with the latest security patches. This can help protect you from known vulnerabilities that scammers may exploit.
And, if a pop-up appears while watching a YouTube video, do not click on it. Scammers often use pop-ups to trick users into downloading malware or giving away their personal information.
Bottom Line
Overall, the alarming surge in malware-laden YouTube videos is a stark reminder of the need to remain vigilant and exercise caution while browsing online. Users must take every step to protect themselves and their data by staying informed about potential threats and using reliable antivirus software. Furthermore, it is essential for YouTube and other online platforms to prioritize user safety and take proactive measures to prevent the spread of malicious content. By prioritizing user safety, these platforms can create a safer and more secure online environment for all.