It seems that everyone is rushing to embed artificial intelligence into their solutions, and security offerings are among the latest to adopt this shiny new thing. Like many, I see AI's potential to bring about positive change, but also its potential as a threat vector.
To some, recent AI developments are a laughing matter. On April 1, 2023, that traditional day when technology and social media sites love to pull a fast one on us with often elaborate pranks, the Twitter account for the MITRE ATT&CK platform launched the #attackgpt Twitter bot, inviting users to tag questions about the anti-hacker knowledge base with #attackgpt to receive an "AI" response. In reality, it was an April Fools' prank, with MITRE's social media team cranking out funny answers in the guise of a chatbot.