This is part three of a three-part series written by AT&T Cybersecurity evangelist Theresa Lanowitz. It’s intended to be future-looking and provocative, and to encourage discussion. The author wants to assure you that no generative AI was used in any part of this blog.
Part one: Unusual, thought-provoking predictions for cybersecurity in 2024
Part two: Cybersecurity operations in 2024: The SOC of the future
While there are many big things to prepare for in 2024 (see the first two posts), some important smaller things don’t get the same attention. They are still good to know, and they probably won’t come as a huge surprise. But because they, too, are evolving, it’s important not to take your eye off the ball.
Compliance creates a new code of conduct and a new need for compliance logic.
Compliance and governance are often overlooked when developing software because a different part of the business typically owns those responsibilities. That is all about to change. Cybersecurity policies (internal and external, including new regulations) need to move upstream in the software development lifecycle, with compliance logic built in to simplify the process. Software is designed to work globally; however, the world is becoming more segmented and parsed. Regulations are being created at country, regional, and municipal levels. Realistically, the only way to handle compliance at that scale is through automation.
To avoid the constant forking of software, compliance logic will need to be a part of modern applications. Compliance logic will allow software to function globally but adjust based on code sets that address geographic locations and corresponding regulations.
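To make the idea concrete, here is a minimal sketch of what compliance logic might look like in practice: one codebase that consults region-specific rule sets instead of forking per jurisdiction. The region names, retention periods, and consent flags are hypothetical illustrations, not actual regulatory values.

```python
# Minimal sketch of "compliance logic": one codebase, region-aware rules.
# All values below (regions, retention periods, consent flags) are
# hypothetical examples, not real regulatory requirements.

COMPLIANCE_RULES = {
    "EU":      {"data_retention_days": 30,  "requires_explicit_consent": True},
    "US":      {"data_retention_days": 365, "requires_explicit_consent": False},
    "default": {"data_retention_days": 90,  "requires_explicit_consent": True},
}

def rules_for(region: str) -> dict:
    """Return the rule set for a region, falling back to a safe default."""
    return COMPLIANCE_RULES.get(region, COMPLIANCE_RULES["default"])

def may_retain(record_age_days: int, region: str) -> bool:
    """Decide whether a record may still be retained under regional policy."""
    return record_age_days <= rules_for(region)["data_retention_days"]
```

The design choice is the point: the application logic stays global, and only the rule table changes as new regulations appear, which is what keeps the codebase from forking.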
In 2024, expect compliance logic to become a part of the larger conversation regarding compliance, governance, regulation, and policy. This will require cross-functional collaboration across IT, security, legal, line of business, finance, and other organizational stakeholders.
MFA gets physical.
Multi-factor authentication (MFA) is a way of life. The benefits far outweigh the slight inconvenience imposed. Think about why MFA is so critical. MFA helps with authorization and authentication for mission-critical and safety-critical work. It prevents unauthorized access to critical information. MFA is an easy-to-implement step for good cyber hygiene.
Our current way of thinking about MFA is generally based on three factors: something you know (a passcode), something you have (a device), and something you are (a fingerprint, your face, etc.).
Now, let’s take this a step further and look at how the “something you are” part of MFA can improve safety. Today, MFA routinely accepts fingerprints, facial recognition, or retina scans. That’s just the beginning. MFA can go a step further in helping with business outcomes; here’s how.
Biometric and behavioral MFA can help verify both the identity of an individual and that individual’s fitness to perform a function. For example, a surgeon can access the hospital, restricted areas, and the operating room through MFA verifications.
But once in the operating room, how is it determined that the surgeon is fit to perform the surgical task? Behavioral MFA will soon be in play to ensure the surgeon is fit by adding another layer of “something you are.” Behavioral MFA will gauge fitness for a task by analyzing actions such as how a person enters a series of numbers on a keypad, writes by hand on a tablet, or speaks (voice analysis). The goal is to compare current behavior with past behavior to ensure there is no cognitive compromise.
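The compare-current-against-past idea can be sketched very simply. The example below compares a user's current keystroke timing against a stored baseline; the timing values, the tolerance threshold, and the similarity measure are all illustrative assumptions, not a production behavioral-biometrics algorithm.

```python
# Hypothetical sketch of behavioral MFA: compare a user's current keystroke
# rhythm (inter-key intervals, in seconds) against a stored baseline.
# The threshold and similarity measure are illustrative assumptions.

def mean_abs_deviation(current: list[float], baseline: list[float]) -> float:
    """Average absolute difference between paired inter-key intervals."""
    return sum(abs(c - b) for c, b in zip(current, baseline)) / len(baseline)

def passes_behavioral_check(current: list[float],
                            baseline: list[float],
                            threshold: float = 0.05) -> bool:
    """Accept only if the current typing rhythm stays within tolerance."""
    if len(current) != len(baseline):
        return False  # incomplete sample: fail closed
    return mean_abs_deviation(current, baseline) <= threshold
```

A real system would use far richer features (dwell time, pressure, error rate) and a trained model, but the core loop is the same: measure now, compare to history, fail closed on anomalies.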
In 2024, expect to see more discussion of expanding MFA and the “something you are” aspect to include fitness for a task. This is an outstanding bit of innovation that will continue to evolve our digital world.
Beef up your AI lexicon.
This blog would be remiss without mentioning AI. In 2023, AI became a media sweetheart because of the broad use of generative AI for everything from writing term papers to marketing materials to legal briefs. The lowest common denominator of AI usage was unleashed. However, generative AI struggles with hallucination (producing nonsensical or inaccurate output because of pattern matching in a large language model), collapse (generating repetitive output because of data limitations), and a garbage-in, garbage-out problem.
Generative AI will impact social engineering and make phishing, quishing (phishing via QR codes), and smishing (phishing via counterfeit text messages) more difficult to detect. Intentionally malicious code may be more difficult to detect and, in some cases, may be integrated into legitimate source code branches. All of this means we must be more aware and vigilant.
Machine learning has long been a tool for data scientists, security researchers, and threat intelligence teams. The technology is superb at scanning large data sets and pattern matching.
Next up in the AI frenzy is something that few are discussing: deep learning. Deep learning is about producing predictions based on complexities in data, which can help predict a threat before it happens. Given a large enough dataset, deep learning models can use past observations to predict future activity.
In 2024, expect deep learning to enter the cybersecurity conversation to take the industry to places that machine learning can’t take us. More data and more observations help hone future predictions.
Social engineering is still hard to beat.
Despite the technology we have to protect our networks, applications, and data, the human element is still the weakest link. And social engineering is a major contributor to security events. Business email compromise was the second most common attack concern in our 2023 research. Compromised credentials can easily allow a bad actor access to the digital kingdom.
Stolen or compromised credentials are a treasure trove for social engineers. Bad actors can use inexpensive technology to spoof voices and gain access to accounts, or trick someone into handing over credentials. Social engineers prey on emotions and always want the target to act out of a sense of urgency. Frequent social engineering tactics include phrases such as “I’m rushing to get on a plane and need this right now,” “your family member needs this cash right now,” or “your family/friend is in danger and needs your help.” Being alert and aware is the best way to counteract these social engineering scams. As cybersecurity professionals, we need to talk to our colleagues, friends, and family about the tactics of social engineers.
In 2024, unfortunately, expect social engineering tactics to continue to evolve and reap payouts from unsuspecting people. Being a cybersecurity ambassador can go a long way to helping the public understand what social engineering is and how to avoid it.
Looking ahead
A new year is always exciting and moving into 2024 is no exception. Technology continues to surprise and delight us.
The time is ripe for innovation, and we were treated to a glimpse of the future in 2023.
Looking ahead, 2024 is the year of the business understanding security and security starting to understand the business.
Here’s to a year of innovation!