Security Incident Impacts CardioComm’s Operations

Several of the company’s products are affected by the outage

SEC requires firms to report cyberattacks within 4 days, but not everyone may like it

The US Securities and Exchange Commission (SEC) has adopted new rules requiring publicly listed firms to disclose serious cybersecurity incidents within four days.

The tough new rules, although undoubtedly well-intentioned, are likely to leave some firms angry that they are being “micromanaged” and, it is argued, could even assist attackers.

Read more in my article on the Tripwire State of Security blog.

USN-6260-1: Linux kernel vulnerabilities

It was discovered that the NTFS file system implementation in the Linux
kernel did not properly check buffer indexes in certain situations, leading
to an out-of-bounds read vulnerability. A local attacker could possibly use
this to expose sensitive information (kernel memory). (CVE-2022-48502)
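The advisory describes the flaw only at a high level, but the bug class is straightforward to illustrate. Below is a minimal, hypothetical C sketch of an unchecked buffer index causing an out-of-bounds read; it is not the actual NTFS driver code, and the struct and field names are invented:

```c
#include <stddef.h>
#include <stdint.h>

/* Hypothetical on-disk record, illustrating the CVE-2022-48502 bug class:
 * an index read from untrusted (attacker-crafted) data is used unchecked. */
struct record {
    uint16_t attr_index;   /* attacker-controlled: parsed from the image */
    uint8_t  attrs[16];
};

/* VULNERABLE: attr_index is never checked against the array bounds, so a
 * crafted filesystem image can make the read land past attrs[] and leak
 * adjacent (kernel) memory back to the attacker. */
uint8_t read_attr_bad(const struct record *r)
{
    return r->attrs[r->attr_index];    /* out-of-bounds read */
}

/* FIXED: reject out-of-range indexes before dereferencing. */
int read_attr_good(const struct record *r, uint8_t *out)
{
    if (r->attr_index >= sizeof(r->attrs))
        return -1;                     /* kernel code would return -EINVAL */
    *out = r->attrs[r->attr_index];
    return 0;
}
```

The out-of-bounds writes reported below (CVE-2023-3090, CVE-2023-35001) are the same failure mode with a store instead of a load, which is why they escalate from information disclosure to possible code execution.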

Stonejiajia, Shir Tamari and Sagi Tzadik discovered that the OverlayFS
implementation in the Ubuntu Linux kernel did not properly perform
permission checks in certain situations. A local attacker could possibly
use this to gain elevated privileges. (CVE-2023-2640)

It was discovered that the IP-VLAN network driver for the Linux kernel did
not properly initialize memory in some situations, leading to an
out-of-bounds write vulnerability. An attacker could use this to cause a
denial of service (system crash) or possibly execute arbitrary code.
(CVE-2023-3090)

Mingi Cho discovered that the netfilter subsystem in the Linux kernel did
not properly validate the status of an nft chain while performing a lookup
by id, leading to a use-after-free vulnerability. An attacker could use
this to cause a denial of service (system crash) or possibly execute
arbitrary code. (CVE-2023-31248)

It was discovered that the Ricoh R5C592 MemoryStick card reader driver in
the Linux kernel contained a race condition during module unload, leading
to a use-after-free vulnerability. A local attacker could use this to cause
a denial of service (system crash) or possibly execute arbitrary code.
(CVE-2023-3141)
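Several issues in this notice (CVE-2023-31248, CVE-2023-3141, CVE-2023-3389, CVE-2023-3390) are use-after-free bugs. As a simplified sketch of the class, assuming a shared object freed on one path while another context may still hold a pointer to it; this is illustrative only, not the actual r592 driver code:

```c
#include <stdlib.h>

/* Hypothetical shared driver state, illustrating the use-after-free class. */
struct device_state {
    int busy;
};

static struct device_state *g_dev;

/* Worker context: checks the pointer, then uses it. Between the check and
 * the use there is a window in which another context can free the object. */
void *worker(void *arg)
{
    (void)arg;
    if (g_dev)
        g_dev->busy = 1;   /* may write into freed memory: use-after-free */
    return NULL;
}

/* Unload/teardown path: frees the object without waiting for, or
 * excluding, its remaining users. */
void unload_bad(void)
{
    free(g_dev);
    g_dev = NULL;          /* too late if worker already loaded the pointer */
}

/* A correct fix makes teardown synchronize with users before freeing; in
 * the kernel this is typically done with refcounts, locks, or RCU. */
```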

Shir Tamari and Sagi Tzadik discovered that the OverlayFS implementation in
the Ubuntu Linux kernel did not properly perform permission checks in
certain situations. A local attacker could possibly use this to gain
elevated privileges. (CVE-2023-32629)

Querijn Voet discovered that a race condition existed in the io_uring
subsystem in the Linux kernel, leading to a use-after-free vulnerability. A
local attacker could use this to cause a denial of service (system crash)
or possibly execute arbitrary code. (CVE-2023-3389)

It was discovered that the netfilter subsystem in the Linux kernel did not
properly handle some error conditions, leading to a use-after-free
vulnerability. A local attacker could use this to cause a denial of service
(system crash) or possibly execute arbitrary code. (CVE-2023-3390)

Tanguy Dubroca discovered that the netfilter subsystem in the Linux kernel
did not properly handle certain pointer data types, leading to an
out-of-bounds write vulnerability. A privileged attacker could use this to
cause a denial of service (system crash) or possibly execute arbitrary
code. (CVE-2023-35001)

Fooling an AI Article Writer

World of Warcraft players wrote about a fictional game element, “Glorbo,” on a subreddit for the game, trying to entice an AI bot to write an article about it. It worked:

And it…worked. Zleague auto-published a post titled “World of Warcraft Players Excited For Glorbo’s Introduction.”

[…]

That is…all essentially nonsense. The article was left online for a while but has finally been taken down (here’s a mirror, it’s hilarious). All the authors listed as having bylines on the site are fake. It appears this entire thing is run with close to zero oversight.

Expect lots more of this sort of thing in the future. Also, expect the AI bots to get better at detecting this sort of thing. It’s going to be an arms race.

USN-6259-1: Open-iSCSI vulnerabilities

Jos Wetzels, Stanislav Dashevskyi, and Amine Amri discovered that
Open-iSCSI incorrectly handled certain checksums for IP packets.
An attacker could possibly use this issue to expose sensitive information.
(CVE-2020-13987)
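The notice doesn’t show the affected code, but a rough sketch helps explain what “handled certain checksums” means in practice. The function below is the textbook RFC 1071 Internet checksum; the comment marks where this bug class bites, namely when the length comes from an untrusted header rather than from the bytes actually received. That failure mode is an assumption about the bug class, not the actual Open-iSCSI code:

```c
#include <stddef.h>
#include <stdint.h>

/* RFC 1071 Internet checksum: one's-complement sum of 16-bit words.
 * If 'len' is taken from a packet header without being clamped to the
 * number of bytes actually received, the loop reads past the buffer,
 * and the out-of-bounds bytes influence a checksum the attacker can
 * observe, leaking memory contents. */
uint16_t inet_checksum(const uint8_t *data, size_t len)
{
    uint32_t sum = 0;

    while (len > 1) {                   /* sum whole 16-bit words */
        sum += (uint32_t)data[0] << 8 | data[1];
        data += 2;
        len  -= 2;
    }
    if (len == 1)                       /* odd trailing byte */
        sum += (uint32_t)data[0] << 8;

    while (sum >> 16)                   /* fold the carries back in */
        sum = (sum & 0xffff) + (sum >> 16);

    return (uint16_t)~sum;
}

/* Safe callers clamp first: len = min(claimed_len, bytes_received). */
```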

Jos Wetzels, Stanislav Dashevskyi, and Amine Amri discovered that
Open-iSCSI incorrectly handled the parsing of certain TCP MSS options.
An attacker could possibly use this issue to cause a crash or other
unexpected behavior. (CVE-2020-13988)
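The MSS issue belongs to a familiar bug class: trusting the attacker-controlled option-length byte while walking TCP options. A hypothetical parser sketch (not the actual Open-iSCSI code) with the checks that prevent the spin-forever and over-read failure modes:

```c
#include <stddef.h>
#include <stdint.h>

#define TCP_OPT_EOL 0   /* end of option list */
#define TCP_OPT_NOP 1   /* padding, no length byte */
#define TCP_OPT_MSS 2   /* maximum segment size, length 4 */

/* Walk the TCP options region, extracting the MSS if present.
 * Returns 0 on success, -1 on malformed input. */
int parse_tcp_options(const uint8_t *opt, size_t len, uint16_t *mss)
{
    size_t i = 0;

    while (i < len) {
        uint8_t kind = opt[i];

        if (kind == TCP_OPT_EOL)
            break;
        if (kind == TCP_OPT_NOP) {
            i++;
            continue;
        }
        if (i + 1 >= len)
            return -1;                 /* truncated option header */

        uint8_t olen = opt[i + 1];
        /* Without these checks, olen == 0 makes the loop spin forever,
         * and an oversized olen walks off the end of the packet. */
        if (olen < 2 || i + olen > len)
            return -1;

        if (kind == TCP_OPT_MSS && olen == 4)
            *mss = (uint16_t)opt[i + 2] << 8 | opt[i + 3];

        i += olen;
    }
    return 0;
}
```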

Amine Amri and Stanislav Dashevskyi discovered that Open-iSCSI
incorrectly handled certain TCP data. An attacker could possibly
use this issue to expose sensitive information. (CVE-2020-17437)

What your peers want to know before buying a DLP tool

The content of this post is solely the responsibility of the author.  AT&T does not adopt or endorse any of the views, positions, or information provided by the author in this article. 

Preventing data loss is a concern for almost every organization, regardless of size, and especially for organizations that handle sensitive data. Organizations rely on more data than ever to conduct business, and when a leak or breach occurs they are forced to deal with the consequences, such as the high cost of fines and remediation and the reputational harm to their company and brand.

Data loss prevention (DLP) solutions help mitigate the risk of data loss. Losses can occur as a result of insider-related incidents (e.g., employee theft of proprietary information), physical damage to computers, or human error (e.g., unintentional file deletion or sharing sensitive data in an email). Whatever the cause, mitigating the risk of loss requires the right people, processes, and technology.

Meeting the technology requirement can be a challenge when it comes to selecting the right DLP solution. During the vendor exploration and evaluation phases, questions often arise about whether it makes sense to invest in a solution that protects the network, endpoints, or the cloud, or whether it’s better to select a solution that protects the whole enterprise and accounts for the hybrid nature of many organizations.

Data classification and labeling

The decision to invest in a DLP solution should be informed by sufficient research and planning with key stakeholders. This blog will discuss three additional things you should consider before making such an investment. Let’s begin with the types of data an organization collects, stores, and analyzes to conduct business. 

To have a successful data loss prevention program, it’s important to identify all types of data the organization holds (e.g., financial data, health data, or personally identifiable information) and to classify that data according to its value and the risk to the organization if it is leaked or exfiltrated. Data classification is the process of categorizing data so it can be easily retrieved and stored for business use; it also underpins protection against loss and theft and enables regulatory compliance activities. Today, systems are more dispersed and organizations have hybrid and remote workforce models, so it is critical to protect data regardless of where it resides or with whom it is shared. That kind of protection requires properly classified and labeled data.

Automated data classification is foundational to preventing data loss. It is the best way for organizations to fully understand what types of data they have, the characteristics of that data, and the privacy and security requirements necessary to protect it. Properly classifying data also enables the organization to set policies for each data type.
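As a toy illustration of why labels matter, consider how simple policy decisions become once every piece of data carries one. The labels and handling rules below are hypothetical, invented for this sketch rather than taken from any product:

```c
/* Hypothetical classification labels and per-label handling rules. */
enum data_label {
    LABEL_PUBLIC,
    LABEL_INTERNAL,
    LABEL_CONFIDENTIAL,
    LABEL_RESTRICTED
};

struct handling_rule {
    enum data_label label;
    const char *name;
    int allow_external_share;   /* may leave the organization? */
    int require_encryption;     /* must be encrypted at rest and in transit? */
};

/* Indexed by label; a real product derives this table from policy. */
static const struct handling_rule rules[] = {
    { LABEL_PUBLIC,       "public",       1, 0 },
    { LABEL_INTERNAL,     "internal",     0, 0 },
    { LABEL_CONFIDENTIAL, "confidential", 0, 1 },
    { LABEL_RESTRICTED,   "restricted",   0, 1 },
};

/* Once data is labeled, enforcement is a table lookup. */
int may_share_externally(enum data_label l)
{
    return rules[l].allow_external_share;
}
```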

Techniques to identify sensitive data

DLP solutions detect instances of either intentional or unintentional exfiltration of data. DLP policies describe what happens when a user handles sensitive data in a way the policy does not allow. For example, when a user attempts to print a document containing sensitive data on a home printer, the DLP policy might display a message stating that doing so violates the policy and is not permitted. How does the DLP tool know that the document includes sensitive data? Content inspection techniques and contextual analysis identify sensitive data.
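Here is a minimal sketch of pattern-based content inspection, assuming a single hypothetical detector (an SSN-shaped token matched with a POSIX regular expression). Real DLP engines layer many detectors and combine the matches with contextual signals such as file origin, user, and destination:

```c
#include <regex.h>
#include <stdio.h>

/* Returns 1 if the text matches a sensitive-data pattern, 0 if not,
 * -1 on regex-compilation error. The single pattern here (three digits,
 * two digits, four digits) is a stand-in for a real detector library. */
int contains_sensitive_data(const char *text)
{
    regex_t re;
    int hit;

    if (regcomp(&re, "[0-9]{3}-[0-9]{2}-[0-9]{4}", REG_EXTENDED | REG_NOSUB))
        return -1;

    hit = (regexec(&re, text, 0, NULL, 0) == 0);
    regfree(&re);
    return hit;
}

int main(void)
{
    const char *doc = "Employee record: 123-45-6789";

    if (contains_sensitive_data(doc) == 1)
        printf("policy check: document contains sensitive data\n");
    return 0;
}
```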

The inspection capability of the DLP solution is very important. Traditional DLP solutions focus on data-specific content inspection methods, but those methods were developed for on-premises environments and are no longer sufficient for organizations that have migrated to the cloud. Gartner recommends investing in a DLP solution that not only provides content inspection capabilities but also offers extra features such as data lineage for visibility and classification, user and entity behavior analytics (UEBA), and rich context for incident response. UEBA is useful for insider-related incidents (e.g., it might help identify data exfiltration by a dissatisfied employee).

What actions will the DLP solution perform?

After it’s clear that the tool can classify sensitive data, a logical next question is what actions the tool will perform to prevent loss of that data. A DLP solution performs actions such as sending alerts for DLP policy violations, warning users with pop-up messages, and blocking data transfers entirely to prevent leakage or exfiltration; another feature might be quarantining data. Organizations should be able to define these rules based on their own policies, standards, controls, and procedures.
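A rough sketch of how detected violations might map onto the actions named above. The decision logic here is invented for illustration; in a real product it would be driven by administrator-defined policy, not hard-coded rules:

```c
/* The action set mirrors the ones discussed above. */
enum dlp_action {
    DLP_ALLOW,
    DLP_ALERT,
    DLP_WARN_USER,
    DLP_BLOCK,
    DLP_QUARANTINE
};

/* Map a violation to an action from the data's sensitivity (0 = public,
 * 3 = most sensitive) and whether the destination is outside the org. */
enum dlp_action decide_action(int sensitivity, int external_destination)
{
    if (sensitivity == 0)
        return DLP_ALLOW;
    if (external_destination)
        return sensitivity >= 3 ? DLP_QUARANTINE
             : sensitivity >= 2 ? DLP_BLOCK
             : DLP_WARN_USER;
    return sensitivity >= 3 ? DLP_WARN_USER : DLP_ALERT;
}
```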

Traditional DLP relies heavily on content analysis and does not always accurately identify sensitive data. Sometimes traditional tools block normal activity. In contrast, a modern DLP solution minimizes false positives by combining content analysis and data lineage capabilities to more accurately understand whether the data is sensitive.    

Conclusion   

There are many DLP tools on the market. A DLP solution might also be a capability in another security tool such as an email security solution. Selecting the right tool requires knowledge of market trends, the gap between traditional and modern DLP tools, data loss prevention best practices, and the purchasing organization’s security initiatives and goals. Given the many options and variables to consider, it can be challenging to understand the nuances and distinctions among solutions on the market.    
