[Full Disclosure] CVE-2024-22903: Unpatched Command Injection in Vinchin Backup & Recovery Versions 7.2 and Earlier


Posted by Valentin Lobstein via Fulldisclosure on Jan 26

CVE ID: CVE-2024-22903

Title: Command Injection Vulnerability in SystemHandler.class.php of Vinchin Backup & Recovery Versions 7.2 and Earlier

Description:
A significant security vulnerability, CVE-2024-22903, has been identified in the `deleteUpdateAPK` function within the
`SystemHandler.class.php` file of Vinchin Backup & Recovery software, affecting versions 7.2 and earlier. This
function, designed to delete APK files, is prone to…


[Full Disclosure] CVE-2024-22902: Default Root Credentials in Vinchin Backup & Recovery v7.2 and Earlier


Posted by Valentin Lobstein via Fulldisclosure on Jan 26

CVE ID: CVE-2024-22902

Title: Default Root Credentials Vulnerability in Vinchin Backup & Recovery v7.2

Suggested Description:
Vinchin Backup & Recovery version 7.2 has been identified as being configured with default root credentials, posing a
significant security vulnerability.

Additional Information:
There is no documentation or guidance from Vinchin on changing the root password for this version. The use of password
authentication…


[Full Disclosure] CVE-2024-22899: Unpatched Command Injection in Vinchin Backup and Recovery Versions 7.2 and Earlier


Posted by Valentin Lobstein via Fulldisclosure on Jan 26

CVE ID: CVE-2024-22899

Title: Command Injection Vulnerability in Vinchin Backup and Recovery’s syncNtpTime Function in Versions 7.2 and Earlier

Description:
A critical security vulnerability, identified as CVE-2024-22899, has been discovered in the `syncNtpTime` function of
Vinchin Backup and Recovery software. This issue affects versions 7.2 and earlier. The function, part of the
`SystemHandler.class.php` file, is designed for…


[Full Disclosure] CVE-2024-22900: Unpatched Command Injection in Vinchin Backup and Recovery Versions 7.2 and Earlier


Posted by Balgogan via Fulldisclosure on Jan 26

CVE ID: CVE-2024-22900

Title: Command Injection Vulnerability in Vinchin Backup and Recovery Versions 7.2 and Earlier

Description:
A critical security vulnerability, identified as CVE-2024-22900, has been discovered in Vinchin Backup and Recovery
software, affecting versions 7.2 and earlier. The vulnerability is present in the `setNetworkCardInfo` function, which
is intended to update network card information.

Details:
1. The function…
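
The advisory excerpts above all describe the same underlying flaw class in PHP code we don't have access to. As a language-agnostic illustration of the pattern (not Vinchin's actual code), here is a minimal Python sketch of how splicing user input into a shell command line enables injection, and how passing an argv list without a shell avoids it:

```python
import subprocess

def run_unsafe(host: str) -> str:
    # VULNERABLE pattern: user input is spliced into a shell command line,
    # so shell metacharacters like ";" let an attacker chain extra commands.
    return subprocess.run(f"echo pinging {host}", shell=True,
                          capture_output=True, text=True).stdout

def run_safe(host: str) -> str:
    # Safer pattern: pass an argv list with no shell; metacharacters in the
    # input stay inert and are treated as plain data by the child process.
    return subprocess.run(["echo", "pinging", host],
                          capture_output=True, text=True).stdout
```

With input like `x; echo INJECTED`, the first function executes the injected `echo` as a second shell command, while the second prints the whole string verbatim.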


[SBA-ADV-20200707-02] CVE-2020-36772: CloudLinux CageFS 7.0.8-2 or below Insufficiently Restricted Proxy Command


Posted by SBA – Advisory via Fulldisclosure on Jan 26

# CloudLinux CageFS Insufficiently Restricted Proxy Command #

Link:
https://github.com/sbaresearch/advisories/tree/public/2020/SBA-ADV-20200707-02_CloudLinux_CageFS_Insufficiently_Restricted_Proxy_Commands

## Vulnerability Overview ##

CloudLinux CageFS 7.0.8-2 or below insufficiently restricts file paths
supplied to the `sendmail` proxy command. This allows local users to read
and write arbitrary files of certain file formats outside the…
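
The full advisory details the path checks that CageFS omits. As a hedged sketch of the general mitigation (not CloudLinux's actual code), a proxy command that accepts user-supplied file paths can canonicalize them before comparing against the permitted root:

```python
import os

def is_path_allowed(user_path: str, allowed_root: str) -> bool:
    # Resolve symlinks and ".." components on both sides first; comparing
    # raw strings would let "allowed/../../etc/passwd" slip past the check.
    real = os.path.realpath(user_path)
    root = os.path.realpath(allowed_root)
    return real == root or real.startswith(root + os.sep)
```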


[SBA-ADV-20200707-01] CVE-2020-36771: CloudLinux CageFS 7.1.1-1 or below Token Disclosure


Posted by SBA – Advisory via Fulldisclosure on Jan 26

# CloudLinux CageFS Token Disclosure #

Link: https://github.com/sbaresearch/advisories/tree/public/2020/SBA-ADV-20200707-01_CloudLinux_CageFS_Token_Disclosure

## Vulnerability Overview ##

CloudLinux CageFS 7.1.1-1 or below passes the authentication token as a
command line argument. In some configurations this allows local users to
view the authentication token via the process list and gain code execution
as another user.

* **Identifier**…
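
The core issue generalizes beyond CageFS: anything placed in a process's argv is readable by other local users on Linux via `ps aux` or `/proc/<pid>/cmdline`. A minimal Python sketch of the leaky pattern and a stdin-based alternative (the `client.py` name is hypothetical):

```python
import subprocess
import sys

def auth_command_leaky(token: str) -> list[str]:
    # VULNERABLE pattern: the token lands in the child's argv, where any
    # local user can read it from the process list or /proc/<pid>/cmdline.
    return [sys.executable, "client.py", "--token", token]

def auth_via_stdin(token: str) -> str:
    # Safer pattern: feed the token to the child's stdin, which is not
    # exposed in the process list. The child here just echoes it back.
    proc = subprocess.run(
        [sys.executable, "-c", "import sys; print(sys.stdin.read().strip())"],
        input=token, capture_output=True, text=True, check=True)
    return proc.stdout.strip()
```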


Protect What Matters on Data Privacy Day


Imagine a “Privacy Facts” label on the apps, devices, and websites you use. Like a digital version of the “Nutrition Facts” on the sides of your cereal boxes and other food you buy. With a quick look, you could see what the company behind that app, device, or website collects — and what they do with it. 

Sadly, no such label exists. The fact of privacy today is that it takes work to uncover how the apps, devices, and websites you use collect your personal data and info.  

To uncover those details, you’ll find yourself wading through privacy policies, which are known for their thick legalese. And they can get rather vague. Words like “may” and “might” leave the door open for what companies really do with the personal info and data they collect. They “may” share it with other parties and they “might” sell it to other parties as well.  

Meanwhile, those other parties “may” or “might” use it for their own purposes. Other parties that are largely unknown to you, if not completely unknown, because they’re undisclosed. 

As a result, once your personal data and info gets out there, it has a way of getting around. 

Data and info collection powers the internet, which counts as yet one more fact of privacy. Yet that collection has its legal and ethical boundaries. And those boundaries stand front and center once again this Data Privacy Day.  

Data Privacy Day gives us a chance to consider the importance of respecting privacy, of protecting data, and of building trust. Particularly on the internet, where data is the coin of the realm. It holds great value. Companies want it to improve their services and marketing. Bad actors want it to commit fraud and theft — or sell it on dark marketplaces. 

Your clutch of personal data and info has a price tag hanging on it. That makes it worth protecting. 

Granted, we think about privacy every day. The value it has. The importance of protecting it. And how we can make that protection stronger and easier for you. That’s very much on our minds at a time when people say they have little idea what personal data and info gets collected.  

Indeed, plenty of people are scratching their heads about their privacy online. Findings from Pew Research in 2023 showed that roughly three-quarters of Americans surveyed said they feel they have little or no control over data collection [i]. Moreover, 67% said they understand little to nothing about what companies are doing with their personal data, up eight points from 59% in 2019 [ii]. 

In four short years, more people have come to feel that protecting their privacy is out of their hands. Even the ripple effects of the European Union’s General Data Protection Regulation (GDPR) [iii] and strong consumer privacy laws in a dozen or so U.S. states [iv] haven’t increased their confidence. Fully 61% of Americans are skeptical that anything they do will make much difference when it comes to managing their privacy online [v]. 

Yet something else has happened in those four years. Online protection software has become more powerful. Particularly when it comes to privacy. Even if things feel otherwise, you truly can take significant steps that make a difference in your privacy. 

As far as our online protection software goes, it offers several simple and powerful ways to protect your privacy. McAfee+ features Personal Data Cleanup and Online Account Cleanup — two ways you can take control of your data and info. With them, you can: 

Remove your data and info from risky data broker sites.  
Also remove your data and info from old accounts, which makes them one less target for a data breach. 

Further, McAfee+ rounds things out with our VPN. That keeps you anonymous from advertisers and other data collectors, all while securing you from other prying eyes online. 

That handful of features, part of your overall identity and virus protection, can make you far more private. Even in a time of opaque privacy policies and heavy data collection online. Once again, our aim is to make that simple and powerful for you. 

It really is too bad there’s not a label for privacy. Sure, it’d be nice if you could peer into the Privacy Facts of the apps, devices, and websites you use. But the good news is that online protection software can put you in control of your personal data and info without those details. You truly have more control over your privacy than you might feel nowadays. 

[i] https://www.pewresearch.org/internet/2023/10/18/views-of-data-privacy-risks-personal-data-and-digital-privacy-laws/

[ii] https://www.pewresearch.org/internet/2023/10/18/how-americans-view-data-privacy/

[iii] https://gdpr.eu/what-is-gdpr/

[iv] https://pro.bloomberglaw.com/brief/state-privacy-legislation-tracker/

[v] https://www.pewresearch.org/internet/2023/10/18/views-of-data-privacy-risks-personal-data-and-digital-privacy-laws/

The post Protect What Matters on Data Privacy Day appeared first on McAfee Blog.



Chatbots and Human Conversation


For most of history, communicating with a computer has not been like communicating with a person. In their earliest years, computers required carefully constructed instructions, delivered through punch cards; then came a command-line interface, followed by menus and options and text boxes. If you wanted results, you needed to learn the computer’s language.

This is beginning to change. Large language models—the technology undergirding modern chatbots—allow users to interact with computers through natural conversation, an innovation that introduces some baggage from human-to-human exchanges. Early on in our respective explorations of ChatGPT, the two of us found ourselves typing a word that we’d never said to a computer before: “Please.” The syntax of civility has crept into nearly every aspect of our encounters; we speak to this algebraic assemblage as if it were a person—even when we know that it’s not.

Right now, this sort of interaction is a novelty. But as chatbots become a ubiquitous element of modern life and permeate many of our human-computer interactions, they have the potential to subtly reshape how we think about both computers and our fellow human beings.

One direction that these chatbots may lead us in is toward a society where we ascribe humanity to AI systems, whether abstract chatbots or more physical robots. Just as we are biologically primed to see faces in objects, we imagine intelligence in anything that can hold a conversation. (This isn’t new: People projected intelligence and empathy onto the very primitive 1960s chatbot, Eliza.) We say “please” to LLMs because it feels wrong not to.

Chatbots are growing only more common, and there is reason to believe they will become ever more intimate parts of our lives. The market for AI companions, ranging from friends to romantic partners, is already crowded. Several companies are working on AI assistants, akin to secretaries or butlers, that will anticipate and satisfy our needs. And other companies are working on AI therapists, mediators, and life coaches—even simulacra of our dead relatives. More generally, chatbots will likely become the interface through which we interact with all sorts of computerized processes—an AI that responds to our style of language, every nuance of emotion, even tone of voice.

Many users will be primed to think of these AIs as friends, rather than the corporate-created systems that they are. The internet already spies on us through systems such as Meta’s advertising network, and LLMs will likely join in: OpenAI’s privacy policy, for example, already outlines the many different types of personal information the company collects. The difference is that the chatbots’ natural-language interface will make them feel more humanlike—reinforced with every politeness on both sides—and we could easily miscategorize them in our minds.

Major chatbots do not yet alter how they communicate with users to satisfy their parent company’s business interests, but market pressure might push things in that direction. Reached for comment about this, a spokesperson for OpenAI pointed to a section of the privacy policy noting that the company does not currently sell or share personal information for “cross-contextual behavioral advertising,” and that the company does not “process sensitive Personal Information for the purposes of inferring characteristics about a consumer.” In an interview with Axios earlier today, OpenAI CEO Sam Altman said future generations of AI may involve “quite a lot of individual customization,” and “that’s going to make a lot of people uncomfortable.”

Other computing technologies have been shown to shape our cognition. Studies indicate that autocomplete on websites and in word processors can dramatically reorganize our writing. Generally, these recommendations result in blander, more predictable prose. And where autocomplete systems give biased prompts, they result in biased writing. In one benign experiment, positive autocomplete suggestions led to more positive restaurant reviews, and negative autocomplete suggestions led to the reverse. The effects could go far beyond tweaking our writing styles to affecting our mental health, just as with the potentially depression- and anxiety-inducing social-media platforms of today.

The other direction these chatbots may take us is even more disturbing: into a world where our conversations with them result in our treating our fellow human beings with the apathy, disrespect, and incivility we more typically show machines.

Today’s chatbots perform best when instructed with a level of precision that would be appallingly rude in human conversation, stripped of any conversational pleasantries that the model could misinterpret: “Draft a 250-word paragraph in my typical writing style, detailing three examples to support the following point and cite your sources.” Not even the most detached corporate CEO would likely talk this way to their assistant, but it’s common with chatbots.

If chatbots truly become the dominant daily conversation partner for some people, there is an acute risk that these users will adopt a lexicon of AI commands even when talking to other humans. Rather than speaking with empathy, subtlety, and nuance, we’ll be trained to speak with the cold precision of a programmer talking to a computer. The colorful aphorisms and anecdotes that give conversations their inherently human quality, but that often confound large language models, could begin to vanish from the human discourse.

For precedent, one need only look at the ways that bot accounts already degrade digital discourse on social media, inflaming passions with crudely programmed responses to deeply emotional topics; they arguably played a role in sowing discord and polarizing voters in the 2016 election. But AI companions are likely to be a far larger part of some users’ social circle than the bots of today, potentially having a much larger impact on how those people use language and navigate relationships. What is unclear is whether this will negatively affect one user in a billion or a large portion of them.

Such a shift is unlikely to transform human conversations into cartoonishly robotic recitations overnight, but it could subtly and meaningfully reshape colloquial conversation over the course of years, just as the character limits of text messages affected so much of colloquial writing, turning terms such as LOL, IMO, and TMI into everyday vernacular.

AI chatbots are always there when you need them to be, for whatever you need them for. People aren’t like that. Imagine a future filled with people who have spent years conversing with their AI friends or romantic partners. Like a person whose only sexual experiences have been mediated by pornography or erotica, they could have unrealistic expectations of human partners. And the more ubiquitous and lifelike the chatbots become, the greater the impact could be.

More generally, AI might accelerate the disintegration of institutional and social trust. Technologies such as Facebook were supposed to bring the world together, but in the intervening years, the public has become more and more suspicious of the people around them and less trusting of civic institutions. AI may drive people further toward isolation and suspicion, always unsure whether the person they’re chatting with is actually a machine, and treating them as inhuman regardless.

Of course, history is replete with people claiming that the digital sky is falling, bemoaning each new invention as the end of civilization as we know it. In the end, LLMs may be little more than the word processor of tomorrow, a handy innovation that makes things a little easier while leaving most of our lives untouched. Which path we take depends on how we train the chatbots of tomorrow, but it also depends on whether we invest in strengthening the bonds of civil society today.

This essay was originally published in The Atlantic.
