firefox-flatpak-125.0.2-1

FEDORA-FLATPAK-2024-57e9bcf6a3

Packages in this update:

firefox-flatpak-125.0.2-1

Update description:

Firefox 125.0 release. For details, see https://www.mozilla.org/en-US/firefox/125.0/releasenotes/

Please note that this update depends on the flatpak runtime update from https://bodhi.fedoraproject.org/updates/FEDORA-FLATPAK-2024-a3977e7532

Russian FSB Counterintelligence Chief Gets 9 Years in Cybercrime Bribery Scheme

The head of counterintelligence for a division of the Russian Federal Security Service (FSB) was sentenced last week to nine years in a penal colony for accepting a $1.7 million bribe to ignore the activities of a prolific Russian cybercrime group that hacked thousands of e-commerce websites. The protection scheme was exposed in 2022 when Russian authorities arrested six members of the group, which sold millions of stolen payment cards at flashy online shops like Trump’s Dumps.

A now-defunct carding shop that sold stolen credit cards and invoked 45’s likeness and name.

As reported by The Record, a Russian court last week sentenced former FSB officer Grigory Tsaregorodtsev for taking a $1.7 million bribe from a cybercriminal group that was seeking a “roof,” a well-placed, corrupt law enforcement official who could be counted on to both disregard their illegal hacking activities and run interference with authorities in the event of their arrest.

Tsaregorodtsev was head of the counterintelligence department for a division of the FSB based in Perm, Russia. In February 2022, Russian authorities arrested six men in the Perm region accused of selling stolen payment card data. They also seized multiple carding shops run by the gang, including Ferum Shop, Sky-Fraud, and Trump’s Dumps, a popular fraud store that invoked the 45th president’s likeness and promised to “make credit card fraud great again.”

All of the domains seized in that raid were registered by an IT consulting company in Perm called Get-net LLC, which was owned in part by Artem Zaitsev — one of the six men arrested. Zaitsev reportedly was a well-known programmer whose company supplied services and leasing to the local FSB field office.

The message for Trump’s Dumps users left behind by Russian authorities that seized the domain in 2022.

Russian news sites report that Internal Affairs officials with the FSB grew suspicious when Tsaregorodtsev became a little too interested in the case following the hacking group’s arrests. The former FSB agent had reportedly assured the hackers he could have their case transferred and that they would soon be free.

But when that promised freedom didn’t materialize, four of the defendants pulled the walls down on the scheme and brought down their own roof. The FSB arrested Tsaregorodtsev and seized $154,000 in cash, 100 gold bars, real estate and expensive cars.

At Tsaregorodtsev’s trial, his lawyers argued that their client wasn’t guilty of bribery per se, but that he did admit to fraud because he was ultimately unable to fully perform the services for which he’d been hired.

The Russian news outlet Kommersant reports that all four of those who cooperated were released with probation or correctional labor. Zaitsev received a sentence of 3.5 years in prison, and defendant Alexander Kovalev got four years.

In 2017, KrebsOnSecurity profiled Trump’s Dumps, and found the contact address listed on the site was tied to an email address used to register more than a dozen domains that were made to look like legitimate JavaScript calls many e-commerce sites routinely make to process transactions — such as “js-link[dot]su,” “js-stat[dot]su,” and “js-mod[dot]su.”

Searching on those malicious domains revealed a 2016 report from RiskIQ, which shows the domains featured prominently in a series of hacking campaigns against e-commerce websites. According to RiskIQ, the attacks targeted online stores running outdated and unpatched versions of shopping cart software from Magento, Powerfront and OpenCart.

Those shopping cart flaws allowed the crooks to install “web skimmers,” malicious JavaScript used to steal credit card details and other information from payment forms on the checkout pages of vulnerable e-commerce sites. The stolen customer payment card details were then sold on sites like Trump’s Dumps and Sky-Fraud.
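The basic defensive check against this kind of skimmer is easy to illustrate. Below is a minimal, hypothetical sketch — the allowlist and the HTML snippet are invented for illustration, and only the `js-stat.su` domain comes from the report above — of flagging script tags on a checkout page that load from domains outside an expected set:

```python
import re

# Hypothetical allowlist of domains this (invented) store expects
# its checkout page to load scripts from.
ALLOWED = {"shop.example.com", "cdn.example.com"}

# Invented checkout-page fragment; the second script mimics the
# skimmer domains named in the RiskIQ report.
html = '''
<script src="https://cdn.example.com/cart.js"></script>
<script src="https://js-stat.su/api.js"></script>
'''

# Pull the host portion out of each external script tag.
srcs = re.findall(r'<script[^>]+src="https?://([^/"]+)', html)

# Anything not on the allowlist is worth a closer look.
suspicious = [domain for domain in srcs if domain not in ALLOWED]
print(suspicious)
```

Real skimmer detection is far more involved (inline scripts, obfuscation, dynamically injected tags), but the allowlist-the-script-origins idea above is the core of it.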

Using Legitimate GitHub URLs for Malware

Interesting social-engineering attack vector:

McAfee released a report on a new LUA malware loader distributed through what appeared to be a legitimate Microsoft GitHub repository for the “C++ Library Manager for Windows, Linux, and MacOS,” known as vcpkg.

The attacker is exploiting a property of GitHub: comments to a particular repo can contain files, and those files will be associated with the project in the URL.

What this means is that someone can upload malware and “attach” it to a legitimate and trusted project.

As the file’s URL contains the name of the repository the comment was created in, and as almost every software company uses GitHub, this flaw can allow threat actors to develop extraordinarily crafty and trustworthy lures.

For example, a threat actor could upload a malware executable in NVIDIA’s driver installer repo that pretends to be a new driver fixing issues in a popular game. Or a threat actor could upload a file in a comment to the Google Chromium source code and pretend it’s a new test version of the web browser.

These URLs would also appear to belong to the company’s repositories, making them far more trustworthy.
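The URL property being abused is easy to see in miniature. In the sketch below — the numeric file ID and filename are invented for illustration; the microsoft/vcpkg path mirrors the repository named in the report — a comment-attached file’s download URL carries the trusted owner and repository names even though the file was never committed to the repo:

```python
from urllib.parse import urlparse

# Hypothetical comment-attachment URL: the file ID and filename
# are made up for illustration.
attachment = "https://github.com/microsoft/vcpkg/files/12345678/vcpkg-installer.zip"

# The first two path segments are the owner and repository -- which is
# why a lure URL like this looks like it belongs to the project.
owner, repo, kind = urlparse(attachment).path.strip("/").split("/")[:3]
print(owner, repo, kind)
```

A victim (or an automated reputation check) keying on the `microsoft/vcpkg` prefix has no way to tell from the URL alone that the file came from a comment rather than the repository itself.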

flatpak-runtime-f40-2 flatpak-sdk-f40-1

FEDORA-FLATPAK-2024-a3977e7532

Packages in this update:

flatpak-runtime-f40-2
flatpak-sdk-f40-1

Update description:

Updated flatpak runtime and SDK, including latest Fedora 40 security and bug-fix errata.

In addition, this update includes an updated nss 3.99.0, which is needed for the upcoming Firefox 125.0 update.

How to Spot AI Audio Deepfakes at Election Time

We’ve said it several times in our blogs — it’s tough knowing what’s real and what’s fake out there. And that’s absolutely the case with AI audio deepfakes online. 

Bad actors of all stripes have found out just how easy, inexpensive, and downright uncanny AI audio deepfakes can be. With only a few minutes of original audio, seconds even, they can cook up phony audio that sounds like the genuine article — and wreak all kinds of havoc with it. 

A few high-profile cases in point, each politically motivated in an election year where the world will see more than 60 national elections: 

In January, thousands of U.S. voters in New Hampshire received an AI robocall that impersonated President Joe Biden, urging them not to vote in the primary 
In the UK, more than 100 deepfake social media ads impersonated Prime Minister Rishi Sunak on the Meta platform last December.i  
Similarly, the 2023 parliamentary elections in Slovakia spawned deepfake audio clips that featured false proposals for rigging votes and raising the price of beer.ii 

Yet deepfakes have targeted more than election candidates. Other public figures have found themselves attacked as well. One example comes from Baltimore County in Maryland, where a high school principal has allegedly fallen victim to a deepfake attack.  

It involves an offensive audio clip resembling the principal’s voice that was posted on social media, news of which spread rapidly online. The school’s union has since stated that the clip was an AI deepfake, and an investigation is ongoing.iii In the wake of the attack, at least one expert in the field of AI deepfakes said that the clip is likely a deepfake, citing “distinct signs of digital splicing; this may be the result of several individual clips being synthesized separately and then combined.”iv 

And right there is the issue. It takes expert analysis to clinically detect if an audio clip is an AI deepfake. 

What makes audio deepfakes so hard to spot?  

Audio deepfakes give off far fewer clues, as compared to the relatively easier-to-spot video deepfakes out there. Currently, video deepfakes typically give off several clues, like poorly rendered hands and fingers, off-kilter lighting and reflections, a deadness to the eyes, and poor lip-syncing. Clearly, audio deepfakes don’t suffer any of those issues. That indeed makes them tough to spot. 

The implications of AI audio deepfakes online present themselves rather quickly. At a time when general awareness of AI audio deepfakes lags behind the availability and low cost of deepfake tools, people are more prone to believe an audio clip is real. Until “at home” AI detection tools become available to everyday people, skepticism is called for.  

Just as “seeing isn’t always believing” on the internet, “hearing isn’t always believing” applies on the internet as well. 

How to spot audio deepfakes. 

The people behind these attacks have an aim in mind. Whether it’s to spread disinformation, ruin a person’s reputation, or conduct some manner of scam, audio deepfakes look to do harm. In fact, that intent to harm is one of the signs of an audio deepfake, among several others. 

Listen to what’s actually being said. In many cases, bad actors create AI audio deepfakes designed to build strife, deepen divisions, or push outrageous lies. It’s an age-old tactic. By playing on people’s emotions, they ensure that people will spread the message in the heat of the moment. Is a political candidate asking you not to vote? Is a well-known public figure “caught” uttering malicious speech? Is Taylor Swift offering you free cookware? While not an outright sign of an AI audio deepfake alone, it’s certainly a sign that you should verify the source before drawing any quick conclusions. And certainly before sharing the clip. 

Think of the person speaking. If you’ve heard them speak before, does this sound like them? Specifically, does their pattern of speech ring true, or does it pause in places it typically doesn’t … or speak more quickly or slowly than usual? AI audio deepfakes might not always capture these nuances. 

Listen to their language. What kind of words are they saying? Are they using vocabulary and turns of phrase they usually don’t? An AI can duplicate a person’s voice, yet it can’t duplicate their style. A bad actor still must write the “script” for the deepfake, and the phrasing they use might not sound like the target. 

Keep an ear out for edits. Some deepfakes stitch audio together. AI audio tools tend to work better with shorter clips, rather than feeding them one long script. Once again, this can introduce pauses that sound off in some way and ultimately affect the way the target of the deepfake sounds. 

Is the person breathing? Another marker of a possible fake is when the speaker doesn’t appear to breathe. AI tools don’t always account for this natural part of speech. It’s subtle, yet when you know to listen for it, you’ll notice it when a person doesn’t pause for breath. 

Living in a world of AI audio deepfakes. 

It’s upon us. Without alarmism, we should all take note that not everything we see, and now hear, on the internet is true. The advent of easy, inexpensive AI tools has made that a simple fact. 

The challenge that presents us is this — it’s largely up to us as individuals to sniff out a fake. Yet again, it comes down to our personal sense of internet street smarts. That includes a basic understanding of AI deepfake technology, what it’s capable of, and how fraudsters and bad actors put it to use. Plus, a healthy dose of level-headed skepticism. Both now in this election year and moving forward. 

[i] https://www.theguardian.com/technology/2024/jan/12/deepfake-video-adverts-sunak-facebook-alarm-ai-risk-election

[ii] https://www.bloomberg.com/news/articles/2023-09-29/trolls-in-slovakian-election-tap-ai-deepfakes-to-spread-disinfo

[iii] https://www.baltimoresun.com/2024/01/17/pikesville-principal-alleged-recording/

[iv] https://www.scientificamerican.com/article/ai-audio-deepfakes-are-quickly-outpacing-detection/

The post How to Spot AI Audio Deepfakes at Election Time appeared first on McAfee Blog.
