FEDORA-MODULAR-2022-eb8fed52d8
Packages in this update:
mariadb-10.5-3420220428160949.058368ca
mariadb-10.5-3620220428160949.5e5ad4a0
Update description:
MariaDB 10.5.15 & Galera 26.4.11
This is rare:
A giant squid about three meters long was found stranded on a beach here on April 20, in what local authorities said was a rare occurrence.
At around 10 a.m., a nearby resident spotted the squid at Ugu beach in Obama, Fukui Prefecture, on the Sea of Japan coast. According to the Obama Municipal Government, the squid was still alive when it was found. It is unusual for a giant squid to be washed ashore alive, officials said.
The deep-sea creature will be transported to Echizen Matsushima Aquarium in the prefectural city of Sakai.
Sadly, I do not expect the giant squid to survive, certainly not long enough for me to fly there and see it. But if any Japanese readers can supply more information, I would very much appreciate it.
BoingBoing post: https://boingboing.net/2022/04/26/giant-squid-rescued-from-obama-in-japan.html. Video.
As usual, you can also use this squid post to talk about the security stories in the news that I haven’t covered.
Read my blog posting guidelines here.
Google said this week it is expanding the types of data people can ask to have removed from search results, to include personal contact information like your phone number, email address or physical address. The move comes just months after Google rolled out a new policy enabling people under the age of 18 (or a parent/guardian) to request removal of their images from Google search results.
Google has for years accepted requests to remove certain sensitive data such as bank account or credit card numbers from search results. In a blog post on Wednesday, Google’s Michelle Chang wrote that the company’s expanded policy now allows for the removal of additional information that may pose a risk for identity theft, such as confidential log-in credentials, email addresses and phone numbers when they appear in Search results.
“When we receive removal requests, we will evaluate all content on the web page to ensure that we’re not limiting the availability of other information that is broadly useful, for instance in news articles,” Chang wrote. “We’ll also evaluate if the content appears as part of the public record on the sites of government or official sources. In such cases, we won’t make removals.”
Google says a removal request will be considered if the search result in question includes the presence of “explicit or implicit threats” or “explicit or implicit calls to action for others to harm or harass.” The company says if it approves your request, it may respond by removing the provided URL(s) for all queries, or for only queries including your name.
While Google’s removal of a search result from its index will do nothing to remove the offending content from the site that is hosting it, getting a link decoupled from Google search results is going to make the content at that link far less visible. According to recent estimates, Google enjoys somewhere near 90 percent market share in search engine usage.
KrebsOnSecurity decided to test this expanded policy with what would appear to be a no-brainer request: I asked Google to remove search results for BriansClub, one of the largest (if not THE largest) cybercrime stores for selling stolen payment card data.
BriansClub has long abused my name and likeness to pimp its wares on the hacking forums. Its homepage includes a copy of my credit report, Social Security card, phone bill, and a fake but otherwise official-looking government ID card.
BriansClub updated its homepage with this information in 2019, after it got massively hacked and a copy of its customer database was shared with this author. The leaked data — which included 26 million credit and debit card records taken from hacked online and brick-and-mortar retailers — was ultimately shared with dozens of financial institutions.
TechCrunch writes that the policy expansion comes six months after Google started allowing people under 18, or their parents, to request the deletion of their photos from search results. To do so, users need to specify that they want Google to remove “Imagery of an individual currently under the age of 18” and provide some personal information, the image URLs, and the search queries that would surface the results. Google also lets you submit requests to remove non-consensual explicit or intimate personal images, along with involuntary fake pornography, TechCrunch notes.
This post will be updated in the event Google responds one way or the other, but that may take a while. Google’s automated response said: “Due to the preventative measures being taken for our support specialists in light of COVID-19, it may take longer than usual to respond to your support request. We apologize for any inconvenience this may cause, and we’ll send you a reply as soon as we can.”
Cyber-attack on booking system exposes personal data of thousands of high-end hotel guests
Bioeconomy ISAC and New York Metro InfraGard Members Alliance announce partnership
Texas school district employee caught mining cryptocurrency at school quits job
If you are worried about the financial hit of paying a ransom to cybercriminals, wait until you find out the true cost of a ransomware attack.
Read more in my article on the Tripwire State of Security blog.
New research: “Are You Really Muted?: A Privacy Analysis of Mute Buttons in Video Conferencing Apps”:
Abstract: In the post-pandemic era, video conferencing apps (VCAs) have converted previously private spaces — bedrooms, living rooms, and kitchens — into semi-public extensions of the office. And for the most part, users have accepted these apps in their personal space, without much thought about the permission models that govern the use of their personal data during meetings. While access to a device’s video camera is carefully controlled, little has been done to ensure the same level of privacy for accessing the microphone. In this work, we ask the question: what happens to the microphone data when a user clicks the mute button in a VCA? We first conduct a user study to analyze users’ understanding of the permission model of the mute button. Then, using runtime binary analysis tools, we trace raw audio in many popular VCAs as it traverses the app from the audio driver to the network. We find fragmented policies for dealing with microphone data among VCAs — some continuously monitor the microphone input during mute, and others do so periodically. One app transmits statistics of the audio to its telemetry servers while the app is muted. Using network traffic that we intercept en route to the telemetry server, we implement a proof-of-concept background activity classifier and demonstrate the feasibility of inferring the ongoing background activity during a meeting — cooking, cleaning, typing, etc. We achieved 81.9% macro accuracy on identifying six common background activities using intercepted outgoing telemetry packets when a user is muted.
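To make the attack concrete, here is a minimal sketch in Python of the kind of classifier the authors describe: it trains an off-the-shelf model on per-window audio statistics of the sort a muted app might ship home as telemetry. The feature set, the synthetic data, and the model choice are my own assumptions for illustration, not the paper’s actual pipeline.

import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import f1_score

# Hypothetical stand-in for labeled telemetry intercepted en route to the
# server: four coarse audio statistics per window (e.g., RMS energy,
# zero-crossing rate, spectral centroid, spectral flatness).
ACTIVITIES = ["cooking", "cleaning", "typing", "music", "speech", "silence"]
rng = np.random.default_rng(0)
X = rng.random((1200, 4))                   # synthetic feature windows
y = rng.integers(0, len(ACTIVITIES), 1200)  # synthetic activity labels

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
clf = RandomForestClassifier(n_estimators=200, random_state=0)
clf.fit(X_tr, y_tr)

# Macro-averaged score over the six activities. On random noise this sits
# near chance; the paper reports 81.9% macro accuracy on real intercepted
# telemetry, which is exactly the point of the attack.
print("macro F1:", round(f1_score(y_te, clf.predict(X_te), average="macro"), 3))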
The paper will be presented at PETS this year.
News article.
converseen-0.9.8.1-2.el8
digikam-6.4.0-5.el8
dvdauthor-0.7.2-16.el8
ImageMagick-6.9.12.44-1.el8
ImageMagick 6.9.12.x with a bunch of security fixes