Everything Ransomware: Ransomware Live
Check it out! Really interesting tracking of everything related to ransomware: https://www.ransomware.live/

Ransomware Live is a real-time intelligence site tracking active ransomware groups, victims, leaks, and extortion activity, helping security teams monitor threats, trends, and attacker behavior worldwide.

Ransomware attacks are NOT going away
Here is why ransomware attacks are persistent and unlikely to disappear:

1. High Profitability and Low Risk for Criminals

Ransomware is fundamentally a business model for organized crime, and it is overwhelmingly successful and profitable.

- Low Barrier to Entry: The rise of Ransomware-as-a-Service (RaaS) means even novice criminals can purchase sophisticated malware and infrastructure. This franchise model ensures high attack volume regardless of law enforcement efforts.
- Guaranteed Revenue Stream: The evolution to multi-extortion (encrypting data and stealing it) ensures that victims are pressured to pay, either to regain system access or to prevent catastrophic data leaks and regulatory fines. This dual leverage preserves profit even if the victim has backups.
- Anonymity: The use of cryptocurrency for payments, coupled with geopolitical safe zones for many RaaS groups, keeps the risk of prosecution extremely low for attackers.

2. Attackers Are Outpacing Traditional Defenses

The tactics used by ransomware groups are specifically designed to neutralize traditional defense and recovery measures:

- Targeting the Supply Chain: Attackers are finding success by targeting trusted vendors and IT providers to compromise dozens of companies simultaneously, making defense exponentially harder for individual organizations.
- Attacking Backups: Modern ransomware campaigns specifically target accessible backups to delete or infect them, eliminating the victim's recovery option and forcing payment of the ransom.
- AI for Stealth and Speed: The adoption of AI is accelerating reconnaissance and stealth, dramatically compressing the time between initial network access and payload deployment. Attackers can move faster than human defenders can react.

3. Cyber Resilience Is the New Standard

The industry has shifted its mindset from pursuing absolute prevention (which is impossible) to guaranteeing resilience. This shift acknowledges the persistence of ransomware.
The focus is now on ensuring organizations can:

- Anticipate and detect threats early (low MTTD)
- Withstand the attack without immediate operational collapse
- Recover guaranteed clean data within minutes (low MTTR)

Ransomware will not disappear until the criminal model becomes unprofitable, and current data shows that attackers are highly successful and rapidly adapting their strategies.

Using Azure Backup with SQL Server on Azure VM with PSC Dedicated volume
This post details a brief test of Azure Backup for SQL databases. The objective was to evaluate Azure's native backup solution when the SQL Server database, residing on an Azure Virtual Machine, had its data files stored on a Pure Storage Cloud (PSC) Dedicated volume. As a disclaimer, many variations of this test can be run and results may vary; the intention here is only to record my recent experience with the technology.

The setup

I set up the following environment for the test:

- PSC Dedicated V20MP2R2 deployed into the West Europe region
- Azure VM Standard D8ds v5 (8 vCPUs, 32 GiB memory) running Windows Server 2019 Datacenter gen 2
- MS SQL Server 2019 Evaluation Edition installed manually on the VM

A connection was made between the PSC Dedicated and the SQL VM, with MPIO enabled on the VM and one iSCSI session established to each controller. On the PSC Dedicated, I mounted a 1 TB volume to the host, which I then formatted to NTFS with a 64K allocation unit size and mounted as drive F: on the Windows OS. To populate the SQL Server with test databases, I created an empty testDB and downloaded two sample databases from the Microsoft site (AdventureWorks and AdventureWorksLT). The PSC Dedicated volume (F:) was used as the Data Directory of the SQL Server.

On the Azure side, a Recovery Services vault (via Backup and Site Recovery from the Marketplace) was created and the VM running the SQL Server discovered. The service installs an agent on the VM, enabling it to discover SQL database instances. Furthermore, an NT Service\AzureWLBackupPluginSvc account, used for orchestration of the backup, is created on the SQL Server side.

The backup

For the purpose of the test, I set only a basic policy for all our databases. All three were discovered without issues (along with the default ones). Assigning backup policies to databases creates backup items. These can be reviewed on the Azure Portal.
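Before trusting any backup item, it's worth rehearsing the verify-by-restore pattern that the rest of this post walks through against Azure Backup. As a minimal, purely illustrative stand-in, here is the same pattern using Python's stdlib sqlite3 in place of SQL Server and Azure Backup (the table and names are hypothetical):

```python
import sqlite3

def snapshot(source: sqlite3.Connection) -> sqlite3.Connection:
    """Copy the live database into an in-memory 'backup' connection."""
    backup = sqlite3.connect(":memory:")
    source.backup(backup)
    return backup

# A live database with a small sample table (stand-in for testDB).
live = sqlite3.connect(":memory:")
live.execute("CREATE TABLE people (name TEXT)")
live.executemany("INSERT INTO people VALUES (?)", [("Alice",), ("Bob",)])
live.commit()

backup = snapshot(live)  # manual backup capturing the current state

live.execute("DELETE FROM people WHERE name = 'Bob'")  # bye, Bob
live.commit()
assert ("Bob",) not in live.execute("SELECT name FROM people").fetchall()

backup.backup(live)  # restore: copy the backup back over the live DB
names = [row[0] for row in live.execute("SELECT name FROM people ORDER BY name")]
assert names == ["Alice", "Bob"]  # Bob is back
```

The principle is the same regardless of the backup technology: insert known data, back up, destroy the data, restore, and assert that the data comes back.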
Similarly for backup jobs, navigating to the vault resources allows you to review the type of operation (configuration, type of backup, etc.) and status. In this case, all completed as scheduled (including an additional manual backup created separately). Another place where it is possible to review the state of the backup is the Azure Business Continuity Center.

The restore

Testing the restore is a crucial part of any backup; without that, all we have is a Schrödinger's backup - it might work or it might not. In the testDB I created a small sample table, containing only a few names as examples. I ran a manual backup of the testDB to capture its current state. Then I deleted an item. Bye, Bob. Confirmed Bob is gone.

Afterwards, I ran the restore operation, selecting the manual backup created in the previous step. The restore operation was triggered, and after successful completion, Bob was back. The restore can also be confirmed in the Azure Portal.

The summary

The testing recorded above indicates that Azure Backup for SQL Server running on an Azure VM can be used to discover databases within the SQL instance and help with setting up protection, even if the data resides on external storage such as Pure Storage Cloud Dedicated. As mentioned in the disclaimer at the beginning of this post, however, there may be scenarios that show different results, so always run tests before committing any decision to production environments.

MFA Downgrade Attacks: Good to know.
Short article on MFA downgrade attacks; it covers the basics of what they are and how to defend against them. Good to know when considering your own policies and processes for when folks lose devices.

https://www.scworld.com/perspective/why-mfa-downgrade-attacks-could-be-the-next-ai-security-crisis

Pure Storage Cloud: Run It Your Way or Fully Managed
November 6 | Register Now!

Cloud shouldn't limit your options, whether it's private, public, or hybrid. It's your data. It should be your choice to move it, and your choice to determine where your workloads run. See how Pure Storage Cloud gives you the freedom to choose your management model, from customer-managed services that let you run it your way to a completely seamless, fully managed solution. Join us as we explore the Pure Storage Cloud portfolio, which brings the performance and agility of Pure Storage everywhere, giving you the power to choose.

This session will explore:

- Efficient storage options for your workloads, no matter where they run
- The freedom that Pure Storage gives you for workload mobility
- The latest options for letting Pure Storage do the heavy lifting in the public cloud

Register Now!

💡 New Pure360 Walkthrough: Using FlashArray File Services as a Veeam Backup Repository
Hey everyone! I've got a new Pure360 technical walkthrough that answers a question we hear a LOT:

"Can I use an SMB share from FlashArray as a Veeam backup repository?"

✅ Short answer: Yes, you can. And in this demo, we show exactly how to set it up, step by step.

You'll see how to:

- Configure FlashArray File Services to present an SMB share
- Create the right export and quota policies
- Add that share as a Veeam Backup & Replication repository
- Verify your configuration by running a backup job and seeing data written directly to FlashArray

It's a quick but detailed walkthrough for anyone managing Veeam environments who wants to take advantage of FlashArray's performance and simplicity.

🎥 Check out the full video on Pure360 to see the process in action.

Have you set up FlashArray File Services as a Veeam repo in your own environment? Drop your experience or tips below; we'd love to hear how you're integrating Pure with your backup workflows!

-Jason

🔶 New Pure360 Demo Video: Why Architecture Matters for Your Data Protection Strategy
In the world of enterprise IT, protecting your organization's data isn't getting any easier. Between sprawling virtual environments, hybrid cloud decisions, and the constant evolution of threats, stitching together point solutions simply isn't enough. That's why architecture matters, especially when it comes to your storage and backup strategy.

In our latest Pure360 demo video, Principal Technologist Allynz (Zane Allyn) dives into how Pure Storage and Veeam create a unified architecture that transforms the way you think about data protection on VMware.

You'll learn how the tight integration between Pure, Veeam, and VMware enables:

- Smarter snapshot and backup orchestration
- Instant, application-consistent recovery across multiple scenarios
- High-performance operations that don't impact production workloads
- Seamless extension of data protection into your DR site

This isn't about tools; it's about an architecture built for resilience, automation, and efficiency across your environment.

🎥 Watch the full video now on the Pure360 demo site.

Join the discussion below: how is your team rethinking data protection architecture in the face of modern challenges?

-Jason

When Security Awareness Fails: Effective Cyber Recovery
October 30 | Register Now!

Ponemon Institute research on cyber incidents reveals that recovery times are measured not in hours but in days, with organizations frequently suffering data loss. Despite the best efforts of cybersecurity awareness programs and security investments, organizations still need to strengthen their recovery strategies to minimize business disruption and protect data integrity.

This session will explore:

- How organizations should continue investing in cybersecurity
- Why measure cyber recovery in minutes and hours vs. days
- Using new cyber-resilient storage capabilities to speed up recovery
- The cross-functional and cross-technology cyber recovery process, and the importance of rehearsals and testing

Register Now!
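The minutes-vs-days framing above can be made concrete: recovery metrics such as MTTR are just averages over incident timestamps. A minimal sketch (the incident records below are hypothetical, purely for illustration):

```python
from datetime import datetime
from statistics import mean

# Hypothetical incident records: (detected_at, recovered_at).
incidents = [
    (datetime(2025, 3, 1, 9, 0), datetime(2025, 3, 1, 9, 45)),    # 45 minutes
    (datetime(2025, 3, 7, 14, 0), datetime(2025, 3, 7, 16, 30)),  # 2.5 hours
    (datetime(2025, 3, 20, 8, 0), datetime(2025, 3, 21, 8, 0)),   # a full day
]

def mttr_hours(records) -> float:
    """Mean time to recover, in hours, across all incidents."""
    return mean((end - start).total_seconds() / 3600 for start, end in records)

print(f"MTTR: {mttr_hours(incidents):.1f} hours")
```

Note how a single day-long recovery dominates the mean, which is exactly why shaving the worst-case recovery down to minutes matters more than polishing the best case.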