When Data Becomes the Mission
Why state and local government, cities, and research universities are reorganizing infrastructure around data itself

If you remember one thing from this article: infrastructure used to organize around applications. Increasingly, it now organizes around data.

If you spend enough time around enterprise infrastructure, you start to notice something about how conversations begin. Someone asks about storage. Not in a philosophical way. In a practical way. How much capacity do we have left? What’s the refresh cycle? Is this staying on premises or moving to cloud? What’s the backup strategy?

For years, that framing made perfect sense. Infrastructure was the foundation, and the job of infrastructure teams was to keep the lights on and the foundation solid. But lately, in conversations with customers across state and local government, cities, and universities, something feels different. Because eventually someone says something like this:

“We have this data… but we can’t actually use it.”

And that is when the real conversation begins.

Why the public sector reveals the truth about data

There’s a perspective I heard recently that stuck with me. The public sector isn’t a niche market. It’s a microcosm of the entire enterprise technology world.

At first that sounds counterintuitive. The stereotype is that government IT has been quietly living under a rock since the previous century, next to a beige server and a stack of COBOL manuals. But if you look closely, the opposite is true. State agencies, cities, and research institutions operate in environments that combine nearly every architectural challenge the private sector faces, all at once:

Massive datasets
Highly distributed users
Strict security requirements
Long retention policies
Global collaboration
And an absolute requirement that systems remain available when people need them most

In other words, the public sector experiences the full spectrum of data challenges simultaneously.
If you want to stress-test a data architecture, put it inside government.

Think about it. A state government may run thousands of systems across dozens of agencies, each serving different missions but increasingly sharing the same underlying data. A city manages infrastructure at the physical edge of society (traffic, water, SCADA, emergency services) where real-time decisions depend on accurate information. Universities generate some of the largest research datasets on earth while collaborating across institutions and countries.

Each of these environments demands something slightly different from infrastructure. But they all demand the same thing from data: Security. Integrity. Mobility. Context. Availability.

And when those requirements collide in one environment, something interesting happens. The solutions that work there tend to work everywhere.

A laboratory for the modern data enterprise

This is why many technology leaders quietly view the public sector as something more than a vertical market. It’s a laboratory for enterprise-scale data architecture. If a platform can operate in a world where:

sensitive personal data must remain protected
systems span thousands of locations
regulatory oversight is constant
and uptime has real public consequences

…then that architecture will almost certainly succeed in commercial environments. Banks, manufacturers, healthcare providers, and global enterprises face the same challenges, just rarely all at once. Government simply compresses those problems into a single environment. Solve the data problem for government, and you solve it for the enterprise.

That’s one reason the shift toward data-centric platforms is becoming so important. When organizations treat infrastructure as a place to store files, they solve only a small part of the problem.
But when they treat data as the central operational asset, something that must be understood, governed, protected, and made usable across environments, the architecture begins to look very different. And the public sector, with all its complexity, becomes the place where those architectures are tested first.

Which brings us back to the shift we’re seeing across the industry. Because once you start looking at infrastructure through the lens of data itself, something else becomes obvious. The center of gravity has moved. When multiple systems depend on the same dataset, the data becomes part of the operating foundation. And once that happens, moving it, or even restructuring it, becomes dramatically harder. Which brings us to the concept that explains a lot of what is happening right now.

The quiet physics of data gravity

The first time I heard the term “data gravity” wasn’t in a conference keynote or a vendor presentation. It was in 2015, when a recruiter from a startup called DataGravity reached out and asked if I would be interested in interviewing. At the time, the idea sounded fascinating, and slightly theoretical. The company was built around the premise that data itself was becoming the most valuable asset in the data center, and that infrastructure needed to understand the content, context, and behavior of data, not just store it. The name alone hinted at something deeper: the idea that as datasets grow, they start exerting a kind of gravitational pull on the systems around them.

Back then, it felt like an interesting concept. Today it feels like a description of reality.

The term “data gravity” itself was introduced by Dave McCrory on his blog back in 2010, and it turns out to be a remarkably accurate way to describe modern infrastructure. The idea is simple. As datasets grow, they become harder to move. More applications depend on them. More workflows connect to them. More policies govern them.
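The gravitational analogy can even be sketched as a toy model. To be clear, the formula and every number below are illustrative assumptions that loosely echo the physics metaphor; this is not McCrory's published metric or any real capacity-planning tool:

```python
# Toy "data gravity" score, loosely echoing the gravitational analogy:
# pull ~ (data mass x workload activity) / distance^2, with network
# latency standing in for distance. Formula and numbers are illustrative.

def data_gravity(data_gb: float, requests_per_sec: float, latency_ms: float) -> float:
    """Relative pull a dataset exerts on a workload."""
    return (data_gb * requests_per_sec) / (latency_ms ** 2)

# The same hypothetical 500 TB benefits dataset, seen by a co-located
# workload (2 ms away) and by one running in a distant region (40 ms):
local = data_gravity(500_000, 2_000, latency_ms=2)
remote = data_gravity(500_000, 2_000, latency_ms=40)

print(f"local pull : {local:,.0f}")
print(f"remote pull: {remote:,.0f}")
print(f"ratio      : {local / remote:.0f}x")
```

Because the latency term is squared, the co-located workload feels this dataset 400 times more strongly than the remote one, which is a compact way of saying what the rest of this section says in prose: compute migrates toward the data, not the other way around.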
Eventually, the architecture starts organizing around the data itself. Not because someone designed it that way. Because the physics of large systems leave you very little choice.

Imagine trying to relocate a state Medicaid dataset that has been integrated with multiple benefit programs, identity verification systems, and fraud detection tools. Technically possible? Sure. Operationally trivial? Not even close. The larger and more interconnected the dataset becomes, the stronger its gravitational pull. Compute moves closer to the data. Applications move closer to the data. Infrastructure reorganizes around the data.

This is why organizations that once talked primarily about storage capacity are now talking about data platforms. The center of gravity moved.

When data stops being passive

The moment data becomes operational, everything changes. For years, most organizations treated data as something that accumulated quietly inside systems. Applications produced it. Storage kept it safe. Backups made sure it could be restored. But that model starts to break down when the data itself becomes part of real-time decision making.

You can see this most clearly in environments that generate enormous volumes of information. Cities now run infrastructure that continuously streams telemetry: traffic sensors, utility meters, environmental monitors, emergency response platforms. A water meter that once reported usage once a month might now generate thousands of readings per year. A traffic system that once relied on static timing can adapt dynamically to real-time conditions. Each improvement creates more data. More importantly, it creates operational dependence on that data.

Universities experience the same phenomenon in a different form. Research environments produce extraordinary datasets across genomics, climate science, and artificial intelligence.
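To get a feel for the scale involved, here is a back-of-the-envelope sketch. The per-genome size, program throughput, and cluster figures are illustrative assumptions, not numbers from any specific institution:

```python
# Back-of-the-envelope volumes for a research sequencing pipeline.
# All inputs are illustrative assumptions, not real program figures.

GB_PER_GENOME = 100        # raw data per sequenced human genome (approx.)
GENOMES_PER_WEEK = 500     # a hypothetical mid-sized sequencing program
GPU_NODES = 8              # compute nodes consuming that data
GB_PER_SEC_PER_NODE = 5    # sustained read rate one node can ingest

weekly_tb = GB_PER_GENOME * GENOMES_PER_WEEK / 1_000
required_gbps = GPU_NODES * GB_PER_SEC_PER_NODE

print(f"new data per week: {weekly_tb:.0f} TB")
print(f"a petabyte roughly every {1_000 / weekly_tb:.0f} weeks")
print(f"storage must sustain ~{required_gbps} GB/s or the compute idles")
```

Even with these modest assumed numbers, the pipeline adds tens of terabytes a week and demands tens of gigabytes per second of sustained delivery, which is why the bottleneck shifts from storing the data to feeding it.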
Sequencing a single human genome generates roughly 100 gigabytes of raw data, and large research programs may create terabytes or petabytes of new information every week. In those environments the challenge isn’t just storing data. It’s feeding it fast enough to the systems that depend on it. Modern research clusters and GPU environments can process enormous volumes of information, but only if the underlying data pipeline keeps up. When storage cannot deliver data fast enough, expensive compute resources sit idle and discovery slows down.

And that reveals an important truth about modern infrastructure. When systems depend on data in real time, the question stops being where the infrastructure lives. The question becomes whether the data is available, trustworthy, and recoverable.

That distinction also explains why ransomware has become so disruptive to public institutions. Attackers understand that the real leverage is not the servers or the network. It’s the data. When access to data disappears, the services built on top of it disappear as well.

Which brings us back to the deeper shift happening across the industry. If data has become this central to operations, services, and discovery, then managing it as a passive byproduct of infrastructure is no longer enough. Infrastructure alone is no longer the strategic layer. The strategic layer is the data itself. Organizations still need performance, availability, and resilience. Those fundamentals have not changed. What has changed is the expectation that infrastructure should also help organizations understand, govern, protect, and use their data more effectively. That is a very different problem than simply storing it. And it is the reason the conversation is evolving from storage management to data management platforms.

The real punch line

Public sector organizations didn’t set out to become data enterprises. Over time the data accumulated. Then the dependencies formed.
And eventually everything started orbiting the datasets that mattered most.

Data has gravity. Data has risk. Data has power.

Infrastructure still matters. But increasingly, the real mission is something else entirely. The mission is the data.

Appreciate you reading.

Dmitry Gorbatov
© 2025 Dmitry Gorbatov | #dmitrywashere

Spring is Calling, and so is Reds Baseball
I don't know about you, but I am more than ready for Spring; though I could definitely skip the rain. Wiping muddy dog paws after every walk is getting old! On the bright side, who else is ready for some Reds baseball?

I have a few exciting updates and resources to share with the community:

🚀 PUG Meeting Update
charles_sheppar and I are currently hard at work on the next PUG meeting. Details to come.

🛡️ Strengthening Your Cyber Resilience
Given the current geopolitical climate and the rise in cyber threats, now is the perfect time to audit your data protection. Features like SafeMode and Pure1 Security Assessments act as a resilient last line of defense. If you want to see these tools in action, we recently hosted an expert-led demo on building a foundation for cyber resilience. Watch the recording here: https://www.purestorage.com/video/webinars/the-foundations-of-cyber-resilience/6389889927112.html
Questions? Reach out to your Everpure SE or partner for a deeper dive.

📅 Upcoming Events

March 12: Nutanix Webinar
Exploring virtualization alternatives? Nutanix is hosting a session tomorrow focused on simplifying IT operations and highlighting the Everpure partnership. https://event.nutanix.com/simplifyitandonprem

March 19: Virtualization Reimagined
Or perhaps you're interested in running virtual machines alongside containerized workloads within K8s clusters. If that's the case, join Greg McNutt and Sagar Srinivasa for Virtualization Reimagined: Inside the Everpure Journey. https://www.purestorage.com/events/webinars/virtualization-reimagined.html

March 19: Ask Us Everything About Storage for Databases
Join experts Anthony Nocentino, Ryan Arsenault, and Don Poorman for a live Q&A session. https://www.purestorage.com/events/webinars/ask-us-everything-about-storage-for-databases.html

March 24: Presets & Workloads for Consistent DB Environments
We’re extending the database conversation to discuss how Everpure helps you transition from "managing storage" to "managing data" through automated presets. https://www.purestorage.com/events/webinars/presets-and-workload-setups-for-consistent-database-environments.html

Level Up Your Virtualization Game
The virtualization landscape doesn’t stop evolving, and neither should your strategy. Join us at Topgolf King of Prussia on April 15 for an in-person Everpure User Group session featuring Cody Hosterman, Senior Director – Product Management. Cody will walk through the latest shifts across core virtualization technologies and what they mean for your environment today and tomorrow. This interactive, technical conversation is designed for practitioners and architects who want real-world guidance, not just slideware.

What We’ll Cover

Cody will break down what’s new, what’s next, and what actually matters across:

VMware & Hyper-V – Current state, roadmap signals, and practical considerations
OpenShift & OpenStack – Where they fit, how they’re evolving, and key design decisions
New Entrants & Emerging Platforms – Who’s worth watching and why

Expect candid discussion, best practices, and plenty of time for Q&A with your peers and the Everpure team. Register here to join in!

We are just one week away from PUG #3
On January 28th, the Cincinnati Pure User Group will be convening at Ace's Pickleball to discuss Enterprise File. We will be joined by Matt Niederhelman, Unstructured Data Field Solutions Architect, who will help guide the conversation and answer questions about what he is seeing among other customers. Click the link below to register and come join us. Help us guide the conversation with your ideas for future topics.

https://info.purestorage.com/2025-Q4AMS-COMREPLTFSCincinnatiPUG-LP_01---Registration-Page.html

Who's using Pure Protect?
Hey everyone,

Just wondering if anyone else is using Pure Protect yet. We have gone through the quick start guide and have a VMware-to-VMware configuration set up. We have configured our first policy and group using a test VM, but it seems to be stuck in the protection phase. I would be very interested to hear what others have seen or experienced.

-Charles

Pittsburgh PUG - Launch Party @ River's Casino Drum Bar
You’re Invited! PUG - Time to Launch

REGISTER NOW -->

Celebrating Pure Storage + Nutanix + Expedient

Join us for a special Pure User Group event as we celebrate the launch of two of the industry’s most loved technologies: Pure Storage and Nutanix are coming together in a powerful new way. Even better: Expedient becomes the FIRST Cloud Service Provider to bring this combined solution to market. This event is all about bringing our Pittsburgh-area community together to learn, connect, and celebrate a major milestone in the hybrid cloud and on-prem cloud ecosystem.

What You’ll Experience

A deep dive into the new Pure Storage + Nutanix integration
How Expedient is delivering it as a fully managed cloud service
Real-world use cases for cloud-smart modernization
Customer-driven conversation, not vendor slides
Networking with peers, experts, and the local PUG community
Food, drinks, and launch-party fun

Why This Matters

This three-way partnership brings customers:

NVMe-fast, always-on performance
Effortless scalability and hybrid cloud freedom
A cloud service built for simplicity and resiliency
Lower operational overhead: no firefighting, no forklift upgrades

It’s the stack that “just works,” so your teams can focus on innovation instead of maintenance.

REGISTER NOW -->

Pure User Group | Denver: Winter Meetup Sponsored by Nutanix
Don’t miss our Pure User Group (PUG) Winter Meetup, now sponsored by Nutanix, on Thursday, October 16th from 3:00–5:00 PM MT at Tavern 242 in Morrison, CO. Whether you’re a longtime Pure Storage customer or just joining the community, this is your chance to connect with fellow users, hear about the future of Pure Storage, and discover what’s next in the world of hypervisors, all while enjoying local brews and getting your skis waxed and prepped for the season.

What to Expect:

Learn about Pure Storage’s vision for next-generation hypervisors: how we’re approaching modern virtualization, empowering choice and agility, and driving new capabilities for data resilience and performance.
Special introduction to Nutanix’s hypervisor technology: discover how Pure Storage and Nutanix are partnering to enable seamless, high-performance virtualization solutions for organizations of every size.
Enjoy drinks, snacks, and open networking with Pure Storage and Nutanix experts, PLUS on-site ski waxing and prep to get you ready for winter on the slopes.

In partnership with: Nutanix

Date & Time
October 16, 2025, 3:00 PM – 5:00 PM MT

Location
Tavern 242
4285 S. Eldridge St., Unit A 09
Morrison, CO 80465

Register Now!

The Spokane Pure User Group is coming Sept. 22nd
Join us for a dynamic Pure Storage Customer Event on September 22 at Uprise Brewing Co! This exclusive gathering brings together Pure Storage experts, featured speakers, and fellow customers for an afternoon packed with learning, networking, and fun.

Explore our featured talk tracks:

Accelerate Updates: Get an inside look at the latest enhancements to Pure Accelerate. Our experts will share what’s new, showcase the most impactful features, and provide guidance on how to take advantage of these updates to drive innovation and boost operational efficiency within your organization.

Virtualization & Cloud: Dive deep into best practices for optimizing your virtualization stacks and seamlessly integrating Pure Storage with today’s cloud environments. Learn strategies for simplifying management, ensuring data protection, and enabling flexible, high-performance hybrid cloud solutions.

Space is limited, so mark your calendar for September 22, 3:00 PM – 5:30 PM, and get ready for an engaging, informative event you won't want to miss. We look forward to seeing you at Uprise Brewing Co. Register Now!

How can I help?
Hey Pure Community,

I’m Brian Heck, and I get to be your Senior Systems Engineer if you’re in Alaska, Washington, or Oregon (yup, that’s me, the person behind the emails!). I’m part of the SLED team at Pure, which just means I spend my days helping schools, local governments, and all sorts of public organizations, like Tribal ones, make the most of their data.

I’ve always believed the secret sauce at Pure isn’t just our tech (though I’m pretty biased about how rock solid flash storage is). It’s actually the people on our team and in this community. There’s zero ego, just lots of curiosity and a drive to solve real problems for real folks. So if you ever just want to ask a question, vent about a challenge, or swap stories about your favorite upgrades, trust me, you’re speaking my language.

If you’re wondering:

What’s it really like working at Pure (spoiler: the culture here rocks)
What new tech or trends I’m excited about in storage, cloud, or AI
How to get the most out of your SE team (or what the heck SEs actually do behind the scenes)

…please shout! I love sharing tips, diving into rabbit holes, and figuring out better ways to do things together. I’m always up for good conversation, honest feedback, or brainstorming sessions, whether it’s in the forums or over coffee (virtual or real).

This community means a lot to me, and I’d really love to hear your stories, see your questions, and learn from your experiences. I’ve been a part of VMUG for quite a while, so things like this are my jam. I’ll do my best to share the good stuff: tech advice, a peek at life at Pure, and maybe a few dad jokes if you’re lucky.

How can I help? What’s on your mind?