Noninvasive Data Governance: Simplifying Compliance and Accountability

Successful data usage in any organization requires a strong data governance program, a responsibility shared across many roles. Robert Seiner developed a “noninvasive” approach to make data governance duties more accessible.

Often, data governance is perceived as complex and time-intensive due to management overhead. Employees may feel burdened by additional responsibilities, leading to resistance. However, Seiner’s noninvasive method reframes governance as tasks employees are already handling in an informal or less efficient way.

In Non-Invasive Data Governance Strikes Again: Gaining Experience and Perspective, Seiner explains, “People are defining, producing, and using data as part of their job. If we hold these people formally accountable, they become stewards of the data.” This approach formalizes existing roles, shifting from informal to formal accountability.

Seiner begins his book by sharing his experiences with data governance and outlining his noninvasive framework, which contrasts with two common models: command-and-control and traditional governance.

The command-and-control model is top-down: leadership assigns governance duties, which can feel like an added workload to employees. Traditional governance instead makes management responsible for engaging employees with governance tools and integrating governance roles into current workflows.

Noninvasive governance, by contrast, aligns governance responsibilities with existing tasks, reinforcing the importance of data policies without overwhelming employees.

“I suggest using a Non-Invasive Data Governance approach to formalize what’s already in place and address areas for improvement, without making it seem daunting or disruptive,” Seiner writes.

The book includes a framework that maps key governance components across organizational levels. Seiner details essential elements for a noninvasive governance strategy, including organizational roles, business value demonstration, behavior’s role in governance success, and required technology.

This is Seiner’s second book on the topic; his first, Non-Invasive Data Governance: The Path of Least Resistance and Greatest Success, was published in 2014. His latest work builds on a decade of insights gained from helping organizations implement noninvasive governance. You can read an excerpt from Chapter 1 or visit Technics Publications to learn more or purchase the book.

Peter Spotts, site editor for SearchBusinessAnalytics and SearchDataManagement, manages content on these topics. He previously worked as a writer and editor for Turley Publications in Western Massachusetts.

How NetApp Is Shaping the Future of Intelligent Data Management

To tackle business challenges and enable new applications, data alone isn’t enough; it must be managed intelligently and secured so it doesn’t become vulnerable to cyberattacks. Harv Bhela, Chief Product Officer at NetApp, made this point during his presentation at NetApp INSIGHT 2024 in Las Vegas, where he highlighted five key areas of innovation NetApp is pursuing to meet customers’ most pressing needs.

“The pace of change is rapid for our customers. They need us to innovate quickly so they can maximize the value of their data, transforming it into a competitive edge,” Bhela said.

Unified Data Storage

In the area of unified data storage, NetApp introduced its ASA A-Series storage systems, aimed at simplifying storage management and meeting block storage requirements. Additionally, NetApp enhanced its Data Infrastructure Insights service (formerly Cloud Insights) for improved monitoring and analysis of the ASA platform, offering greater visibility, optimization, and reliability for customers’ data infrastructure.

“Unified data storage is at the core of NetApp’s mission. We provide performance and design options at various price points, all built on a single architecture and platform that streamlines management for our customers,” Bhela explained.

Cloud Storage

NetApp also integrates with the major cloud providers (AWS, Google Cloud, and Microsoft Azure), supporting both on-premises and cloud-based workloads. Bhela revealed that NetApp is adding new performance tiers and price options, making it easier for customers to move a range of workloads to the cloud, from high-performance jobs to more routine tasks.

As generative AI becomes increasingly central to business, NetApp is enhancing its cloud capabilities to support customers’ AI-driven initiatives. “AI typically involves hybrid setups, with training happening in the cloud while data remains on-premises. NetApp offers a hybrid multi-cloud workflow to support these AI processes seamlessly,” Bhela added.

Anti-Ransomware

Ransomware remains a top concern across industries, Bhela observed. A security breach not only disrupts operations but also harms brand reputation. In response, NetApp has developed anti-ransomware features within its storage platform, acting as a final defense against attacks.

Bhela explained that NetApp’s storage includes ML-based models to detect anomalies that may signal ransomware attacks. When an anomaly is detected, customers can trigger workflows to stop the attack, create snapshots, or take other defensive steps.
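
Conceptually, the detect-and-respond loop Bhela describes can be pictured as a short workflow: score I/O signals, and on a hit, snapshot first to preserve a restore point, then fence the workload. The sketch below is a minimal illustration under assumed names (VolumeStats, StorageClient, the thresholds); it is not NetApp’s actual ONTAP or ARP/AI interface.

    # Minimal sketch of an anomaly-triggered defensive workflow.
    # All names and thresholds are hypothetical stand-ins, not NetApp APIs.
    from dataclasses import dataclass
    from datetime import datetime, timezone

    @dataclass
    class VolumeStats:
        """Per-interval I/O signals an anomaly detector might score."""
        volume: str
        entropy_delta: float   # jump in written-data randomness; encryption raises entropy
        rename_rate: float     # file renames per second

    class StorageClient:
        """Stub standing in for a real storage-management API."""
        def create_snapshot(self, volume: str, label: str) -> None:
            print(f"snapshot {label} created on {volume}")

        def block_client_io(self, volume: str) -> None:
            print(f"client I/O fenced on {volume}")

    def looks_like_ransomware(stats: VolumeStats) -> bool:
        # Placeholder for an ML model score; the fixed thresholds are illustrative only.
        return stats.entropy_delta > 0.4 and stats.rename_rate > 50

    def respond(stats: VolumeStats, storage: StorageClient) -> None:
        """Snapshot first to preserve a clean restore point, then stop the attack."""
        if looks_like_ransomware(stats):
            label = f"suspect-{datetime.now(timezone.utc):%Y%m%dT%H%M%S}"
            storage.create_snapshot(stats.volume, label)
            storage.block_client_io(stats.volume)

    respond(VolumeStats("vol1", entropy_delta=0.7, rename_rate=120.0), StorageClient())

Snapshotting before fencing is the point of the ordering here: a restore point taken at detection time is what later recovery workflows fall back on.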

Some attackers move slowly, encrypting data over time, which can complicate recovery. To address this, NetApp’s ONTAP Autonomous Ransomware Protection with AI (ARP/AI) solution uses automated workflows to identify reliable backups, helping customers restore clean data.

“As a data company, not a security firm, we inform customers when something is wrong, aid in recovery, and help them build workflows to minimize downtime and restore operations,” Bhela said.

Enterprise AI

Scaling AI across all workloads is challenging, especially with unstructured data. “Many customers aren’t sure which data to use, where it’s stored, or whether it includes personally identifiable information (PII),” Bhela noted.

To support responsible AI, NetApp announced several initiatives to assist enterprises on their AI journey:

  • Nvidia Certification: NetApp is undergoing Nvidia certification for its ONTAP storage on the AFF A90 platform to support large-scale AI projects.

  • Global Metadata Namespace: NetApp is creating a global metadata namespace to securely manage data across hybrid multi-cloud environments, enabling data classification and AI feature extraction.

  • AI Data Pipeline Integration: The ONTAP operating system will support automated data preparation for AI, capturing data changes, classifying and anonymizing data, and generating vector embeddings for efficient search and retrieval-augmented generation (RAG); a rough sketch of this flow follows the list.

  • Disaggregated Storage Architecture: NetApp aims to increase network and flash speed utilization and reduce infrastructure costs with shared back-end storage.

  • Centralized Data Platform: NetApp is developing a centralized platform to ingest, discover, and catalog data, integrate with data warehouses, and offer tools for data visualization and transformation. Planned integration will allow Google Cloud NetApp Volumes to serve as a data source for BigQuery and Vertex AI.
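
To make the data pipeline item above concrete, here is a toy sketch of a capture → classify/anonymize → embed flow for RAG. Every function and pattern in it (anonymize, embed, prepare_for_rag, the PII regexes) is an illustrative assumption, not ONTAP’s actual integration.

    # Toy data-preparation pipeline for RAG: redact PII, classify, embed.
    # All names are hypothetical; a real pipeline would use a proper
    # classifier and embedding model rather than regexes and a hash.
    import hashlib
    import re

    PII_PATTERNS = [
        re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),        # US SSN-like strings
        re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),  # email addresses
    ]

    def anonymize(text: str) -> tuple[str, bool]:
        """Redact PII; report whether any was found (drives classification)."""
        found = False
        for pattern in PII_PATTERNS:
            text, n = pattern.subn("[REDACTED]", text)
            found = found or n > 0
        return text, found

    def embed(text: str, dim: int = 8) -> list[float]:
        # Stand-in for an embedding model: hash the text into a fixed-size vector.
        digest = hashlib.sha256(text.encode()).digest()
        return [b / 255 for b in digest[:dim]]

    def prepare_for_rag(changed_docs: dict[str, str]) -> list[dict]:
        """Turn changed documents into records a vector index could ingest."""
        records = []
        for path, text in changed_docs.items():
            clean, had_pii = anonymize(text)
            records.append({
                "path": path,
                "classification": "sensitive" if had_pii else "general",
                "vector": embed(clean),
            })
        return records

    print(prepare_for_rag({"/vol1/notes.txt": "Email alice@example.com re: 123-45-6789"}))

The ordering matters: anonymization happens before embedding so that redacted text, not raw PII, is what ends up encoded in the vector index.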

Flexible Consumption

Bhela emphasized the importance of flexible consumption models to meet the needs of organizations of all sizes. “We allow customers to choose how they consume our products, whether by purchasing upfront, using storage as a service, or buying through partners,” he said.

Looking ahead, Bhela reaffirmed that NetApp’s product strategy will continue focusing on customer needs. “Our goal is to democratize enterprise AI by making NetApp storage the foundation for AI in enterprises worldwide,” he concluded.

How Sovereign Cloud Helps CIOs Overcome Data Management Hurdles

CIOs aiming to help their organizations comply with data regulations should consider the benefits of the sovereign cloud. This was a key takeaway from VMware Explore in Barcelona, where IT leaders reviewed VMware’s strategic roadmap following its acquisition by Broadcom. A major focus of discussion was sovereign cloud, which VMware defines as cloud services that comply with the legal requirements of a specific country.

David Michels, a researcher from Queen Mary University of London, participated in an industry panel on sovereign cloud, noting rising demand for these services, particularly when combined with hyperscale providers. He explained that the push for sovereign cloud is driven by both internal and external factors: internal drivers involve companies’ commercial or operational reasons, while external drivers are regulatory requirements.

To address these needs, VMware offers its VMware Cloud Foundation (VCF), a private cloud platform designed to support sovereignty and security. During his keynote, Broadcom CEO Hock Tan emphasized the importance of sovereign cloud, explaining that “VCF provides the ultimate sovereign cloud solution,” allowing companies to work with a national cloud provider that meets local compliance standards.

Understanding Cloud Sovereignty

Michels explained cloud sovereignty by breaking it down into three main areas:

  1. Control: Cloud sovereignty gives customers control over resources, including who can access stored data and the purposes for which it can be used.

  2. Sovereignty vs. Data Residency: While data residency refers to the physical location of servers, sovereignty goes further. US-based hyperscale providers may offer localized services in Europe, but they are still subject to US jurisdiction, meaning European customer data could be accessed by US authorities. Thus, data residency alone does not ensure sovereignty.

  3. Data Control Issues: Sovereignty also involves concerns like vendor lock-in, data portability, security, and encryption. Michels pointed out that IT professionals should consider whether their provider can protect data from unauthorized access by foreign governments. European providers, for example, are generally not subject to US jurisdiction, allowing them to offer a truly sovereign cloud.

Managing Regulatory Requirements

Roger Joys, VP of Enterprise Cloud Platform at Alaska-based GCI, shared his experience on how regulations around call data and customer information drive the need for a sovereign cloud, which provides a secure space for sensitive data.

According to Michels, compliance with regulations like GDPR and EU cybersecurity laws creates external pressure for organizations to adopt sovereign cloud. Michels noted that companies are increasingly adopting hybrid cloud strategies, where some workloads are best suited for hyperscale clouds due to scalability needs, while others benefit from private clouds with local providers.

Queen Mary’s research also found that the main internal motivation for sovereignty is protecting data from foreign government access, relevant for sectors like defense, intelligence, and research in fields such as AI or quantum computing.

Building a Secure Research Platform

Keith Woolley, Chief Digital and Information Officer at the University of Bristol, explained that strict export rules require the university to demonstrate compliance in data security, especially for international research collaborations. As the university aspires to lead in AI research, Woolley’s team is exploring a secure, multi-tenant AI environment that could allow researchers to safely develop AI language models without risking data misuse by tech firms.

Woolley noted that while the university’s research capabilities are a unique strength, they must also ensure their AI environment is secure and compliant, working with Broadcom to meet these sophisticated research needs.