Modern data platforms continue to grow in complexity to meet the changing needs of data consumers. Data analysts and data scientists want faster access to data, but IT, security, and governance teams often struggle to provide that access in a simple, secure, and standardized way across a variety of analytical tools.
In fact, Gartner predicts that through 2022 only 20% of organizations investing in information governance will succeed in scaling their digital business. As a result, organizations are designing data access frameworks that can overcome data delivery challenges, maintain scalability, and ensure universal data authorization across all parties.
Why modern data platforms are so complex
Organizations of all sizes continue to leverage data to better understand their customers, gain competitive advantage, and improve operational efficiency. Meeting these needs requires an enterprise data platform that can handle the complexity of managing and using data.
One of the biggest challenges facing today's data platform teams is making data universally accessible from a variety of storage systems (data lakes, data warehouses, relational databases, etc.) while meeting complex data governance and compliance requirements, including privacy laws such as GDPR and CCPA.
This complexity is exacerbated by disconnects between groups of data stakeholders: the technical data platform and data architecture teams; centralized data security and compliance; data scientists and analysts sitting in business units chartered with creating insights; and data owners and stewards responsible for building new data products.
Without proper data access and authorization frameworks to help automate processes, the complexity of managing customer data and personally identifiable information (PII) can significantly impact productivity and limit the amount of data available.
How to Establish Cloud-Based Data Security and Regulatory Compliance
If data stakeholders are not working together, the organization gets stuck in the process of delivering data. Data consumers need to find the right dataset, understand its context, trust its quality, and access it with the tools of their choice. Meanwhile, the data security and governance teams must be trusted to apply the correct data authorization and governance policies.
To reduce time to insight on the data platform, you need a robust framework that not only meets the needs of all stakeholders but also scales as your system grows.
When designing or evaluating a solution that guarantees responsible data use, it is important to develop a universal data authorization framework that includes the following six key capabilities:
1. Leverage attribute-based access control (ABAC)
Most organizations start building access control policies with role-based access control (RBAC). While this approach is useful for simple use cases, roles are inherently static and manually assigned, so each new use case requires creating a new role with new permissions for its users.
As the size and complexity of a data platform grows, the result is a painful policy environment known as “role explosion.” In addition, each system has its own standards for defining and managing role permissions, and RBAC is often limited to coarse-grained access (for example, entire tables or files).
By contrast, ABAC allows organizations to leverage attributes from multiple systems to define dynamic data authorization policies and make context-sensitive decisions for individual access requests.
A superset of RBAC, ABAC supports the complexity of fine-grained policy requirements through three major categories of attributes that can be used to define policies: user, resource, and environment. This lets you extend data access to more users and use cases.
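To make the three attribute categories concrete, here is a minimal sketch of an ABAC decision point. All names, attribute keys, and rules below are illustrative assumptions, not any vendor's API: the point is only that the decision combines user, resource, and environment attributes at request time.

```python
from dataclasses import dataclass, field

@dataclass
class Request:
    user: dict      # e.g. {"department": "finance", "clearance": "pii"}
    resource: dict  # e.g. {"dataset": "payments", "classification": "pii"}
    environment: dict = field(default_factory=dict)  # e.g. {"network": "corp-vpn"}

def decide(request: Request) -> bool:
    """Grant access only when user, resource, and environment
    attributes all satisfy the policy rules."""
    rules = [
        # PII data requires a matching user clearance.
        request.resource["classification"] != "pii"
        or request.user.get("clearance") == "pii",
        # PII data may only be accessed from the corporate network.
        request.resource["classification"] != "pii"
        or request.environment.get("network") == "corp-vpn",
    ]
    return all(rules)

req = Request(
    user={"department": "finance", "clearance": "pii"},
    resource={"dataset": "payments", "classification": "pii"},
    environment={"network": "corp-vpn"},
)
print(decide(req))  # both rules satisfied, so access is granted
```

Note that no role had to be created for this combination: changing the user's clearance or the request's network context changes the decision without touching the policy.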
2. Dynamically apply access policies
Most existing policy enforcement solutions still require maintaining multiple copies of each dataset, and the cost of creating and maintaining those copies adds up quickly. Simply defining policies with ABAC does not eliminate this pain, because even when attributes are evaluated against an access policy at the decision point, the policies still point to static copies.
Once you have completed the demanding task of defining attributes and policies, you can push them to an enforcement engine that dynamically filters and transforms data, applying column edits such as anonymization, tokenization, and masking, and even advanced techniques such as differential privacy.
Dynamic enforcement is key to increasing the granularity of access policies without increasing the complexity of the overall data system. It is also key to ensuring that an organization remains resilient in responding to changing governance requirements.
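The idea of transforming a single copy of the data per request, rather than maintaining pre-redacted copies, can be sketched as follows. The column names and the mask/tokenize policy are assumptions for illustration; a real enforcement engine would do this inside the query path.

```python
import hashlib

# One authoritative copy of the data; no redacted duplicates are stored.
ROWS = [
    {"customer": "Ada Lovelace", "email": "ada@example.com", "country": "UK"},
    {"customer": "Alan Turing", "email": "alan@example.com", "country": "UK"},
]

# Per-column action applied for non-privileged users (illustrative policy).
POLICY = {"customer": "mask", "email": "tokenize"}

def tokenize(value: str) -> str:
    # Deterministic token, so joins across datasets still line up.
    return hashlib.sha256(value.encode()).hexdigest()[:12]

def enforce(rows, privileged: bool):
    """Apply the policy to query results at read time."""
    if privileged:
        return rows  # the policy allows raw access for this requester
    out = []
    for row in rows:
        transformed = dict(row)
        for col, action in POLICY.items():
            if action == "mask":
                transformed[col] = "****"
            elif action == "tokenize":
                transformed[col] = tokenize(transformed[col])
        out.append(transformed)
    return out

for row in enforce(ROWS, privileged=False):
    print(row)
```

Tightening or loosening the policy means editing `POLICY`, not regenerating dataset copies, which is what keeps granular policies from multiplying system complexity.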
3. Create an integrated metadata layer
If ABAC is the engine that drives scalable, secure data access, metadata is its fuel. Metadata is what lets you see what your organization's datasets contain and where they live, and it is the raw material for attribute-based access control policies. A richer metadata layer also lets organizations create more detailed and relevant access policies.
There are four important areas to consider when designing the metadata life cycle:
- Access: How do you enable seamless access through APIs to leverage metadata for policy decisions?
- Unification: How do you create an integrated metadata layer?
- Metadata drift: How do you ensure that your metadata stays up to date?
- Discovery: How do you discover new technical and business metadata?
The challenge is that metadata, like data, usually resides in multiple locations within an enterprise and is owned by different teams. Each analytics engine requires its own technical metastore, while governance teams maintain business context and classifications in business catalogs such as Collibra and Alation.
Therefore, organizations need to unify and consolidate metadata so it can feed a complete set of governance and access control policies in real time. In practice, this integration happens through an abstraction layer, since it is unrealistic, and nearly impossible, to expect all metadata to be defined in one place.
Continuous integration of metadata establishes a single source of truth about the data. This avoids “metadata drift” or “schema drift” (data management inconsistencies that accumulate over time), enabling effective data governance and business processes such as data classification and tagging across the organization. It also establishes a unified data classification, facilitating data discovery and user access.
Metadata management tools that use artificial intelligence to automate parts of the metadata life cycle are also useful: they can identify sensitive data types and apply the appropriate classifications, automate data discovery and schema inference, and automatically detect metadata drift.
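A toy sketch of the abstraction-layer idea: merge technical metadata from an engine's metastore with business classifications from a catalog, and flag drift where the two disagree. The dictionaries below stand in for real metastore and catalog APIs, and the dataset and column names are invented for illustration.

```python
TECH_METASTORE = {  # stand-in for an engine's technical metastore
    "sales.orders": {"columns": ["order_id", "email", "amount"]},
}
BUSINESS_CATALOG = {  # stand-in for a business catalog's curated view
    "sales.orders": {
        "columns": ["order_id", "email"],          # stale: missing "amount"
        "classifications": {"email": "PII"},
    },
}

def unified_view(dataset: str) -> dict:
    """Build one merged metadata record for a dataset."""
    tech = TECH_METASTORE[dataset]
    biz = BUSINESS_CATALOG.get(dataset, {})
    # Columns the engine knows about but the catalog has not classified yet.
    drifted = sorted(set(tech["columns"]) - set(biz.get("columns", [])))
    return {
        "dataset": dataset,
        "columns": tech["columns"],                       # technical truth
        "classifications": biz.get("classifications", {}),
        "drift": drifted,
    }

view = unified_view("sales.orders")
print(view["drift"])  # ['amount'] is a candidate for automated classification
```

The drift list is exactly the kind of signal the AI-assisted tools mentioned above would act on, proposing a classification for the unclassified column instead of waiting for a human to notice it.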
4. Enable distributed stewardship
Scaling secure data access is not just a matter of scaling policy types and enforcement methods. The types of data available, and the business requirements for leveraging them, are so diverse and complex that the policy decision-making process itself also needs to be extensible.
Just as enforcement engines can become bottlenecks if not properly designed, the lack of access models and user experiences that let non-technical users manage these policies hinders an organization's ability to scale access control.
Effective data access management must embrace the unique needs of all stakeholders without getting in their way. Unfortunately, many access management tools require complex change management and the development of custom processes and workflows. Enterprises should ask early on how an access model will adapt to their organization.
To enable distributed stewardship, the access system must support two key areas. First, it must delegate data and access policy management to line-of-business people (data stewards and administrators) who understand the data or its governance requirements. Second, it must replicate centralized governance standards across groups within the organization, so that changes propagate consistently throughout the organization.
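The two areas above can be sketched together: a central team defines guardrails, and domain stewards may add policies only within their own domain and only within those guardrails. Every name here (the guardrail, the domains, the actions) is a hypothetical example, not a product feature.

```python
# Central standard: sensitive columns must be masked or tokenized.
CENTRAL_GUARDRAILS = {"allowed_actions": {"mask", "tokenize"}}

class DomainPolicyStore:
    """Policy store delegated to one line-of-business steward."""

    def __init__(self, domain: str):
        self.domain = domain
        self.policies: dict[tuple[str, str], str] = {}

    def add_policy(self, dataset: str, column: str, action: str) -> None:
        # Stewardship is scoped: only datasets in the steward's domain.
        if not dataset.startswith(self.domain + "."):
            raise PermissionError("steward may only manage their own domain")
        # Centralized standards still apply to every delegated decision.
        if action not in CENTRAL_GUARDRAILS["allowed_actions"]:
            raise ValueError("action violates central governance standard")
        self.policies[(dataset, column)] = action

finance = DomainPolicyStore("finance")
finance.add_policy("finance.payments", "card_number", "tokenize")  # allowed
```

Tightening `CENTRAL_GUARDRAILS` is the mechanism by which a central change propagates consistently to every domain without rewriting each steward's policies.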
5. Ensure a simple centralized audit
Knowing where sensitive data resides, who has access to it, and who has actually accessed it is essential for making intelligent access decisions.
Auditing is a persistent challenge for governance teams because modern enterprise environments have no single standard that spans different tools. Audit logs are difficult to reconcile and do not scale across disparate systems, leaving governance teams unable to answer basic questions.
Even when the governance team has defined policies at the top level, there is no easy way to verify that those policies are actually applied when data is accessed, and therefore that the organization's data is actually protected.
Centralized auditing with a consistent schema is important for generating data usage reports, and it enables automated data breach alerts through a single integration with an enterprise SIEM. Because many log management solutions focus on application logs, organizations should look for a solution that normalizes the audit log schema so governance teams can answer audit questions.
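What a consistent audit schema means in practice can be shown in a few lines: every enforcement point emits the same fields, so one SIEM integration can consume everything. The field names below are assumptions, not a standard, but they cover the basic audit questions (who, what, which decision, under which policy).

```python
import json
import time

def audit_event(user: str, dataset: str, columns: list,
                decision: str, policy_id: str) -> dict:
    """Build one audit record with a fixed, SIEM-friendly schema."""
    return {
        "timestamp": time.time(),
        "user": user,
        "dataset": dataset,
        "columns": columns,
        "decision": decision,    # e.g. "allow", "deny", "allow-with-masking"
        "policy_id": policy_id,  # which policy produced the decision
    }

event = audit_event("ada", "sales.orders", ["email"],
                    "allow-with-masking", "pii-mask-v2")
print(json.dumps(event))  # one JSON line per access, shippable to a SIEM
```

Because the `policy_id` travels with every event, the governance team can answer the question from the previous paragraph: whether a given top-level policy is actually being applied when data is accessed.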
Another consideration is to invest in basic visibility mechanisms early in the data platform journey to help data stewards and governance teams understand how data is used and to demonstrate the value of the platform. Knowing what data your business has and how people use it allows teams to design more effective access policies around it.
Finally, look for a flexible, API-driven architecture to ensure that the access control framework stays viable in the future and can adapt to the needs of your data platform.
6. Future-proof integrations
As data sources and tools evolve, data platforms change over time, so integration with your organization's broader environment is a key factor in the success of your access control approach. The access control framework therefore needs to be adaptable and support flexible integration across the data fabric.
One of the benefits of using ABAC for access control is that you can retrieve attributes from existing systems in your organization, provided you can retrieve them performantly enough to support dynamic policy decisions.
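One common way to keep such lookups performant is to cache attributes fetched from the systems that already own them. This is a minimal sketch under stated assumptions: the dictionary stands in for a real HR directory, and a production setup would want a TTL-based cache so attribute changes are picked up.

```python
from functools import lru_cache

# Stand-in for an existing system of record (e.g. an HR directory).
HR_DIRECTORY = {"ada": {"department": "finance", "clearance": "pii"}}

@lru_cache(maxsize=4096)
def user_attributes(user: str) -> tuple:
    """Fetch a user's attributes once, then serve them from cache.

    Returned as a sorted tuple of items because lru_cache requires
    hashable values; callers can rebuild a dict with dict(...).
    """
    attrs = HR_DIRECTORY.get(user, {})
    return tuple(sorted(attrs.items()))

print(dict(user_attributes("ada")))
```

The design point is that the policy engine never becomes the owner of the attributes; it only caches what the source systems assert, so the existing directory remains authoritative.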
Building a flexible foundation eliminates the need for organizations to understand the entire architecture from the beginning. Instead, you can start with a few key tools and use cases and add more as your organization understands how to use the data.
After all, policy design is iterative, and the interesting insights come from repeatedly asking important questions: What sensitive data do we have? Who is accessing it, and why? Who needs access?
For this reason, some organizations choose to focus on open source, since it gives them the option to customize integrations to suit their needs. An important consideration, however, is that building and maintaining these integrations can quickly become a full-time job.
In an ideal scenario, the data platform team stays lean with low operational overhead. Investing engineering time in building and maintaining integrations is unlikely to differentiate your organization, especially when high-quality integration tools already exist in the ecosystem.
Successful universal data authorization
As with any major initiative, it is important to take a step back and apply a design-to-value approach when protecting data access. This means identifying the most valuable data domains that need access to sensitive data and enabling or unblocking them first. It also means establishing visibility into how data is currently being used in order to prioritize actions.
Organizations are investing heavily in data platforms to unleash new innovations. However, without the underlying framework, data efforts will continue to be blocked at the last mile.
Scaling secure, universal data authorization can significantly improve agility within an organization. By leveraging the six principles above, organizations can stay ahead of the curve and design a framework that provides the right foundation for the success of all stakeholders.
Source: “6 Critical Steps for Scaling Secure Universal Data Authorization” (https://www.technewsworld.com/story/6-critical-steps-for-scaling-secure-universal-data-authorization-176742.html?rss=1)