One data security platform for data protection and access control
Our user-friendly SaaS platform makes it easy for data and infosec teams of all sizes to secure regulated data immediately.
One powerful platform for your compliance team
Dynamic Data Masking
Database Activity Monitoring
Data Classification
Tokenization
Open Source Integrations
Format Preserving Encryption
Simplified data security and data access governance
Data teams get real-time access to sensitive data without risk, while security teams gain full visibility over that data. All data functions are fully aligned.
Non-technical users can implement policy and simplify ownership, keeping data security streamlined and automated.
Protection for data at rest, in motion, and in use. Remove the risk of access threats by extending governance and security upstream and to the left.
Data teams
Quickly identify and protect sensitive data with our automated classification tools and govern high-risk values to meet compliance needs.
Set up automated access controls with our dynamic data masking capabilities to ensure proper access to credentialed users.
Infosec teams
Achieve real-time observability over how sensitive data is consumed in the cloud with active alerts for unauthorized requests.
Secure data from source to cloud with automated data access governance so sensitive data is never at risk.
Providing real solutions
Blog
In a world where data breaches and privacy threats are the norm, safeguarding sensitive information is no longer optional—it's critical. As regulations tighten and privacy concerns soar, our customers are demanding cutting-edge solutions that don't just secure their data but do so with finesse. Enter Format Preserving Encryption (FPE). When paired with ALTR's capability to seamlessly share encryption keys with trusted third parties via platforms like Snowflake's data sharing, FPE becomes a game-changer.
Understanding Format Preserving Encryption (FPE)
Format Preserving Encryption (FPE) is a type of encryption that ensures the encrypted data retains the same format as the original plaintext. For example, if a credit card number is encrypted using FPE, the resulting ciphertext will still appear as a string of digits of the same length. This characteristic makes FPE particularly useful in scenarios where maintaining data format is crucial, such as legacy systems, databases, or applications requiring data in a specific format.
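To make the format-preserving property concrete, here is a toy Python sketch: a balanced Feistel cipher over even-length digit strings (up to 16 digits). It is for illustration only; production FPE uses vetted constructions such as NIST's FF1 or FF3-1, never a hand-rolled hash-based round function like this one.

```python
import hashlib

def _prf(value: int, key: bytes, rnd: int, digits: int) -> int:
    # Toy round function: hash key || round || value, reduce mod 10^digits.
    data = key + bytes([rnd]) + value.to_bytes(8, "big")
    return int.from_bytes(hashlib.sha256(data).digest()[:8], "big") % 10 ** digits

def fpe_encrypt(plaintext: str, key: bytes, rounds: int = 10) -> str:
    # Balanced Feistel over an even-length digit string.
    d = len(plaintext) // 2
    left, right = int(plaintext[:d]), int(plaintext[d:])
    for r in range(rounds):
        left, right = right, (left + _prf(right, key, r, d)) % 10 ** d
    return f"{left:0{d}d}{right:0{d}d}"

def fpe_decrypt(ciphertext: str, key: bytes, rounds: int = 10) -> str:
    # Run the Feistel rounds in reverse to invert the encryption.
    d = len(ciphertext) // 2
    left, right = int(ciphertext[:d]), int(ciphertext[d:])
    for r in reversed(range(rounds)):
        left, right = (right - _prf(left, key, r, d)) % 10 ** d, left
    return f"{left:0{d}d}{right:0{d}d}"

key = b"demo-key"
card = "4111111111111111"
token = fpe_encrypt(card, key)
assert len(token) == len(card) and token.isdigit()  # same length, still digits
assert fpe_decrypt(token, key) == card              # round-trips exactly
```

The point of the sketch is the two assertions at the bottom: the ciphertext is still a 16-digit number, so it drops into any column, form field, or legacy schema that expects one.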
Key Benefits of FPE
Seamless Integration
FPE maintains the data format, allowing easy integration into existing data pipelines without requiring significant changes. This minimizes the impact on business operations and reduces the costs associated with implementing encryption.
Compliance with Regulations
Many regulatory frameworks, such as the GDPR, PCI-DSS, and HIPAA, mandate the protection of sensitive data. FPE helps organizations comply with these regulations by encrypting data while preserving its usability and format, which can itself be a requirement under these standards.
Enhanced Data Utility
Unlike traditional encryption methods, FPE allows encrypted data to be used in its existing form for specific operations, such as searches, sorting, and indexing. This ensures organizations can continue to derive value from their data without compromising security.
The Role of Snowflake in Data Sharing
Snowflake is a cloud-based data warehousing platform that allows organizations to store, process, and analyze large volumes of data. One of its differentiating features is data sharing, which enables companies to share live, governed data with other Snowflake accounts in a secure and controlled manner while shifting the compute costs for that data to the share's consumer.
Key Features of Snowflake Data Sharing
Real-Time Data Access
Snowflake's data sharing allows recipients to access shared data in real-time, ensuring they always have the most up-to-date information. This is particularly valuable in scenarios where timely access to data is critical, such as in financial services or healthcare.
Secure Data Exchange
Snowflake's platform is designed with security at its core. Data sharing is governed by robust access controls, ensuring only authorized parties can view or interact with the shared data. This is crucial for maintaining the confidentiality and integrity of sensitive information.
Scalability and Flexibility
Snowflake's architecture allows for easy scalability, enabling organizations to share large volumes of data with multiple parties without compromising performance. Additionally, the platform supports a wide range of data formats and types, making it suitable for diverse use cases.
The Power of Combining FPE with Snowflake’s Key Sharing
When FPE is combined with the ability to share encryption keys via Snowflake's data sharing, it unlocks a new level of security and flexibility for organizations. This combination addresses several critical challenges in data protection and sharing:
Controlled Access to Encrypted Data
By leveraging FPE, organizations can encrypt sensitive data while preserving its format. However, there are scenarios where this encrypted data needs to be shared with trusted third parties, such as partners, auditors, or service providers. Through Snowflake's data sharing and ALTR's FPE Key Sharing, companies can securely share encrypted data along with the corresponding encryption keys. This allows the third party to decrypt the data, within the policies that have been defined, and use it as needed.
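The grant model can be pictured with a small Python sketch. Every name here (`KeyShare`, `grant`, `fetch_key`) is hypothetical and exists only to show the idea: a consumer sees key material only after an explicit grant from the data producer, and everyone else sees nothing but ciphertext.

```python
from dataclasses import dataclass, field

@dataclass
class KeyShare:
    keys: dict = field(default_factory=dict)    # key_id -> key material
    grants: dict = field(default_factory=dict)  # key_id -> set of consumers

    def grant(self, key_id: str, consumer: str) -> None:
        # The data producer explicitly authorizes a consumer for one key.
        self.grants.setdefault(key_id, set()).add(consumer)

    def fetch_key(self, key_id: str, consumer: str) -> bytes:
        # Consumers without a grant never see the key, only the ciphertext.
        if consumer not in self.grants.get(key_id, set()):
            raise PermissionError(f"{consumer} has no grant for {key_id}")
        return self.keys[key_id]

share = KeyShare(keys={"fpe-key-1": b"secret-material"})
share.grant("fpe-key-1", "auditor")
assert share.fetch_key("fpe-key-1", "auditor") == b"secret-material"
```

Revoking access is the mirror image: remove the consumer from the grant set and subsequent `fetch_key` calls fail, while the shared ciphertext remains safely unreadable.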
Data Security Across Multiple Environments
In a multi-cloud or hybrid environment, data often needs to be moved between different systems or shared with external entities. Traditional encryption methods can be cumbersome in such scenarios, as they require extensive reconfiguration or complex key management. However, with FPE and Snowflake's key sharing, organizations can seamlessly share encrypted data across different environments without compromising security. The encryption keys can be securely shared via Snowflake, ensuring only authorized parties can decrypt and access the data.
Regulatory Compliance and Auditing
Many regulations require organizations to demonstrate that they have implemented appropriate security measures to protect sensitive data. By using FPE, companies can encrypt data in a way that complies with these regulations. At the same time, the ability to share encryption keys through Snowflake ensures that data can be securely shared with auditors or regulators. Additionally, Snowflake's robust logging and auditing capabilities provide a detailed record of who accessed the data and when, which is essential for compliance reporting.
Enhanced Collaboration with Partners
In the finance, healthcare, and retail industries, collaboration with external partners is often essential. However, sharing sensitive data with these partners presents significant security risks. By combining FPE with ALTR's key sharing, organizations can securely share encrypted data with partners, ensuring that sensitive information remains protected throughout the data's lifecycle, including across shares. This enables more effective collaboration without compromising data security.
Efficient and Secure Data Processing
Certain data processing tasks, such as data analytics or AI model training, require access to large volumes of data. In scenarios where this data is sensitive, encryption is necessary. However, traditional encryption methods can hinder the efficiency of these tasks due to the need for decryption before processing. With FPE, the data can remain encrypted during processing, while ALTR's key sharing allows the consumer to decrypt data only when absolutely necessary. This ensures that data processing is both secure and efficient.
Use Cases of FPE with ALTR Key Sharing
To better understand the value of combining FPE with ALTR's key sharing, let's explore a few use cases:
Financial Services
In the financial sector, organizations handle a vast amount of sensitive data, including customer information, transaction details, and credit card numbers. FPE can encrypt this data while preserving its format, ensuring it can still be used in legacy systems and applications. Through Snowflake's data sharing, financial institutions can securely share encrypted transaction data with external auditors, partners, or regulators, along with the necessary encryption keys. This ensures compliance with regulations while maintaining the security of sensitive information.
Healthcare
Healthcare organizations often need to share patient data with external entities, such as insurance companies or research institutions. FPE can encrypt patient records, ensuring they remain secure while preserving the format required for healthcare applications. Snowflake's data sharing allows healthcare providers to securely share this encrypted data with third parties, while ALTR enables sharing of the corresponding encryption keys, so those parties can access and use the data while ensuring compliance with HIPAA and other regulations.
Retail
Retailers often need to share customer data with marketing partners, payment processors, or logistics providers. FPE can be used to encrypt customer information, such as names, addresses, and payment details, while maintaining the format required for retail systems. Snowflake's data sharing enables retailers to securely share this encrypted data with their partners; with ALTR, the encryption keys are also shared, ensuring that customer information is always protected.
The Broader Implications for Businesses
The combination of Format Preserving Encryption and ALTR's key-sharing capabilities represents a significant advancement in the field of data security. This approach addresses several critical challenges in data protection and sharing by enabling organizations to securely share encrypted data with trusted third parties.
Strengthening Trust and Collaboration
In an increasingly interconnected world, businesses must collaborate with external partners and share data to remain competitive. However, this collaboration often comes with significant security risks. By leveraging FPE and ALTR's key sharing, organizations can strengthen trust with their partners by ensuring that sensitive data is always protected, even when shared. This leads to more effective and secure collaboration, ultimately driving business success.
Reducing the Risk of Data Breaches
Data breaches can devastate businesses, bringing financial losses, reputational damage, and regulatory penalties. Organizations can significantly reduce the risk of a breach by encrypting sensitive data with FPE and securely sharing it via Snowflake. Even if the data is intercepted, it remains protected, as only authorized parties with the corresponding encryption keys can decrypt it.
Enabling Innovation While Ensuring Security
As organizations continue to innovate and leverage new technologies, such as artificial intelligence and machine learning, the need for secure data sharing will only grow. The combination of FPE and ALTR's key sharing enables businesses to securely share and process data innovatively without compromising security. This ensures that organizations can continue to innovate while protecting their most valuable asset – their data.
Wrapping Up
Integrating Format Preserving Encryption with ALTR's key sharing capabilities offers a powerful solution for organizations seeking to protect sensitive data while enabling secure collaboration and innovation. By preserving the format of encrypted data and allowing for secure key sharing, this approach addresses critical challenges in data protection, regulatory compliance, and data sharing across multiple environments. As businesses navigate the complexities of the digital age, the value of this combined solution will only become more apparent, making it a vital component of any robust data security strategy.
ALTR's Format-preserving Encryption is now available on Snowflake Marketplace.
“Today is the day!” you exclaim to yourself as you settle into your desk on Monday morning. After months of meticulous planning, the migration from Teradata to Snowflake begins now. You have been through all the back-and-forth with leadership on why this migration is needed: Teradata is expensive, Teradata is not agile, Snowflake creates a single source of data truth, and Snowflake is instantly on and scales when you need it. It’s perfect for you and your business.
As you follow your meticulously planned checklist for the migration, you're utilizing cutting-edge tools like DBT, Okta, and Sigma. These tools are not just cool, they're the future. You're moving your database structure, loading the initial non-sensitive data, repointing your ETL pipelines, and witnessing the power of modern technology in action. Everything is working like a charm.
A few weeks or months of testing go by; your downstream consumers of data are still using Teradata but are starting to give a thumbs up on the Snowflake workloads you have already migrated. Things are going well. You have not thought about CPU or disk space for the Teradata box in a while, which was the point of the migration. You finally get word from all stakeholders that this trial migration was a success! You call your Snowflake team and tell them to back up the truck: you are clear to move the remaining workloads. Life is good. But then comes a knock at the door.
It’s Pat from Security & Risk. You know Pat well and enjoy Pat’s company, but you also do as much as possible to avoid Pat because you are in data and, well, we all know the feeling. Pat tells you, “Heard we are finally getting off Teradata; that’s awesome! Do you have a plan for the PII and SSNs that are kept in that one Teradata database that we require using Protegrity for audit and compliance reasons?” You nod, “I do, but I couldn't do it without your expertise. I’ve been reading the Snowflake documentation, and I'm in the process of writing a few small AWS Lambdas to interface with Protegrity. Your input is crucial to this process.” Pat smiles, gives you a less-than-reassuring pat on the back, and walks out. Phew, no more Pat.
Four weeks later, you're utterly exhausted. You've logged over 50 hours in Snowflake with fellow data engineers and tapped into the expertise of a cloud ops team member who knows Lambda inside out. You have escalated to Snowflake support, but your external function calls from Snowflake to AWS keep timing out. AWS support is unable to help. Now you're hitting memory limits with AWS Lambda. Suddenly, the internal network team does not want to keep the ports open to reach Protegrity from AWS, and you need a Private Link connection with additional security controls. You are behind on the Teradata migrations. There is no end in sight to the scale problems. Shoot, this is not working.
Don’t worry, you are not alone. This is the same experience felt by hundreds of Snowflake customers, and it stems from the same problem: everything about your Snowflake migration was planned for the new architecture of Snowflake except for one thing: data protection. You followed all the blogs and user guides, and your stateless data pipeline feeding Snowflake with a Kafka bus is perfect. Sigma is running without limits. The team is happy, but they want that customer data now. Except, you can’t use it until you solve this security problem.
Snowflake, and OLAP workloads generally, turned data protection on its head. OLTP workloads are easy to secure: you know the access points and the typical pattern of user behavior, so you can easily plan for scale and uptime. OLAP is wildly unpredictable. Large queries, small queries, ten rows, 10M rows. It's a nightmare for security. There is only one path forward: you must get purpose-built data protection for Snowflake.
You need a data protection solution that matches Snowflake’s architecture, just like when you matched Protegrity to Teradata. If Snowflake is going to be elastic, your data protection needs to be elastic. If Snowflake is going to be accessed by many downstream consumers, you need to be able to integrate data protection into the access policies in Snowflake. Who is going to do that work? Who will maintain this code? How can you control costs? The answer to all those questions is ALTR.
ALTR’s purpose-built native app for data protection is an easy solution for Snowflake. You can install it on your own. You can use your Snowflake committed dollars to pay for the service. ALTR’s data protection scale is controlled by Snowflake and nothing else. It’s the easiest way to get back on track. Call your Snowflake team, ask them about ALTR. It will feel good walking back into Pat’s office with your head held high and your data migration back on track.
Whether your team currently has Protegrity or Voltage, you will face the same problems. Do not waste your time trying to get these solutions to scale; just call ALTR.
Don’t just take my word for it…
In a world where data is the lifeblood of organizations, managing and securing that data is no longer just an IT task—it's a business imperative. Yet, despite the critical nature of data governance, many solutions out there are still bogged down by complexity, time-consuming processes, and significant risks. Enter ALTR, the cutting-edge solution that’s not just simplifying data governance but revolutionizing it. Here’s why ALTR is the game-changer your organization needs, and why it’s quickly becoming the go-to for companies leveraging Snowflake.
1. Ease of Use: Accessible Security
Data governance has long been a domain reserved for the technically savvy, with traditional methods requiring extensive SQL coding and intricate configurations. This not only made the process time-consuming but also left it vulnerable to human error. ALTR redefines this narrative with a user-centric approach. Seamlessly integrated into Snowflake and instantly accessible through Snowflake Partner Connect, ALTR is designed to be up and running in mere minutes. The intuitive interface, built on ALTR’s robust Management API, empowers even non-technical users to accomplish tasks that once required days of coding—now achievable with just a few clicks. ALTR democratizes data governance, making it fast, simple, and accessible to all, ensuring that security is no longer a complex or exclusive domain, but one that everyone can master.
2. Proof of Value & Time to Value: Immediate Impact
In the high-speed world of data, there's simply no room for delay. ALTR recognizes this need for urgency, delivering Proof of Value and Time to Value in a matter of hours and days—rather than the months or quarters typical of traditional solutions. With ALTR’s SaaS model, you can unlock its features in your Snowflake sandbox environment at no cost, letting you experience its power before making any commitments.
But ALTR doesn’t just stop at providing tools; we empower you with expertise. Our team of Field Engineers is ready to assist in crafting tailored solutions, automating processes, and ensuring seamless interoperability with your data catalogs, ETL/ELT tools, and SIEMs. Customers often leverage ALTR’s Rapid POC Framework, which accelerates the definition of use case requirements and success criteria. Over just two or three focused one-hour screenshare sessions with an ALTR Field Engineer, you’ll produce the artifacts, evidence, and performance metrics needed to confidently move toward full-scale implementation.
It’s not merely about demonstrating ALTR’s value—it’s about ensuring you realize that value at lightning speed, setting your team up for swift, scalable success.
3. Reduced Complexity: Cutting Through the Chaos
In the realm of data governance and security, complexity is the silent killer. The more convoluted your protocols, the more they drain your resources—whether that’s time, money, or manpower. ALTR was engineered to dismantle this complexity from the ground up. Managing access policies across multiple access points like SnowSQL, BI tools, applications, and data shares is an overwhelming task on its own. When you add the intricacies of data security within Snowflake’s ecosystem, the challenge becomes even more daunting. ALTR alleviates these burdens by enabling automation at scale and empowering users to handle these tasks with ease. By simplifying these traditionally complex processes, ALTR doesn't just reduce friction; it eliminates the barriers that have historically plagued data governance. With ALTR, complexity is no longer an obstacle—it's a thing of the past.
4. Minimized Risk: Securing Your Most Valuable Asset
Human error is the Achilles’ heel of data security. From misconfigurations to overlooked details, the potential for mistakes is vast. Recent Snowflake security incidents serve as stark reminders of these risks. ALTR addresses this vulnerability head-on. By eliminating the need for manual SQL scripting and enabling point-and-click automation, ALTR significantly reduces the risk of human error. But it doesn’t stop there. ALTR’s advanced Data Protection features—such as Format-Preserving Encryption and Tokenization—ensure that your data is protected at rest, in transit, and even in use. Coupling this with access policy automation means your data is safe from external threats, internal misuse, and even potential risks from privileged users.
5. Interoperability: The Secret to Seamless Integration
In today’s data-driven world, interoperability isn’t just a nice-to-have; it’s essential. ALTR’s SaaS architecture is designed to work seamlessly within your existing data ecosystem and InfoSec stack, making it an ideal partner for your CISO’s peace of mind. Whether it’s leveraging Snowflake Object Tags or integrating with your SIEM, SOAR, or workflow resolution software, ALTR ensures everything works together flawlessly. By making real-time Data Activity Monitoring logs, policy alerts, and notifications available within your existing systems, ALTR takes interoperability to the next level, ensuring that your data governance is as efficient as it is secure.
Wrapping Up
ALTR is not just another data governance tool—it’s a revolution in how data is managed and secured. By focusing on ease of use, rapid proof of value, reduced complexity, minimized risk, and seamless interoperability, ALTR is setting a new standard in the industry. For companies leveraging Snowflake, ALTR is the key to unlocking the full potential of their data while safeguarding it from the ever-present threats of today’s digital landscape. In a world where data is king, ALTR is the crown. Don’t just manage your data security—master it with ALTR.
Keeping a tight grip on data access control is crucial for protecting sensitive information. However, when these systems get too complicated, they can bring about a whole host of challenges and additional risks. If you're finding that your data access control is more headache than help, it might be time to take a closer look. Let's explore nine signs that your data access control might be overly complex, along with some practical solutions to help streamline and strengthen your data security approach.
9 Signs Your Data Access Control is Out of Control
1. Frequent Configuration Errors
Are you experiencing persistent configuration errors? This may indicate an overly complex data access control system. These errors often arise from the intricate setup and continual manual adjustments needed to manage permissions. When systems require detailed and specific configurations, even minor mistakes can lead to significant vulnerabilities. Frequent misconfigurations are a security risk and a drain on resources, necessitating constant oversight and corrections.
2. Slow Response Times
Is your team struggling to respond to access requests or security incidents promptly? This suggests your system is too convoluted. The more complex the system, the harder it is for security teams to act swiftly and efficiently. Complex workflows and multiple layers of approval can slow down response times, increasing the risk of security breaches going undetected or unaddressed for extended periods.
3. High Maintenance Costs
Excessive resources spent on maintaining and updating access controls indicate unnecessary complexity. High maintenance costs often stem from the need for specialized training and continuous updates to keep the system running smoothly. These costs add up quickly, diverting funds and constrained resources from other critical areas, making the system financially unsustainable over the long term.
4. Integration Challenges
Are you using multiple tools to manage access control? This can create redundancies and integration issues, making the system harder to manage and more expensive to maintain. Each tool requires its own configuration, management, and monitoring, adding layers of complexity that can overwhelm security teams.
5. Ineffective Monitoring
Is your security team struggling to monitor access in real-time? This could be a sign of system complexity and can lead to undetected breaches and delayed responses. Complex systems generate vast amounts of data, making it challenging to filter out critical security alerts from the noise. Ineffective real-time monitoring can result in slow threat detection and response times, increasing the risk of significant security incidents.
6. Inconsistent Policies
Wide variations in access control policies across different parts of the organization can lead to security gaps and enforcement inconsistencies. Ensuring a unified security approach becomes challenging when other departments or teams use different policies. Attackers who look for weak spots in the security fabric can exploit this inconsistency.
7. Difficulty in Auditing and Compliance
Are you struggling to conduct regular audits and ensure compliance with industry regulations? This could indicate that your access control processes are too complex. The intricate nature of these systems often requires specialized knowledge to navigate and assess, making compliance audits time-consuming and costly. Non-compliance can expose the organization to legal and financial risks, including fines and reputational damage.
8. High Incidence of Insider Threats
Complex access controls can make monitoring and restricting insider access difficult, leading to a higher incidence of insider threats. Insiders who already have a level of trusted access can exploit overly complex systems to bypass security measures or access unauthorized data. The difficulty in tracking and managing insider activities in such environments exacerbates this risk.
9. User Frustration and Low Productivity
Are users struggling to get the access they need to data? This indicates overly complex access controls, which can decrease productivity and lead to frustration. This can also lead to users seeking workarounds, such as using unauthorized methods to access data, which further compromises security.
What to Look for in a Data Security Platform (DSP)
Selecting the right Data Security Platform (DSP) is crucial for effectively managing data access control and safeguarding sensitive information. Here are the key attributes to consider when choosing a DSP:
Sensitive Data Discovery
A robust DSP should offer automated tools for quickly identifying and classifying sensitive data. This capability ensures that high-risk data is discovered and protected promptly, meeting compliance requirements. Automated classification tools help streamline the identification process, reducing the manual effort involved and ensuring that all sensitive data is accounted for and adequately secured.
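As a rough illustration of pattern-based discovery, here is a Python sketch that tags a column by matching sampled values against regexes. The patterns and the all-samples-must-match rule are illustrative assumptions; real discovery tools combine patterns, dictionaries, and statistical checks.

```python
import re

# Hypothetical pattern table mapping a sensitivity label to a value regex.
PATTERNS = {
    "ssn": re.compile(r"^\d{3}-\d{2}-\d{4}$"),
    "credit_card": re.compile(r"^\d{16}$"),
    "email": re.compile(r"^[^@\s]+@[^@\s]+\.[^@\s]+$"),
}

def classify_column(samples):
    # Tag the column only if every sampled value matches one pattern.
    for label, pattern in PATTERNS.items():
        if samples and all(pattern.match(s) for s in samples):
            return label
    return None  # no confident match: leave the column unclassified

assert classify_column(["123-45-6789", "987-65-4321"]) == "ssn"
assert classify_column(["ada@example.com"]) == "email"
assert classify_column(["hello", "world"]) is None
```

Requiring every sample to match keeps false positives down at the cost of recall; a production classifier would instead score a match rate and apply a threshold.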
Automated Access Controls
Look for a DSP that allows you to set up automated access controls with dynamic data masking capabilities. These controls ensure that only credentialed users can access sensitive information, minimizing the risk of unauthorized access. They also help maintain security policies consistently across the organization, reducing the potential for human error and enhancing overall data protection.
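The idea behind role-driven dynamic masking can be sketched in a few lines of Python. The policy table, roles, and masking rule below are hypothetical illustrations, not ALTR's actual configuration model: the point is that masking is decided per request, at read time, based on who is asking.

```python
def mask_ssn(value):
    # Show only the last four digits, as is common for SSN display.
    return "***-**-" + value[-4:]

# Hypothetical policy: which roles may see each column in the clear.
POLICY = {
    "ssn": {"allowed_roles": {"compliance"}, "mask": mask_ssn},
}

def apply_masking(row, role):
    out = {}
    for column, value in row.items():
        rule = POLICY.get(column)
        if rule and role not in rule["allowed_roles"]:
            out[column] = rule["mask"](value)  # masked for unauthorized roles
        else:
            out[column] = value                # returned unchanged
    return out

row = {"name": "Ada", "ssn": "123-45-6789"}
assert apply_masking(row, "analyst")["ssn"] == "***-**-6789"
assert apply_masking(row, "compliance")["ssn"] == "123-45-6789"
```

Because the underlying stored value never changes, the same table can serve both audiences at once; only the query result differs by role.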
Real-time Data Activity Monitoring
Effective DSPs provide real-time observability over how sensitive data is consumed in the cloud. This includes active alerts for unauthorized requests, allowing immediate response to potential security breaches. Real-time data activity monitoring is essential for maintaining an up-to-date security posture and ensuring that any suspicious activity is detected and addressed promptly.
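A simplified picture of this kind of monitoring in Python: flag query events that touch columns a user is not authorized for, or that return an unusually large row count. The event shape, role table, and threshold are assumptions made for illustration.

```python
# Hypothetical authorization table and anomaly threshold.
AUTHORIZED = {"analyst": {"name", "email"}, "dba": {"name", "email", "ssn"}}
ROW_LIMIT = 10_000

def check_event(event):
    # Inspect one access event and return any alerts it triggers.
    alerts = []
    allowed = AUTHORIZED.get(event["user"], set())
    unauthorized = sorted(set(event["columns"]) - allowed)
    if unauthorized:
        alerts.append("unauthorized columns: " + ", ".join(unauthorized))
    if event["rows"] > ROW_LIMIT:
        alerts.append("row threshold exceeded: %d" % event["rows"])
    return alerts

assert check_event({"user": "analyst", "columns": ["ssn"], "rows": 5}) == \
    ["unauthorized columns: ssn"]
assert check_event({"user": "dba", "columns": ["ssn"], "rows": 50}) == []
```

In practice the events would stream from the platform's query log and the alerts would be forwarded to a SIEM, but the shape of the check is the same.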
Integrated Data Security
Choose a DSP that offers integrated data security from source to cloud. Automated data access governance ensures that sensitive data is never at risk, providing comprehensive protection throughout its lifecycle. Integrated security measures help unify the approach to data protection, ensuring that all aspects of data security are covered and reducing the complexity involved in managing multiple security tools.
User-Friendly Policy Implementation
A good DSP should allow non-technical users to implement policies and simplify data ownership. This ensures that data security processes can remain streamlined and automated without requiring extensive technical knowledge. User-friendly interfaces and straightforward policy implementation tools enable broader participation in data governance, helping to maintain consistent security practices across the organization.
Wrapping Up
Managing data access control is vital for protecting sensitive information, but complexity can create numerous risks and challenges. By recognizing the signs and choosing the right Data Security Platform (DSP), you can create a more robust and manageable data security environment.
In a previous post, Jonathan Sander details the primary differences between a Data Security Posture Management (DSPM) solution and a Data Security Platform (DSP). He highlights that the most notable difference between a DSPM and a DSP is in the “policy definition and policy enforcement” aspects of a DSP. He explains that while some applications allow for simple API calls to manage access or security policies, such as removing a user’s group membership in Active Directory, implementing policy definition and enforcement at a deeper level for platforms like Snowflake becomes exceedingly challenging, if not impossible, for a DSPM.
Recent events have reignited my interest in understanding how ALTR distinguishes itself from a DSPM. The first event was the potential acquisition of Wiz by Google. Wiz, a cloud security posture management (CSPM) tool, is often confused with a DSPM. This has led customers to inquire about the differences between CSPM and DSPM and, subsequently, the distinctions between DSPM and DSP. Although the Wiz/Google deal fell through, it sparked an insightful discussion on LinkedIn initiated by Pramod Gosavi from JupiterOne. I participated in this discussion, which delved into why Google should reconsider buying a tool like Wiz.
The other recent event that brings the DSPM vs. DSP question back into the spotlight is the word ‘remediation’, which some DSPM providers have been using lately. Remediation in this context refers to a DSPM’s ability to react to one of its findings. For example, a remediation might be removing a user’s access from a system or making a public-facing internet resource private. These types of remediations are simple and straightforward and should easily be achievable by a DSPM. Lately, however, some DSPM players have claimed their platforms can perform complex remediations for platforms like Snowflake, such as RBAC, data masking, and data protection operations like encryption or tokenization. This is where the analogy "All squares are rectangles, but not all rectangles are squares" comes in handy. In this scenario, the DSPM is the square, and the DSP is the rectangle. A DSP can perform all the functions of a DSPM, but a DSPM cannot perform all the functions of a DSP. Let me explain.
The largest difference between a DSPM and a DSP is not the type or number of data stores supported, or the workflows within the platforms; it is the integration method with the data stores. DSPs live in the line of fire. We sit in the hardest place a vendor can sit: in the critical path of data. It’s the only way a DSP can provide capabilities like real-time database activity monitoring (DAM), data encryption or tokenization, and data loss prevention. Without this position in the stack, our ability to stop, or remediate, an out-of-policy data access request is minimized.
DSPMs, on the other hand, do not live in the critical path of data access. They often exist outside the normal access patterns, connecting to systems such as databases or file shares without fear of latency or uptime. A DSP has the unfortunate burden of essentially matching the uptime and availability of the platforms it controls, often requiring significant investments in engineering and operations that DSPMs do not have to make. It's these requirements of uptime, throughput, and strict performance metrics that make it nearly impossible for a DSPM to offer value over a DSP when it comes to complex operations in a platform like Snowflake. Since a DSP is already in line with the systems it controls and protects, it is conceivable that a DSP could offer a wide overlap with the features of a DSPM, if it wanted to.
For customers, this means taking the time to understand the specific challenges you need to address for platforms like Snowflake, particularly regarding access controls and security. The multiple layers of roles and attributes assigned to users, the vast amount of data that moves and transforms inside the Snowflake platform daily, and the performance requirements that encryption places on your downstream applications are all complex. These are hard problems for any business, and solving these challenges is what will fully unlock the value of your Snowflake instance.
Wrapping Up
Be cautious of any DSPM that claims to solve the complex governance and security challenges of Snowflake effortlessly. Always request detailed case studies to validate their claims. While it's not necessarily impossible, these claims often resemble a square trying to fit into a rectangle.
Data is the fuel propelling modern business. From customer information to financial records, proprietary data forms the foundation upon which businesses operate and innovate. However, as companies grow and data volumes explode, securing this data becomes exponentially more complex. This is where the importance of scalability in data security comes into sharp focus.
The Scalable Security Imperative
Scalability in data security is not a luxury; it is a necessity. As organizations expand, they generate and collect vast amounts of data. This growth demands a data security solution that can scale seamlessly with the volume, velocity, and variety of data. Organizations expose themselves to heightened risks, increased vulnerabilities, and potential catastrophic breaches without scalable security measures.
Core Pillars of Scalable Data Security
To understand the nuances of scalable security, we must delve into its core pillars: flexibility, performance, automation, and comprehensive coverage.
1. Flexibility
Flexibility is the cornerstone of scalable security. A rigid security solution that cannot adapt to changing needs and expanding data environments is destined to fail. Scalable security solutions must be flexible enough to integrate with a wide array of data sources, applications, and infrastructures, whether on-premises, in the cloud, or hybrid environments.
Flexibility also means accommodating varying security policies and compliance requirements. As regulations evolve and new threats emerge, a scalable security platform must allow for rapid adjustments to policies and controls without disrupting operations.
2. Performance
As data volumes grow, maintaining performance is crucial. Security measures that introduce latency or degrade performance are counterproductive and can hinder business operations and user experience. Scalable data security solutions must be designed to handle high throughput and large-scale environments without compromising on speed or efficiency.
Performance in scalable security also involves optimizing resource utilization. Efficient use of computational resources ensures that security operations, such as encryption, decryption, and monitoring, do not become bottlenecks as data scales.
3. Automation
Automation is a critical component of scalability in data security. Manual processes are time-consuming, error-prone, and incapable of keeping up with the dynamic nature of modern data environments. For instance, manually writing and maintaining SQL queries for data access control can be labor-intensive and prone to mistakes. Scalable security platforms leverage automation to ensure continuous protection without requiring constant human intervention.
Automated access policies, tokenization, and policy enforcement allow organizations to scale their security operations in line with their data growth. This automation enhances security posture and frees up valuable human resources to focus on strategic initiatives.
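To make this concrete, here is a minimal sketch of what automated, policy-driven access control can look like in Snowflake using tag-based masking. All object, tag, and role names are illustrative; once the policy is bound to the tag, any column carrying the tag is masked automatically, with no per-column SQL to maintain.

```sql
-- Sketch of tag-based masking (illustrative names throughout).
CREATE TAG governance.tags.pii;

CREATE MASKING POLICY governance.policies.mask_pii AS (val STRING)
  RETURNS STRING ->
  CASE WHEN CURRENT_ROLE() IN ('DATA_ADMIN') THEN val ELSE '*****' END;

-- Bind the policy to the tag once; it now applies everywhere the tag does.
ALTER TAG governance.tags.pii
  SET MASKING POLICY governance.policies.mask_pii;

-- Tagging a column becomes the only manual step.
ALTER TABLE sales.public.customers
  MODIFY COLUMN email SET TAG governance.tags.pii = 'email';
```

The design point is that policy logic lives in one place (the tag binding) rather than being copied into per-table SQL, which is exactly the kind of maintenance burden automation removes.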
4. Comprehensive Coverage
Scalable security requires comprehensive coverage across all data assets and environments. It is insufficient to secure only certain parts of the data ecosystem while leaving others vulnerable. A genuinely scalable security solution provides end-to-end protection, encompassing data at rest, in transit, and in use.
Comprehensive coverage also means detecting and mitigating threats across the entire attack surface. This includes monitoring for insider threats, external attacks, and vulnerabilities within the data infrastructure. Scalable security platforms employ advanced analytics and machine learning to provide real-time insights and proactive threat management.
The Nuances of Scalable Security
The complexity of scalable security lies in its ability to balance the varying demands of growth, performance, and protection. Here are some critical nuances to consider:
Future-Proofing
Scalable security solutions must be designed with future growth in mind. This involves anticipating the increase in data volume and users, the evolution of threat landscapes, and regulatory requirements. Future-proofing ensures that security investments remain practical and relevant as the organization evolves.
Interoperability
Interoperability is critical in a diverse data ecosystem. Scalable security platforms must seamlessly integrate with existing tools, applications, and processes. This integration capability ensures that security measures do not operate in silos but rather enhance the overall security posture through cohesive and collaborative defenses.
Cost-Effectiveness
As data scales, so do the costs associated with securing it. Scalable security solutions must provide a cost-effective approach to protection, balancing the need for robust security with budget constraints. One approach is to leverage native architectures to manage costs effectively.
The Stakes of Inadequate Scalability
The consequences of failing to implement scalable security measures are dire. As data grows unchecked by scalable security, organizations face an increased risk of data breaches, regulatory fines, and reputational damage. Here are some potential pitfalls:
Data Breaches
Without scalable security, the likelihood of data breaches increases significantly. Cybercriminals exploit vulnerabilities in outdated or inadequate security measures, leading to unauthorized access, data theft, and financial losses.
Regulatory Non-Compliance
Data protection regulations are becoming increasingly stringent. Organizations that fail to scale their security measures in accordance with these requirements risk non-compliance, which can result in hefty fines and legal repercussions.
Operational Disruptions
Inadequate security scalability can lead to operational disruptions. Performance bottlenecks, system downtime, and compromised data integrity can impede business operations, leading to loss of productivity and revenue. Additionally, when security measures fail to scale, legitimate users may be unable to access critical data, causing further delays and hindering decision-making processes. This not only frustrates employees but also hampers overall business efficiency and agility.
Wrapping Up
In a world where data is both a valuable asset and a potential liability, the importance of scalable security cannot be overstated. As businesses continue to expand and generate more data, the need for robust, scalable security measures will only become more critical. Embracing scalable security is about protecting data today and preparing for tomorrow's challenges. The time to act is now.
Imagine waking up to the news that your company's sensitive data has been compromised, all due to stolen credentials. With recent high-profile data breaches making headlines, this nightmare scenario has become all too real for many organizations. The stakes are higher than ever, and ensuring robust security measures to protect your sensitive data in Snowflake is not just important—it's essential.
Snowflake's white paper, "Best Practices to Mitigate the Risk of Credential Compromise," is your roadmap to fortified security. This comprehensive guide reveals how to leverage Snowflake's native platform features to enforce strong authentication and mitigate the ever-present risks associated with credential theft. This blog will dive into the key takeaways and best practices recommended by Snowflake to safeguard your organization's data.
The Pillars of Security
Snowflake's approach to security is built on three key pillars:
Prompt
Encourage users to adopt security best practices, such as configuring multifactor authentication (MFA). This proactive approach ensures that users are aware of security protocols and actively engage with them. It's about creating a culture of security and mindfulness.
Enforce
Enable administrators to enforce security measures by default. This means implementing policies that automatically apply security best practices across the board, reducing the likelihood of human error or oversight.
Monitor
Provide visibility into security policy adherence. Monitoring ensures that security measures are not just in place but are being followed and are effective. Continuous visibility allows for timely adjustments and responses to potential threats.
By grounding its security framework in these pillars, Snowflake ensures a comprehensive approach to protecting sensitive data from unauthorized access.
Best Practices for Enforcing Authentication and Network Policies
To safeguard your Snowflake account, it's crucial to follow these essential steps:
1. Create Authentication Policies for Service Users
Use key pair or OAuth for programmatic access and enforce this through authentication policies. Service accounts, which are often targeted by attackers, should have the most stringent security measures. By using key pairs or OAuth, you ensure a higher security level than traditional username/password combinations.
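As a sketch of what this looks like in practice, an authentication policy can restrict a service user to key pair and OAuth only, so a stolen password is useless against that account. The policy and user names here are illustrative.

```sql
-- Sketch: lock a service user to non-password authentication methods.
CREATE AUTHENTICATION POLICY service_auth_policy
  AUTHENTICATION_METHODS = ('KEYPAIR', 'OAUTH');

-- 'etl_svc' is a hypothetical service account.
ALTER USER etl_svc SET AUTHENTICATION POLICY service_auth_policy;
```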
2. Enforce MFA for Human Users
Leverage your own SAML identity providers with MFA solutions. For added security, enforce Snowflake's native MFA for users relying on native passwords. MFA adds an additional layer of security, making it significantly harder for attackers to gain access using stolen credentials.
3. Establish Robust Password Policies
Implement stringent password requirements and regular password changes. Strong passwords and regular updates reduce the risk of password-based attacks. Policies should include guidelines on password complexity and the frequency of changes.
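A password policy along these lines might be defined as follows; the specific thresholds are illustrative and should be tuned to your own compliance requirements.

```sql
-- Sketch: a stringent account-level password policy (values illustrative).
CREATE PASSWORD POLICY strict_pw_policy
  PASSWORD_MIN_LENGTH = 14
  PASSWORD_MIN_UPPER_CASE_CHARS = 1
  PASSWORD_MIN_NUMERIC_CHARS = 1
  PASSWORD_MIN_SPECIAL_CHARS = 1
  PASSWORD_MAX_AGE_DAYS = 90
  PASSWORD_MAX_RETRIES = 5;

ALTER ACCOUNT SET PASSWORD POLICY strict_pw_policy;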
4. Implement Session Policies
Define policies to enforce reauthentication after periods of inactivity. This helps to minimize the risk of unauthorized access from inactive sessions. Policies should specify session timeout periods and conditions for reauthentication.
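A minimal sketch of such a session policy, with an illustrative 30-minute idle timeout:

```sql
-- Sketch: force reauthentication after 30 idle minutes (value illustrative).
CREATE SESSION POLICY reauth_policy
  SESSION_IDLE_TIMEOUT_MINS = 30
  SESSION_UI_IDLE_TIMEOUT_MINS = 30;

ALTER ACCOUNT SET SESSION POLICY reauth_policy;
```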
5. Apply Account-level Network Policies
Restrict access to authorized and trusted sources only. By defining network policies, you can ensure that only trusted IP addresses and networks can access your Snowflake account, reducing the attack surface.
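As a sketch, a network policy pins the account to trusted ranges; the CIDR blocks below are placeholders for your own corporate or VPN ranges.

```sql
-- Sketch: allow connections only from trusted ranges (CIDRs are placeholders).
CREATE NETWORK POLICY corp_only
  ALLOWED_IP_LIST = ('203.0.113.0/24', '198.51.100.0/24');

ALTER ACCOUNT SET NETWORK_POLICY = corp_only;
```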
6. Protect Service Users
Differentiate between human and service users by setting user types, which helps in applying appropriate security measures. Service users often have elevated permissions, making them prime targets for attacks. By categorizing them appropriately, you can apply stricter security controls.
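In Snowflake this differentiation is a one-line change per user; 'etl_svc' below is a hypothetical account name.

```sql
-- Sketch: mark an account as a service user so interactive-only controls
-- (such as password login and MFA prompts) are scoped appropriately.
ALTER USER etl_svc SET TYPE = SERVICE;
```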
7. Apply and Test Policies
Apply password and session policies at the account level and test service users to ensure their effectiveness. Regular testing and validation of policies help identify potential gaps and ensure that security measures are working as intended.
8. Enforce Account-Level MFA
Apply MFA enforcement policies to ensure all human interactive users use MFA. This universal application of MFA ensures that every user accessing the system is authenticated through multiple factors, significantly enhancing security.
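A sketch of an account-level MFA enforcement policy, assuming password-based human users; as with the earlier examples, the policy name is illustrative.

```sql
-- Sketch: require MFA enrollment for human users across the account.
CREATE AUTHENTICATION POLICY require_mfa
  MFA_ENROLLMENT = REQUIRED
  MFA_AUTHENTICATION_METHODS = ('PASSWORD');

ALTER ACCOUNT SET AUTHENTICATION POLICY require_mfa;
```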
9. Leverage Snowflake's Trust Center
Utilize Snowflake's Trust Center to monitor MFA and network policy enforcement continuously. Monitoring helps maintain a robust security posture by providing insights into policy adherence and identifying areas for improvement. Additionally, consider CIS benchmarks for industry-standard security practices and guidelines.
Wrapping Up
The digital landscape is fraught with threats, and credential compromise remains a top concern for organizations. Implementing the best practices outlined here is your first line of defense. However, it's not enough to set these measures and forget them. Continuous vigilance, regular updates, and a proactive stance are crucial.
Snowflake is your ally in this ongoing battle, providing the necessary tools and insights to effectively monitor and enforce security policies. By leveraging Snowflake's robust security framework, you can ensure your organization stays ahead of potential threats.
In today's hyper-connected world, businesses thrive on data. Every transaction, customer interaction, and strategic decision is driven by the vast amounts of information collected and stored. This data fuels innovation, enhances customer experiences, and propels growth. Yet, with this immense power comes a chilling reality: data breaches are an ever-present threat. From stolen customer information to compromised intellectual property, the consequences for businesses can be catastrophic. As these threats escalate, the burning question remains - how much data security is truly enough for your business?
Unfortunately, the answer is frustrating – there might not be a magic number. Here's why:
The Impenetrability Illusion
Imagine a bank vault guarded by the most advanced security system. This is the traditional security mindset – an impenetrable fortress. However, cyberattacks are a relentless foe, constantly evolving to exploit new vulnerabilities faster than patches can be deployed. No system is truly invincible.
The Security-Usability Tightrope
The ideal security system for a business might resemble Fort Knox, but that's not practical for everyday operations. Requiring retinal scans, fingerprints, voice verification, and a complex 30-character password just to access your company's internal systems would be excessively secure but also frustrating and inefficient for employees. Striking a balance between robust security and user-friendly access controls is crucial for businesses to navigate the security-usability tightrope effectively. Companies must implement security measures that protect sensitive data without impeding productivity or causing undue stress for users.
The Cost Conundrum
Investing in a million-dollar security system might make sense for a financial institution safeguarding sensitive data, but it would be overkill for a small business. Security measures come with a price tag – software, hardware, and trained personnel. The cost of these measures must be weighed against the potential damage of a breach. Prioritizing security investments based on the specific risks and needs of the business is crucial to ensure that resources are used effectively and efficiently. Companies must find the right balance between adequate protection and financial feasibility.
The Insider Threat
Imagine a trusted employee leaking sensitive data. Even the most sophisticated security cannot defend against disgruntled employees or social engineering attacks. Human error and malicious intent are ever-present dangers. Security awareness training and a culture of data responsibility are essential.
The Evolving Threat Landscape
Hackers continuously shift tactics, from brute-force attacks to phishing campaigns to exploiting software vulnerabilities. As these threats evolve, security measures must also be dynamic and adaptable. Businesses must treat security as a fluid process, constantly changing to counter new and emerging threats effectively. This continuous adaptation is essential for staying ahead in the ever-changing landscape of cyber threats.
The Data Value Spectrum
Not all data is created equal. Financial records, medical information, and intellectual property require the highest level of security. Less sensitive data, like movie preferences, can be protected with less stringent measures. Security needs to be tailored based on data value.
So, what's the answer?
Perhaps it's not about achieving "enough" security but adopting a proactive security posture. This posture acknowledges the inherent risks, prioritizes data based on value, and employs a multi-layered defense strategy.
The Pillars of a Proactive Security Posture
While absolute security may be a myth, building a robust security posture can significantly reduce the risk of breaches and minimize damage if one occurs. Here are the key pillars of this approach, expanded for a deeper understanding:
Defense in Depth
Imagine a castle with a moat, drawbridge, and heavily fortified walls. This layered approach is the essence of defense in depth. It involves deploying a variety of security controls at different points within a system. Firewalls act as the first line of defense, filtering incoming and outgoing traffic. Access controls ensure that only authorized users can access specific data. Encryption scrambles data at rest and in transit, making it unreadable even if intercepted.
This layering creates redundancy. If one control fails, others can still impede attackers. Additionally, it makes a complete breach significantly more difficult. Hackers must bypass multiple layers, considerably increasing the time and effort required for a successful attack.
Assume Breach
Security needs a "fire drill" mentality. We must assume a breach will occur and have a well-defined incident response plan in place. This plan outlines the steps to take upon detecting a breach, such as isolating compromised systems, containing the damage, notifying authorities, and restoring affected data. A well-practiced plan minimizes downtime, data loss, and reputational damage.
Continuous Monitoring
Security isn't a one-time fix; it's a continuous process requiring constant vigilance. This entails regularly scanning systems for vulnerabilities, updating software with the latest security patches, and educating employees about cybersecurity best practices. By continuously monitoring systems and fostering a culture of security awareness, businesses can significantly reduce the risk of successful attacks and ensure their data security remains robust and adaptive to evolving threats.
Security by Design
Integrating security considerations into every stage of the product or system development life cycle is crucial. Security features shouldn't be an afterthought bolted onto a finished product but should be an integral part of the design and development process from the very beginning. This proactive approach ensures that security is woven into the fabric of the system, providing a more robust, more resilient defense against potential threats.
Wrapping Up
In an era where data breaches are not a matter of if but when, businesses must adopt a proactive and holistic approach to data security. The question of how much data security is enough is not about reaching an endpoint but about creating a resilient and adaptive security posture. It's about balancing cost with risk, leveraging technology while addressing the human element, and continuously evolving to meet new challenges. In the end, the right amount of security is the amount that protects your business, your customers, and your reputation in an increasingly hostile digital landscape.
Recently, a significant data exfiltration event targeting Snowflake customer databases came to light, orchestrated by a financially motivated threat actor group, UNC5537. This group successfully compromised numerous Snowflake customer instances, resulting in data theft and extortion attempts. It's important to note that Mandiant's thorough investigation found no evidence suggesting that the cyber threats originated from Snowflake's own environment. Instead, every incident was traced back to compromised customer credentials.
In this blog post, we’ll dive into the key takeaways from Mandiant’s investigation. We’ll also share some actionable insight to bolster your data security – because staying alert and proactive is your best defense in safeguarding your organization’s data integrity.
Key Findings
Credential Compromise
The attacks primarily involved the use of stolen customer credentials, leading to unauthorized access and data theft.
Threat Hunting Guidance
Mandiant provided comprehensive threat hunting queries to detect abnormal and malicious activities, which are crucial for identifying potential incidents.
Common Attack Patterns
- Roles and Permissions Changes: Attackers frequently used the SHOW GRANTS command to enumerate resources and adjust permissions, enabling broader access.
- Abnormal Database Access: Unusual spikes in access to databases, schemas, views, and tables were noted, indicating potential reconnaissance or data exfiltration activities.
- User and Query Analysis: Identifying patterns in user creation, deletion, and query frequencies helped in detecting anomalous behaviors.
- Error Rate Analysis: High error rates in query executions often indicated brute force attempts or misconfigured accounts used by attackers.
- High Resource Consumption: Large volumes of data queries and compression activities were linked to data staging and exfiltration efforts.
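The patterns above can be hunted with queries against Snowflake's ACCOUNT_USAGE views. The sketch below is not one of Mandiant's published queries, but it illustrates the approach for the first pattern: surfacing users who run an unusual number of SHOW GRANTS statements. The lookback window and threshold are illustrative and should be baselined against your own environment.

```sql
-- Sketch: users running many SHOW GRANTS statements in the past week.
SELECT user_name,
       COUNT(*) AS show_grants_count
FROM snowflake.account_usage.query_history
WHERE start_time >= DATEADD('day', -7, CURRENT_TIMESTAMP())
  AND query_text ILIKE 'SHOW GRANTS%'
GROUP BY user_name
HAVING COUNT(*) > 50   -- illustrative threshold; baseline your own
ORDER BY show_grants_count DESC;
```

Similar queries over QUERY_HISTORY can flag high error rates, anomalous query volumes, and large result sizes for the other patterns.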
4 Critical Recommendations to Enhance Snowflake Security
Given these findings, it's imperative for Snowflake users to bolster their security measures. Here are some critical steps:
- Implement Multi-Factor Authentication (MFA): Ensure MFA is enabled for all user accounts to prevent unauthorized access even if credentials are compromised.
- Regular IAM Reviews: Conduct frequent reviews of roles and permissions to detect and mitigate any unauthorized changes.
- Enhanced Monitoring: Use advanced monitoring tools such as database activity monitoring (DAM) to track abnormal access patterns, high error rates, and unusual resource consumption.
- Threat Hunting Practices: Regularly perform threat hunting exercises using the guidance provided by Mandiant to stay ahead of potential issues.
Ask Yourself these Questions
As you reflect on the recent incidents, it's crucial to consider the broader implications for your organization's security. To ensure you are well-prepared and resilient against emerging threats, consider the following questions:
1. Are your current security measures sufficient to detect and prevent unauthorized access, especially from compromised credentials?
2. How often do you review and update your access controls and permissions? Is this easy to do for your business?
3. Do you have robust monitoring in place to detect unusual activities and high error rates in real-time?
4. What proactive threat detection strategies are you employing to identify potential issues before they cause significant damage?
By addressing these questions and strengthening your security posture, you can better protect your Snowflake environment from similar threats. If you're looking to enhance your data security capabilities or you are not confident in your answers to the above questions, consider investing in advanced data security software purpose-built for Snowflake. ALTR’s solutions offer comprehensive protection, continuous monitoring, and proactive threat detection to safeguard your valuable data assets.
Would you like to explore how our data security solutions can help you secure your Snowflake environment? Contact ALTR today to learn more and schedule a demo.
Data, its meticulous management, stringent security, and strict compliance have become pivotal to businesses' operational integrity and reputation across many sectors. However, the intricate maze of evolving compliance laws and regulations, as we discussed in a recent blog, poses a formidable challenge to data teams and stakeholders. This dynamic regulatory environment complicates the already intricate workflows of data engineers, who stand on the frontlines of ensuring data compliance, constantly navigating through a sea of changes to maintain adherence.
The Compliance Conundrum
The landscape of data compliance has shifted from a mere checkbox exercise to a continuous commitment to safeguarding data privacy and integrity. The advent of stringent regulations like the General Data Protection Regulation (GDPR) in Europe and the California Consumer Privacy Act (CCPA) in the United States, among others, has escalated the stakes. Each regulation has its unique set of demands, and failure to comply can lead to severe repercussions, including substantial fines and a damaged reputation. A recent study from Drata found that 74% of organizations state compliance is a burden, and 35% spend 1,000 to 4,999 hours on compliance activities.
For data engineers, this presents an incredibly daunting task. They are tasked with the critical responsibility of ensuring that the data architectures they develop, the databases they oversee, and the analytics they perform are in strict alignment with a complex array of regulations that vary not only by jurisdiction but also by the nature of the data. This requires a vigilant eye on the ever-changing regulatory landscape, an in-depth understanding of each law, and a clear comprehension of its applicability to the data they manage. This constant state of monitoring and adaptation disrupts standard workflows, delays projects, and introduces a layer of uncertainty into data operations.
Navigating Through With Automation and Scalable Data Security
Amid these challenges, automation and scalable data security shine as beacons of hope, promising to alleviate the burden on data engineers and enable them to concentrate on their core tasks.
Data Classification: The Starting Point
The critical process of data classification is at the heart of any robust data security and compliance strategy. It tackles the initial hurdle of deciphering which regulations apply to specific data sets by identifying and categorizing data based on sensitivity. Automating this foundational step ensures that data is consistently managed in line with its classification, simplifying the maze of compliance with regulations like GDPR and CCPA.
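In Snowflake, for example, this step can be automated with the platform's built-in classification, which scans a table and tags sensitive columns for downstream policy enforcement. The table name below is illustrative.

```sql
-- Sketch: scan a table and automatically tag columns detected as sensitive.
CALL SYSTEM$CLASSIFY('sales.public.customers', {'auto_tag': true});
```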
Dynamic Data Masking: Protecting Data in Real-Time
Dynamic Data Masking (DDM) emerges as a practical solution for the real-time protection of sensitive data, ensuring it remains accessible only to those with authorization. This tool is particularly pertinent to complying with regulations demanding strict data privacy and access controls, allowing data engineers to implement scalable data access policies without altering the actual data.
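As a minimal sketch of DDM in Snowflake, a masking policy can reveal email addresses only to an authorized role while everyone else sees a redacted value; the role, policy, and table names are illustrative.

```sql
-- Sketch: mask email addresses for all but an authorized role.
CREATE MASKING POLICY mask_email AS (val STRING)
  RETURNS STRING ->
  CASE
    WHEN CURRENT_ROLE() IN ('PII_READER') THEN val
    ELSE REGEXP_REPLACE(val, '.+@', '*****@')
  END;

ALTER TABLE sales.public.customers
  MODIFY COLUMN email SET MASKING POLICY mask_email;
```

Because the policy is evaluated at query time, the underlying data is never altered, which is what makes this approach scale across consumers with different entitlements.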
Database Activity Monitoring: The Watchful Eye
The continuous surveillance of database activities through Database Activity Monitoring is crucial for maintaining compliance. It enables the early detection of unauthorized access or anomalous data handling, which could indicate potential breaches or non-compliance. This tool is instrumental in keeping an audit trail, a prerequisite for many data protection regulations, ensuring any deviations from standard data access patterns are promptly addressed.
Tokenization: Minimizing Exposure Risk
Tokenization is a formidable shield for susceptible data types, such as Personal Health Information (PHI) or Payment Card Information (PCI), often under stringent regulatory scrutiny. By substituting sensitive data with non-sensitive equivalents, tokenization significantly reduces the risk of data exposure. It eases the compliance burden by narrowing the scope of data subjected to the most stringent regulations.
Format Preserving Encryption: Balancing Security and Usability
Format Preserving Encryption (FPE) allows organizations to secure data while preserving its usability, an essential factor for operational systems bound by data protection regulations. FPE ensures encrypted data remains functional within applications without modification, thus supporting compliance efforts by safeguarding data without hindering business processes.
Open Source Integrations: Streamlining Compliance
Integrating open-source tools for data governance facilitates a smoother compliance journey by automating and simplifying data management tasks. These integrations ensure consistent data handling practices, enhance data quality, and foster a comprehensive data governance framework capable of adapting to evolving regulations, thereby bolstering an organization's compliance posture in a scalable and efficient manner.
How Streamlined Compliance Fuels Business Growth
Navigating data compliance with automation and advanced data management brings significant benefits beyond mere regulatory adherence, enhancing operational efficiency and competitive positioning.
Accelerated Project Delivery
Automating compliance tasks liberates data engineers to concentrate on their core functions, significantly speeding up project timelines. Automation facilitates rapid adaptation to regulatory changes and maintains a constant state of compliance readiness, boosting productivity and enabling businesses to respond swiftly to market demands.
Elevated Data Quality
Implementing precise data classification and stringent access controls reduces the risk of errors and inconsistencies. This ensures a steady flow of accurate and reliable data through organizational pipelines, crucial for informed decision-making and maintaining operational integrity.
Competitive Edge
In today's data-sensitive environment, a strong reputation for data security and compliance can enhance customer trust and loyalty, offering a distinct competitive advantage. Demonstrable data protection meets regulatory requirements and fosters customer retention and brand differentiation, turning compliance into a strategic business asset.
Wrapping Up
While the ever-evolving landscape of compliance laws poses significant challenges, the path forward isn't about memorizing every regulation but about leveraging technology to create a culture of informed compliance. This allows data engineers to shift their focus from frantic firefighting to strategic data management, ultimately unlocking the true potential of the information they hold.
When talking to customers about data protection in Snowflake, a few things get a little mixed up with one another. Snowflake’s Tri-Secret Secure and masking are sometimes considered redundant with ALTR’s tokenization and format-preserving encryption (FPE) - or vice versa. What we’ll do in this piece is untangle the knots by clarifying what each of these is, when you would use each, and the advantages you have because you can choose which option to apply to each challenge you come across.
Snowflake’s Tri-Secret Secure is a built-in feature, and it requires that your Snowflake account is on the Business Critical Edition. Tri-Secret is a hybrid of the “bring your own key” (BYOK) and “hold your own key” (HYOK) approaches to using customer-managed keys for the encryption of data at rest. [ProTip for the Snowflake docs: Tri-Secret Secure is essentially a brand name for the customer-managed keys approach, and the docs read a little more clearly with that in mind.] When you use customer-managed keys, there is often a choice between supplying the key to the third party (Snowflake in this case) on an ongoing basis or only providing it when needed – BYOK and HYOK respectively. Snowflake effectively combines these approaches by having you provide an encrypted version of the key, which can only be decrypted when Snowflake calls back to your key management system. So, you bring an encrypted version of the customer-managed key to Snowflake but hold the key that can decrypt it. Tri-Secret is used for the actual files that rest on disks in your chosen Snowflake cloud provider and is transparent data encryption – meaning users don’t need to be aware of the encryption involved. It protects the files on disk without affecting anything at run time.
Snowflake’s Dynamic Data Masking is a very simple yet powerful feature. This feature requires Enterprise Edition (or higher). When a masking policy is used to protect a column in Snowflake, at run time, a decision is made to return either the contents of a column or a masked value (e.g., a set of “****” characters). You can apply this protection to a column either directly as a column policy or via a tag placed on a column associated with a tag-based policy. When you need to ensure that certain individuals can never see the legitimate values in a column, then Dynamic Data Masking is a perfect solution. The canonical example is ensuring that the database administrators can never see the values of sensitive information when performing administrative tasks. However, there are slightly more complex instances of hiding information where masking falls short. You can easily imagine a circumstance where users may be identifiable across many tables by values that are sensitive (e.g., credit card numbers, phone numbers, or government ID numbers). You want users doing large analytics work to be able to join these objects by the identifiers, but simultaneously, you’re obligated to protect the values of those identifiers in the process. Clearly, turning them into a series of “***” won’t do that job.
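To make the run-time decision concrete, here is a minimal sketch of what a dynamic-masking check conceptually does. This is not Snowflake's implementation (real masking policies are defined in SQL and enforced by Snowflake's engine); the role names and helper function are hypothetical, purely for illustration.

```python
# Hypothetical sketch of a dynamic-masking decision: at query time,
# return the real column value only to roles the policy allows,
# and a masked placeholder to everyone else.

UNMASKED_ROLES = {"FINANCE_ADMIN", "COMPLIANCE_AUDITOR"}  # example policy

def apply_masking(role: str, value: str) -> str:
    """Return the column value for privileged roles, '****' otherwise."""
    return value if role in UNMASKED_ROLES else "****"

# A DBA performing maintenance never sees the sensitive value:
print(apply_masking("SYSADMIN", "4111-1111-1111-1111"))       # prints ****
print(apply_masking("FINANCE_ADMIN", "4111-1111-1111-1111"))  # prints the real value
```

Note that because every unauthorized viewer gets the same "****" string, any two masked columns become indistinguishable, which is exactly why joins on masked identifiers break.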
This is where ALTR’s Tokenization and Format-Preserving Encryption (FPE) enter the story. We could spend hours parsing the debate about whether tokenization is a superclass of FPE, the reverse, or neither; there are people with strong arguments on every side. We’ll focus on the simpler questions of what each feature is and when it is best applied. First, let’s define what they are:
- Tokenization replaces values with tokens in a deterministic way. You can rely on the fact that if a value “12345” in a cell is replaced by the token “notin” in one table, then every occurrence of “12345” in any other table will also be replaced by “notin.” So you can join the two tables by those cells and get the correct result. A key concept here is that the token (“notin” in this example) contains no data about the original value in any way. It is a simple token that you swap in and out.
- Format-Preserving Encryption (FPE) is like tokenization in that you’re also swapping values. However, the “tokens” in this case are created through an encryption process where the resulting value maintains both the information and its format. FPE might replace a phone number value of “(800) 416-4710” with “(201) 867-5309.” Like tokens, that replacement is consistent, so it can be used in joins and other cross-object operations. Unlike tokens, these values are in the same “format” (hence the name, and the reason the encrypted phone number looks exactly like a different phone number), which means they remain usable in applications and other upstream operations without any code changes. In other words, FPE won’t break anything; it only protects information.
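The determinism property described above is the key to join-safety. Here is a toy illustration of a deterministic tokenizer; this is emphatically not ALTR's production design (which is a vaulted tokenization service, not a keyed hash), and the key and function names are invented for the sketch. The point it demonstrates is that the same input always yields the same token, so joins line up, while the token itself reveals nothing about the original.

```python
import hashlib
import hmac

# Toy deterministic tokenizer (illustrative only): a keyed hash stands
# in for a real token-vault lookup.
SECRET = b"demo-key"  # hypothetical key, for the sketch only

def tokenize(value: str) -> str:
    """Map a value to a short opaque token, deterministically."""
    return hmac.new(SECRET, value.encode(), hashlib.sha256).hexdigest()[:10]

t1 = tokenize("12345")   # token seen in table A
t2 = tokenize("12345")   # same underlying value in table B
assert t1 == t2          # joining on the tokens still works
assert t1 != "12345"     # but the token carries none of the original data
```

A masked value like "****" would fail the first assertion for distinct rows, which is precisely the gap tokenization fills.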
ALTR has both Tokenization and Format-Preserving Encryption solutions for Snowflake, which are cloud-native and immensely scalable. In other words, they can both keep up with the insane scale demands of Snowflake workloads. The application-friendly FPE often seems like the only solution you need at first glance. However, there are reasons for choosing to use only Tokenization or perhaps both Tokenization and FPE in combination. The most common reason for going Tokenization only is due to regulatory constraints. Since the ALTR Tokenization solution can be run in a separate PCI scope, it gives folks the power to leverage Snowflake for workloads that need PCI data without having to drag Snowflake as a whole into PCI auditing scope. The most common reason we see folks run both Tokenization and FPE together is to stick to a strict least-privilege model of access. Since Tokenization removes all the information about the data it protects, some will choose to tokenize data while it flows through pipelines into and out of Snowflake and transform it to FPE while inside Snowflake to get the most out of the data in the trusted data platform.
Hopefully, it’s clear by now that the answer to the question “Which one of these should I use?” is: it depends. If you’re already on Snowflake’s Business Critical Edition, then using Tri-Secret Secure is a no-brainer; the extra costs involved are nominal, and the extra protection afforded is substantial. The real questions come when applying Snowflake’s Dynamic Data Masking and either of ALTR’s Tokenization and Format-Preserving Encryption (FPE). Masking is a great option for many administrative use cases. If you need to hide data from users who don’t need to perform cross-object operations like joins, then masking is easily the best choice. The moment joins or similar operations are needed, ALTR’s Tokenization and FPE are the right places to turn. Picking between them is mostly a matter of technical questions. If you have concerns about application compatibility with the protected data, then FPE is your choice. If you want to keep the protected data away from the data platform, then Tokenization is the best option, since FPE runs natively in Snowflake. And there are clearly times when workloads are complex enough that all of these can be used in combination for the best results. You’ve got all the options you could ever need for Snowflake data protection. So now it’s time to get to work making your data safer than ever.
On June 10, 2024, cybersecurity research and response firm Mandiant published its findings from the ongoing investigation into stolen customer data, news that first broke publicly on May 31, 2024, with reports involving Ticketmaster and Santander Bank.
Mandiant reports, “Mandiant’s investigation has not found any evidence to suggest that unauthorized access to Snowflake customer accounts stemmed from a breach of Snowflake's enterprise environment. Instead, every incident Mandiant responded to associated with this campaign was traced back to compromised customer credentials.”
If there is any relief for Snowflake customers, it’s that Snowflake’s platform itself was not compromised – a breach that could have exposed more than 9,000 customer data sets. Instead, Mandiant reports that 165 companies were potentially exposed. Why is this good news? It means Snowflake remains a safe platform to store and use your data. But as with any other cloud-based service, you must take steps to protect your data beyond what the vendor does for you, and understanding what you can do to strengthen your defenses is crucial.
There are many ways to frame data security responsibilities in cloud-based SaaS systems; we’ll borrow Gartner’s. The above diagram breaks down the responsibilities of the customer and vendor for IaaS, PaaS, and SaaS. Snowflake fits best in the SaaS pillar, and Snowflake’s nine security responsibilities for data and systems are shown in green, indicating they are unaffected by this incident. The two responsibilities in blue, People and Data, remain under the control of Snowflake’s customers.
Customers are responsible for what data they put in Snowflake, which users they allow access to this data, and how that access is controlled. But Snowflake does not entirely leave the People and Data responsibilities squarely on their customers. They recognize the importance of keeping data safe and have built industry-leading security and governance capabilities that they provide to customers of all sizes. From role-based access controls (RBAC) to dynamic data masking, network access restrictions, and more, Snowflake helps customers with the remaining two security responsibilities of People and Data.
So why did this data exposure happen if Snowflake is fulfilling its responsibilities and assisting customers with theirs? Managing People, Data, and security is challenging regardless of an organization’s size or maturity. This is where ALTR comes in.
About ALTR
ALTR is a Data Security Platform specifically designed to help customers with their two data security responsibilities. ALTR does two things to help customers manage their People and Data: automate and scale the powerful Snowflake-provided native security and governance capabilities mentioned above and extend Snowflake’s security capabilities with Active Security measures.
For the first part, ALTR can connect to your Snowflake account, leverage data classification or Snowflake Object Tagging, and ensure that only authorized users can access data according to company policy. All this happens without writing a single line of code. Your data people don’t have to become security experts, and your security people don’t have to learn SQL. This does not replace the built-in Snowflake capabilities – it depends on them. Snowflake’s enforcement layer is still the engine for applying the advanced ALTR capabilities, including RBAC, dynamic data masking, and row-access policies, to name a few.
ALTR also provides detailed information and reporting for data and infosec teams to prove they follow data access compliance rules and deliver that reporting in near real-time.
ALTR’s Active Security capabilities are used by Snowflake’s most sensitive and regulated businesses to ensure Snowflake is safe for PII, PHI, and PCI information. However, these capabilities are not limited to large or mature businesses; Active Security can help a small or young company secure even a single row of customer data in Snowflake.
Active Security includes Database Activity Monitoring, Data Access Rate Limiting (Thresholding), and cell-level data protection in the form of encryption or tokenization.
Database Activity Monitoring
Database Activity Monitoring adds near-real-time logging and alerting capabilities to Snowflake, where native logging can be delayed by as much as four hours after access. ALTR can send data access logs to security teams for analysis and processing within seconds. This dramatic reduction in latency is difficult to achieve at Snowflake scale but is necessary to keep the most sensitive data in Snowflake. Customers can be alerted in near real-time, within seconds of access, to check whether these accesses are valid or suspicious.
Data Access Rate Limiting, or Thresholding
Data Access Rate Limiting, or Thresholding, is a patented feature exclusive to ALTR that can stop data access in real time, even with valid credentials. Customers can set a policy indicating how much data a particular user can consume in a given period. Once a user reaches their limit, their access to that data is blocked.
No other data access is limited for that user, and no other users are impacted by a single user reaching their limit. Users can log in to Snowflake, but if the limit has been met for the day, no more data will flow to that user. When combined with ALTR’s Database Activity Monitoring, customers can be alerted instantly when a user has reached their limit and decide what to do with that user.
Cell-level Data Protection
Cell-level data protection takes the same type of on-disk data protection that Snowflake provides with Tri-Secret Secure (TSS) and extends it deeper into the data. The purpose of cell-level protection with encryption or tokenization is to remove the single-party risk of Snowflake holding both the data and the encryption keys by adding a second (or even third) party to the equation. In this way, compromising a Snowflake user account does not necessarily mean the data can be compromised, making Snowflake safer.
With ALTR’s tokenization or Format-Preserving Encryption Native App, the data or the keys to decrypt the data are stored outside of Snowflake. When authorized users request access to the plain text, Snowflake and ALTR interact in real time to provide it. This operates at Snowflake scale with ALTR’s SaaS platform in the mix.
How Could ALTR Have Prevented Customer Data Exposure?
Customers should follow all recommended Snowflake security best practices for user accounts, such as multifactor authentication and network access limitations for user accounts. But sometimes that’s not enough.
In this case, we have a simple answer to the above question of how ALTR could have helped stop or limit the exposure.
1. Security teams were unable to see the data exfiltration in near real-time. They were limited to the default delays of up to four hours after the data had been stolen. Installing ALTR’s Database Activity Monitor into your Snowflake account and hooking up the output of ALTR’s real-time logs to your email, chat system or SIEM tool would have notified the business to investigate the user accounts immediately. “Why would someone from outside the country be accessing all our customer data at this time of night? We should investigate right away.”
2. Cell-level protection, like ALTR’s FPE Native App, would have rendered the data access useless, as the accounts likely would not have been given access to the decryption keys. ALTR’s FPE Native App is format-preserving and deterministic – meaning an email will still look like an email, and the protected values remain operational downstream without your users needing to decrypt them into plaintext. As the bad actors ran SELECT statements over the data, they would have received encrypted data back without the encryption key, rendering the exfiltrated data useless. This is why two-party data security systems are so widely used: they work.
3. In the case where the impacted accounts did gain access to the decryption key by compromising a user with elevated permissions, ALTR’s Thresholds could have been configured to do two things: alert in real time when, for example, more than 100,000 rows were accessed by a single user within an hour, and then cut off access to that same data after 500,000 rows were accessed by a single user. The user would still be authenticated to Snowflake and allowed to access the table, but no data would come out. That impacted user would then be in the ‘penalty box,’ unable to decrypt further information.
Active Security is the best way to ensure sensitive data is safe in Snowflake no matter what happens. Active Security can detect and stop a breach, not just notify you. All three Active Security features are in GA and running in production across many Snowflake user accounts today. Our product and team focus on one thing only: safeguarding sensitive data.
We're thrilled to announce that ALTR's Snowflake native app, Format-Preserving Encryption (FPE), is now available on the Snowflake Marketplace. This marks a significant step forward in our mission to make data security seamless, efficient, and scalable for our customers. Let's dive into what this means for you.
What is Format-Preserving Encryption (FPE)?
Imagine encrypting your sensitive data without altering its original structure or format. That's precisely what FPE does. It transforms plaintext data into ciphertext while keeping the same format. For example, a phone number like "(800) 416-4710" might be encrypted as "(201) 867-5309." This means your applications and systems can continue operating smoothly without needing changes to handle encrypted data.
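To see why format preservation matters, here is a toy sketch of the idea. This is not real FPE (production FPE uses standardized, invertible ciphers such as NIST's FF1, and this toy transform is not decryptable); the key and function names are invented. It only illustrates the shape of the guarantee: digits map to digits, punctuation stays put, and the output is deterministic.

```python
import hashlib
import hmac

KEY = b"demo-key"  # hypothetical key, for the sketch only

def fpe_like(value: str) -> str:
    """Toy format-preserving transform: each digit becomes another digit,
    chosen deterministically by position and value; all other characters
    (parentheses, spaces, dashes) pass through untouched."""
    out = []
    for i, ch in enumerate(value):
        if ch.isdigit():
            h = hmac.new(KEY, f"{i}:{ch}".encode(), hashlib.sha256).digest()
            out.append(str((int(ch) + h[0]) % 10))  # a digit stays a digit
        else:
            out.append(ch)  # formatting characters are preserved as-is
    return "".join(out)

cipher = fpe_like("(800) 416-4710")
assert len(cipher) == len("(800) 416-4710")      # same length and shape
assert cipher[0] == "(" and cipher[5] == " "     # punctuation preserved
assert fpe_like("(800) 416-4710") == cipher      # deterministic
```

Because the ciphertext still looks like a phone number, downstream validation rules, column types, and application code all keep working unchanged.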
Why is This a Big Deal?
Traditionally, encrypting data involved a lot of headaches. On-premises systems were expensive, costing millions of dollars per license, and they introduced significant lag because of the back-and-forth calls between Snowflake and the on-premises servers. This not only slowed down your queries but also burned a hole in your pocket with monthly costs.
With ALTR's Snowflake Native FPE, all the encryption and decryption happen locally within Snowflake. No more external calls, no more lag—just fast, secure data processing. Plus, your data stays protected at rest within the Snowflake Data Cloud, ensuring it's always secure.
How Does Snowpark Make This Possible?
Snowpark, Snowflake's developer framework, provides the perfect environment for our FPE solution. It supports fully functional applications, enabling us to deliver powerful encryption directly in Snowflake. This means you get top-notch data protection without compromising performance or ease of use.
Why Should You Care About ALTR's FPE on Snowflake?
Here's why this is excellent news for you:
Simplified Data Protection
ALTR's FPE integrates seamlessly with our existing data access control and security solutions. This means you can easily implement and manage comprehensive data security through our SaaS platform, no-code interface, and automated policy enforcement.
Cost Savings and Efficiency
You save millions in licensing fees and monthly operational costs by eliminating the need for on-premises appliances. Plus, faster query response times make your data operations more efficient.
Future-Proof Security
FPE ensures that your sensitive data is always protected, even as you scale and evolve your data ecosystem. It's particularly beneficial for industries like financial services and healthcare, where maintaining data interoperability with legacy systems is crucial.
What Do Our Customers Think?
"ALTR's FPE offering running natively in our Snowflake environment proved to be far more effective, scalable, and affordable than the legacy solutions we considered. Further, with ALTR's cloud-native, SaaS architecture, we could extend FPE upstream into our data pipeline, expanding our compliance footprint to include a staging area prior to workloads landing in Snowflake."
Craig Hipwell, Customer Platforms Delivery Manager, Shell Energy
Get Started Today
With ALTR's FPE now available on the Snowflake Marketplace, you have all the tools you need to protect your data efficiently, effectively and at scale. It's time to take your data security to the next level.
Explore our FPE solution on the Snowflake Marketplace and see how easy it can be to keep your data safe while maintaining top performance.
Q&A with Ed Hand
1. Please share a bit about your background
I’ve spent the last two decades in enterprise software sales, where I’ve had the privilege of building and leading high-performance sales and marketing teams. My career has taken me from established, large-scale organizations to dynamic, ground-zero startups. Throughout this journey, I’ve successfully brought together all facets of Go-To-Market strategies under a cohesive team structure. My expertise lies in navigating complex ecosystems such as ServiceNow and Snowflake, where I’ve developed comprehensive market strategies that drive growth and success. I’ve consistently focused on aligning sales initiatives with broader business goals, ensuring sustainable revenue streams and long-term customer relationships.
2. What motivated you to join ALTR?
Several factors influenced my decision to join ALTR. First and foremost, I was thoroughly impressed by the ALTR team. Their deep understanding of current data security challenges and their forward-thinking approach to simplifying and scaling data security stood out to me. Additionally, the opportunity to be at the forefront of the cloud data revolution is incredibly exciting. As businesses increasingly migrate their critical data to the cloud, they adopt advanced technologies like machine learning and artificial intelligence to gain competitive advantages. ALTR's dedication to helping clients balance this technological innovation with robust cloud data security makes it an inspiring endeavour to be part of.
3. What is your vision for ALTR, and how do you see the company evolving under your leadership?
My vision for ALTR is to establish us as the de facto standard for cloud data security. This involves offering a robust, rock-solid platform and leading the industry with unmatched security expertise. Under my leadership, I aim to drive the company past critical growth milestones typical for a thriving SaaS enterprise. This includes expanding our market reach, continuously innovating our product offerings, and maintaining a relentless focus on customer satisfaction. I foresee ALTR evolving into a cornerstone of data security, trusted by organizations worldwide to protect their most valuable asset: their data.
4. How have you seen the data security and governance landscape change throughout your career, and where do you think it is headed in the next five years?
Over the years, data security and governance have undergone significant transformations. The landscape has evolved from simple perimeter defenses to sophisticated, multi-layered security strategies. Despite these advancements, cybercriminals continue to outpace many enterprises due to the high stakes involved. The fundamental principles of data security – knowing where your data is, who has access to it, and ensuring its protection – remain unchanged. However, in the next five years, the challenge will lie in meeting these requirements at the scale and speed of the cloud.
5. From your perspective, what makes ALTR the best solution for organizations looking to enhance their data security and governance practices?
Building a successful software company hinges on four key pillars:
Product: It all starts with having a viable and innovative product that addresses real market needs. ALTR excels here with its scalable data security solutions tailored for the cloud era.
Market: Identifying and targeting an addressable market is crucial. The demand for robust data security and governance solutions grows exponentially as more businesses move to the cloud.
Defense: Defending your market position against competitors is essential. ALTR’s advanced technology, coupled with its deep industry expertise, provides a formidable defense.
Team: The most critical element is having a team that can build and execute together effectively. At ALTR, we have a dedicated, talented team committed to our mission of simplifying data security and data access governance.
These pillars make ALTR an exceptional solution for organizations seeking to enhance their data security and governance and make it an attractive place for top talent in the industry. By joining ALTR, professionals can work on the frontlines of data security innovation, contributing to solutions that make a real difference in today’s digital landscape.
Connect with Ed on LinkedIn
For data engineers, there's a comforting hum in the familiar, a primal urge to build things ourselves. "DIY is better," whispers the voice in our heads. But when it comes to data masking in Snowflake, is building policies from scratch the best use of our time?
Sure, the initial build of a masking policy might be a quick win. You get that rush of creation, the satisfaction of crafting something bespoke. But here's the harsh reality: that initial high fades fast. Masking policies are rarely static. Data evolves, regulations shift, and suddenly, your DIY masterpiece needs an overhaul.
This is where the true cost of the "DIY is better" mentality becomes apparent. Let's delve into the hidden complexities that lurk beneath the surface of Snowflake's manual masking policies.
The Version Control Vortex
Ah, version control. The unsung hero of software development. But when it comes to DIY masking policies, it can be a tangled mess. Every change, every tweak you make, needs to be meticulously documented and tracked. One wrong move, and you could be staring down the barrel of a data breach caused by an outdated policy.
Imagine the chaos if multiple engineers are working on the same masking logic. How do you ensure everyone is on the same page? How do you revert to a previous version if something goes wrong? While Snowflake recently announced a Private Preview for version control via Git, with a purpose-built UI like ALTR, version control is baked in and highly user-friendly. There is no need for complex terminal commands – just intuitive clicks and menus. Changes are tracked, history is preserved, and rollbacks are a breeze.
The Snowflake Object Management Maze
Snowflake offers a seemingly endless buffet of objects – a staggering 74 and counting, with new additions continually emerging. However, managing these objects poses a central challenge within the Snowflake ecosystem.
For instance, while masking policies reside within schemas, their impact extends far beyond. A single masking policy can be applied to tables and columns across numerous schemas within your Snowflake account.
This creates a masking policy headache. Choosing the correct schema for each policy is crucial, as poor placement leads to confusion and complex updates. Furthermore, meticulous documentation is essential to track policy location and impact. Without it, any changes or troubleshooting become a nightmare due to the potential for widespread, unforeseen consequences across your Snowflake environment.
With ALTR, you do not have to consider object management when creating masking policies. With our unified interface, you can easily create, edit, and deploy policies automatically in seconds, eliminating the need to navigate the intricate web of Snowflake objects and their relationships.
The Update and Maintenance Monster
Data masking policies are living documents. As your data landscape changes, so too should your masking logic. New regulations might demand a shift in how you mask specific fields. A data breach requires you to tighten masking rules.
With DIY policies, every update becomes a time-consuming ordeal. You must identify the relevant policy, modify the logic, test it thoroughly, and then deploy the changes across all affected Snowflake objects. Multiply that process by the number of policies you have, and you've just booked a one-way ticket to Update City – population: you, stressed and overworked.
ALTR simplifies this process. Its intuitive UI allows for quick and easy changes to policies. Updates can be deployed across all relevant objects with a single click, eliminating the need for manual deployment across potentially hundreds of locations.
The Validation Vortex
Let's not forget the critical step of validation. Every change you make to a masking policy must be rigorously tested to ensure it functions as intended. This involves creating test data, applying the new masking logic, and verifying that the sensitive data is adequately protected.
Imagine manually validating dozens of masking policies across hundreds or thousands of tables and columns. It's a daunting task, and relying solely on automated pipelines for testing adds another layer of complexity that needs ongoing maintenance. It's enough to make any data engineer break out in a cold sweat.
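The validation loop described above can be sketched as a tiny automated harness. Everything here is hypothetical (the toy policy, role names, and test values are invented for illustration); the point is simply that each policy change should be checked in both directions, for leaks and for over-masking, before deployment.

```python
# Hypothetical masking-policy validation harness: run known sensitive
# test values through the masking logic and assert correct behavior
# for both privileged and unprivileged roles.

def mask_email(role: str, email: str) -> str:
    """Toy policy under test: only COMPLIANCE sees plaintext emails."""
    return email if role == "COMPLIANCE" else "****@****"

TEST_EMAILS = ["alice@example.com", "bob@example.com"]  # synthetic test data

def validate_policy() -> bool:
    for email in TEST_EMAILS:
        if mask_email("ANALYST", email) == email:        # leak: masking failed
            return False
        if mask_email("COMPLIANCE", email) != email:     # over-masking: data unusable
            return False
    return True

assert validate_policy()  # run on every policy change, before deployment
```

Even this trivial harness has to be maintained per policy and per role; multiplied across dozens of policies and thousands of columns, that maintenance burden is exactly the vortex the section describes.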
Beyond Time Saving: The Bigger Picture
The benefits of ditching DIY masking policies extend far beyond just saving time. It's about empowerment. With ALTR's easy-to-use UI, even non-technical users can create and edit masking policies. This frees up valuable engineering time, allowing you to focus on more strategic initiatives. It also fosters a culture of data ownership and responsibility, where everyone involved understands the importance of data security.
Let's face it: the "DIY is better" mentality can be a trap in data masking. It might seem like a quick win initially, but the long-term costs – time, complexity, and risk – are too high. Embrace the power of purpose-built tools like ALTR. Free your engineering time, empower your team, and ensure your data is masked effectively and efficiently.
Ready to ditch the DIY trap? Schedule an ALTR demo.
Snowflake Arctic and the Future of AI Governance
If you’re reading this, then it’s certain you saw the news about Snowflake’s Arctic model launch. Machine learning and AI are the next natural step for the Snowflake Data Cloud – not only because they’re a hot trend, but because the Snowflake story naturally leads you to AI. What makes machine learning better? Lots of data. Where are you putting more and more of your data? Snowflake. Of course, there’s no such thing as a free lunch. While your data scientists, developers, and all the other Snowflake enthusiasts in your orbit are rushing to see how they can start leveraging Arctic (and there are already ways popping out of the Snowflake teams as well), maybe you’re here because you have accountability for your organization’s data. You may have one very important question: how is Arctic going to affect my governance and security stance? We’re here to answer that question, and the answer is mostly good – if you do the right things right now.
The TLDR on this is simple. Arctic is like everything else that runs in the Snowflake Data Cloud. Nothing in Snowflake escapes the watchful eye of Snowflake governance policies. Nothing in Snowflake can skip past the network controls, security checks, encryption, or RBAC (Role-Based Access Control). The simplest way to understand this is that all of Arctic's power is exposed through a set of simple, built-in Snowflake functions. You only have permission to use the AI features if you have permission to use those functions, and you can only feed in data that you are already allowed to access. Simple, right? End of story, right? If that were the end, that would also be the end of this post – honestly, I probably wouldn't have bothered to write it.
While it’s true that AI access is limited to the Cortex functions and that people will only be able to bring the data they already have access to into those functions, when you combine AI and the huge wells of data that Snowflake tends to have things may get weird. It’s not unusual for people (or services) to be over-provisioned. Just yesterday we were on the line with a prospect who was shocked to see ALTR’s real-time auditing picking up dozens of jobs running under the Snowflake SYSADMIN role. These queries running with too much privilege happened because lots of folks were granted this role through nesting to make it easier for them to get some data that had been put in a database that it probably shouldn’t have been in, and it was easier to grant the role than move the data. (This sort of security gap is exactly why this company is looking at ALTR in the first place!) With that SYSADMIN role, those users could have accessed tons of stuff they weren't supposed to. They didn’t (we know that because ALTR’s auditing would have caught them), but since they had the access, they could have. Humans tend to only query data they know they have access to. But what happens when AI takes the wheel?
Right now, the impact AI's power can have in Snowflake is limited. But just as a model like Snowflake's Arctic was the next natural step in the Snowflake story, there are more natural steps we can imagine. People are going to throw all the data they have at this thing in pursuit of amazing results. What happens when they have access to data they shouldn't? What happens when they should have access to a table, but some of its columns hold sensitive information that needs advanced data protection before it's usable in the context of Cortex, Arctic, and AI in general? The machines won't use the same approaches humans will (and vice versa). That's why humans and AIs make such an effective team when things go right. But it also means these LLMs won't limit themselves to only what they know. They will crawl through every scrap of data they can reach, trying to find the right answer and earn the good feedback we've programmed them to seek. What happens when that machine is mistakenly given the SYSADMIN role, like the humans were? And, of course, people are going to build fully automated systems where AI-powered machines run around the clock, pushing these boundaries. Humans sleep, take time off, and eat a meal every now and then. What happens when your governance and security must be on watch 24/7 because they're contending with machines that never step away?
The good news is that we're only standing on the tip of this iceberg (pun intended). Most of this stuff is still a little way off. But as with everything else related to AI, it's going to move fast. So now more than ever, it's crucial that security and governance be integrated into data and development pipelines and CI/CD approaches, and automated as much as possible. Snowflake has all the controls you need to prevent the bad stuff from happening, but you need to use them effectively and automatically. The sensitive information in your data needs special attention more than ever in an AI-powered world. In that conversation yesterday, the customer asked about the new Arctic capabilities and how ALTR could address them even though they dropped just this month. The answer is simple: ALTR has been in the proactive security business since the start. Because Snowflake did the right thing by building security directly into the Arctic and AI design, it's just another thing ALTR can help you lock down as you roll it out. It all fits together perfectly. The next natural step in that company's story – and maybe in yours – is deciding to let us help out. We're ready for AI when you are.
The data deluge is relentless. Organizations are swimming in an ever-growing sea of information, struggling to keep their heads above water. With its rigid processes and bureaucratic burdens, traditional data governance often feels like a leaky life raft – inadequate for navigating the dynamic currents of the modern data landscape.
Enter agile data governance, the data governance equivalent of a high-performance catamaran, swift and adaptable, ready to tackle any challenge the data ocean throws its way.
What is Agile Data Governance?
Traditional data governance often operates in silos, with lengthy planning cycles and a one-size-fits-all approach. Agile data governance throws this rigidity overboard. It's a modern, flexible methodology that views data governance as a collaborative, iterative process.
Here's the critical distinction: While traditional data governance focuses on control, agile data governance emphasizes empowerment. It fosters a data-savvy workforce, breaks down silos, and prioritizes continuous improvement to ensure data governance practices remain relevant and impactful.
The Seven Pillars of Agile Data Governance
Collaboration
Gone are the days of data governance operating in isolation. Agile fosters a spirit of teamwork, breaking down silos and bringing together data owners, analysts, business users, and IT professionals. Everyone plays a role in shaping data governance practices, ensuring they are relevant and meet real-world needs.
Iterative Approach
Forget lengthy upfront planning that quickly becomes outdated in the face of evolving data needs. Agile embraces a "test and learn" mentality, favoring iterative cycles. Processes are continuously refined based on ongoing feedback, data insights, and changing business priorities.
Flexibility
The data landscape is a living, breathing entity, constantly shifting and evolving. Agile data governance recognizes this reality. It's designed to bend and adapt, adjusting sails (figuratively) to navigate new regulations, integrate novel data sources, or align with evolving business strategies.
Empowerment
Agile data governance is not about control; it's about empowerment. It fosters a data-savvy workforce by prioritizing training programs that equip employees across the organization with the skills to understand, use, and govern data responsibly. Business users become active participants, not passive consumers, of data insights.
Continuous Improvement
Agile data governance thrives on a culture of constant improvement. Regular assessments evaluate the effectiveness of data governance practices, identifying areas for refinement and ensuring that the program remains relevant and impactful.
Automation
Repetitive, mundane tasks are automated wherever possible. This frees up valuable human resources for higher-value activities like data quality analysis, user training, and strategic planning. Data classification, access control management, and dynamic data masking are prime candidates for automation.
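To see why classification is such a natural automation target, consider a toy pattern-based scanner that tags obvious PII without human review (the pattern set here is made up and minimal; production classifiers combine patterns with dictionaries, checksums, and ML-based detection):

```python
import re

# Hypothetical, minimal pattern set for illustration only.
PATTERNS = {
    "ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "email": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
}

def classify(value: str) -> set[str]:
    """Return the set of sensitivity tags whose pattern matches the value."""
    return {tag for tag, rx in PATTERNS.items() if rx.search(value)}

print(classify("reach me at jane@example.com"))  # {'email'}
print(classify("SSN on file: 123-45-6789"))      # {'ssn'}
print(classify("nothing sensitive here"))        # set()
```

Because the rules are deterministic, a scanner like this can run on every new column as it lands, with humans reviewing only the exceptions.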
Metrics and Measurement
Agile thrives on data-driven decision-making. Metrics and measurement are woven into the fabric of the program. Key performance indicators (KPIs) track the effectiveness of data governance initiatives, providing valuable insights to guide continuous improvement efforts. These metrics can encompass data quality measures, access control compliance rates, user satisfaction levels with data discoverability, and the impact of data insights on business outcomes.
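For instance, an access-control compliance rate can be computed directly from audit records. This sketch assumes a simplified record shape (field names are invented; real audit events come from your governance platform):

```python
# Hypothetical audit records: each access event is marked authorized or not.
audit_log = [
    {"user": "ana", "resource": "orders", "authorized": True},
    {"user": "bob", "resource": "orders", "authorized": True},
    {"user": "bob", "resource": "salaries", "authorized": False},
    {"user": "cam", "resource": "orders", "authorized": True},
]

def compliance_rate(records: list[dict]) -> float:
    """Share of access events that were properly authorized."""
    if not records:
        return 1.0  # vacuously compliant when there is nothing to audit
    ok = sum(1 for r in records if r["authorized"])
    return ok / len(records)

print(f"{compliance_rate(audit_log):.0%}")  # 75%
```

Tracked over time, a simple ratio like this turns "are we compliant?" from a gut feeling into a trend line.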
Why Agile Data Governance is Critical in 2024
The data landscape in 2024 is a rapidly evolving ecosystem. Here's why agile data governance is no longer optional but a strategic imperative:
The Ever-Shifting Regulatory Landscape
Regulatory environments are becoming more dynamic than ever. Agile data governance allows organizations to adapt their practices swiftly to ensure continuous compliance with evolving regulations like data privacy laws (GDPR, CCPA) and industry-specific regulations.
Unlocking the Potential of AI
Artificial intelligence (AI) is transforming decision-making across industries. Agile data governance ensures high-quality data feeds reliable AI models. The focus on clear data lineage and ownership within agile data governance aligns perfectly with the growing need for explainable AI.
Democratizing Data for a Data-Driven Culture
Agile data governance empowers business users to access, understand, and utilize data for informed decision-making. This fosters a data-driven culture where valuable insights are readily available to those who need them most, driving innovation and improving business outcomes.
Optimizing for Efficiency and Agility
The iterative approach and automation focus of agile data governance streamline processes and free up valuable resources for higher-value activities. This allows organizations to navigate the complexities of the data landscape with efficiency.
Is Your Data Governance Agile? Ask Yourself These 10 Questions
Are your current data governance practices keeping pace with the ever-changing data landscape? Here are ten questions to assess your organization's agility:
1. Do different departments (IT, business users, data owners) collaborate to define and implement data governance practices?
2. Can your data governance processes adapt to accommodate new data sources, changing regulations, and evolving business needs?
3. Are business users encouraged to access and utilize data for decision-making?
4. Do you regularly evaluate the effectiveness of your data governance program and make adjustments as needed?
5. Are repetitive tasks like data lineage tracking and access control automated?
6. Do you track key metrics to measure the success of your data governance program?
7. Do you utilize an iterative approach with short planning, implementation, and improvement cycles?
8. Does your organization prioritize training programs to equip employees with data analysis and interpretation skills?
9. Are data governance policies and procedures clear, concise, and accessible to all relevant stakeholders?
10. Do business users feel confident finding and understanding the data they need to make informed decisions?
By honestly answering these questions, you can gain valuable insights into the agility of your data governance program. If your answers reveal a rigid, one-size-fits-all approach, it might be time to embrace the transformative power of agile data governance.
Wrapping Up
Agile data governance is not just a trendy buzzword; it's a critical approach for organizations in 2024 and beyond. By embracing its principles and building a flexible framework, organizations can transform their data from a burden into a powerful asset, propelling them toward a successful data-driven future.
Our customers are confused. Given the state of the world, it’s safe to say everyone is a little confused now. The confusion we’re concerned with today is about the markets ALTR plays in and how the analysts of the world – particularly Gartner – are breaking those down and making recommendations. What we’ll aim to do here is analyze the analysis. We’ll lay out the questions customers are asking about the markets and solutions for Data Security Posture Management (DSPM) and Data Security Platform (DSP), see what Gartner is saying about those today, offer some reasons why we think they are right, and finally show why the confusion is real.
Maybe that seems like a contradictory stance to take, but let’s not forget what F. Scott Fitzgerald told us: “The test of a first-rate intelligence is the ability to hold two opposing ideas in mind at the same time and still retain the ability to function.” By the end of this post, it should be clear that Gartner and others have only correctly identified a confusing time in data governance and security; they have not made things any more confusing.
Let’s start out where customers have told us they get confused. We’ll go right to the source and quote from Gartner’s own public statements on DSPM and DSP. First, let’s look at how they define Data Security Posture Management:
Data security posture management (DSPM) provides visibility as to where sensitive data is, who has access to that data, how it has been used, and what the security posture of the data stored, or application is.
(Source: https://www.gartner.com/reviews/market/data-security-posture-management as of March 26th, 2024)
We could pick that apart right away, but instead let’s immediately compare it with their definition of a Data Security Platform:
Data security platforms (DSPs) combine data discovery, policy definition and policy enforcement across data silos. Policy enforcement capabilities include format-preserving encryption, tokenization and dynamic data masking.
(Source: https://www.gartner.com/reviews/market/data-security-platforms as of March 26th, 2024)
At first glance, these seem incredibly similar – and they are. However, there are important differences in the definitions’ text, in their implied targets, and in the implications of these factors. The easiest place to see a distinction is in the second part of the DSP definition: “policy definition and policy enforcement." The Data Security Platform does not only look at the “Posture” of that system. It is going to deliver a security solution for the data systems where it’s applied.
When we talk to customers about this, they often point out two details. First, they say that if the DSP can't discover at least the existing policy of the data systems, it isn't much good that it gives you ways to manage the protection. The subtlety here is that controlling data policy implies the solution must discover the current policy in order to control it going forward. (While it's possible some solution could give you policy control without policy discovery, ALTR gives you all of those capabilities, so we don't have to worry about that.) The second thing they point out is that many of the vendors in the DSPM category also supply “policy definition and policy enforcement” in some way. That brings us to the targets of these systems.
Something you will note as a common thread among DSPM systems is how incredibly broad their support is for target platforms. They tend to support everything from on-prem storage systems all the way through cloud platforms doing AI and analytics, like Snowflake. The trick that lets them do this is that they are not concerned with actual enforcement across that broad range, and that's appropriate. Many of the systems they target, especially those on-prem, have complicated mechanisms of their own for policy definition and enforcement. Whether that's something like Active Directory for unstructured data stored on disk or major platforms like SAP's built-in security management capabilities, they are not looking for outside systems to get involved. However, visibility into the permissions and access people use across that broad scope can be very important. Seeing the posture of these systems is the point of the DSPM.
Of course, a subset of target systems will let the DSPM make changes easily without going too deep. If it's a simple API call or a single group-membership change, the DSPM can likely do it. However, in systems with especially complex policies, those simple, single API calls are no longer enough; this is where the “policy definition and policy enforcement” in the Data Security Platform definition comes in. The DSP gets deep within the systems it targets. Often, part of the core value of a DSP is that it simplifies extremely complicated policy engines and gives you ways to plug policy definition steps into the larger scope of systems building or the SDLC. That focus and depth on the actual controls in targeted systems is the main difference between DSPM and DSP. The Data Security Platform narrows the scope, but it deepens the capabilities to control policies and to deliver security and governance results.
The other important aspect of the distinction between these solutions is the Data Security Platform capabilities for Data Protection. That’s the “format-preserving encryption, tokenization and dynamic data masking” part of the DSP definition. Many data systems will have built-in solutions for data masking. Almost none will have built-in tokenization or format-preserving encryption (FPE). If these capabilities are crucial to delivering the data products and solutions an organization needs, then DSP is where they will look for solutions. This not only impacts data use in production settings, but often is associated with development and testing use cases where use of sensitive information is forbidden but use of realistic data is required.
Let's recognize the elephant in the analysis: DSPM and DSP are going to overlap. If you've been around long enough or have read deeply enough, that should be as shocking as the fact that (if you're in an English-speaking part of the world) the name of this day ends in “y.” Could the DSP forgo all the core capabilities of DSPM and just deliver the deeper policy and data protection features? If DSP vendors could be sure that every customer has a DSPM to integrate with, sure. That isn't always the case. Even if it were, it's not guaranteed that the politics and processes at an organization would make such integration possible even when it is technically possible. Could DSPM simply expand to cover all the depth of DSP, including the Data Protection features? The crucial word in there is “simply.” If it were simple, they would have done it already.
You will surely see the market consolidate over time, with players merging, expanding, and being acquired to form suites. Right now, organizations have real-world challenges, and they need solutions despite the overlaps. So DSPM and DSP will remain independent until market forces make it necessary for them to change.
The overlaps, the similar goals, and the limits of language in describing Data Security Posture Management and Data Security Platforms are the source of the confusion. Hopefully, it's now clear that DSP is the deeper solution, giving you everything you need to solve problems all the way down to Data Protection. DSPM will continue to add platforms and grow horizontally. DSP will continue to dive deeply into the platforms it supports today and cautiously add new ones as the market requires. If you started this post a little mad at the Gartners of the world, maybe you now see why they are right to give you two different markets with so much in common. As with many things in life, if you are confused, it only means you are sane and paying attention. You keep paying attention, and we'll keep helping you stay sane.
Data privacy laws are not just a legal hurdle – they're the key to building trust with your customers and avoiding a PR nightmare. The US, however, doesn't have one single, unified rulebook. It's more like a labyrinth – complex and ever-changing.
Don't worry; we've got your back. This guide will be your compass, helping you navigate the key federal regulations and state-level laws that are critical for compliance in 2024.
The Compliance Challenge: Why It Matters
Data breaches are costly and damaging. But even worse is losing the trust of your customers. Strong data privacy practices demonstrate your commitment to safeguarding their information, a surefire way to build loyalty in a world where privacy concerns are at an all-time high.
Think of it this way: complying with data privacy laws isn't just about checking boxes. It's about putting your customers first and building a solid foundation for your business in the digital age.
US Data Privacy Laws: A Multi-Layered Maze
The US regulatory landscape is an intricate web of federal statutes and state-specific legislation. Here's a breakdown of some of the key players:
Federal Protections
These laws set the baseline for data privacy across the country.
Privacy Act of 1974 restricts how federal agencies can collect, use, and disclose personal information. It grants individuals the right to access and amend their records held by federal agencies.
Health Insurance Portability and Accountability Act (HIPAA) (1996) sets national standards for protecting individuals' medical records and other health information. It applies to healthcare providers, health plans, and healthcare clearinghouses.
Gramm-Leach-Bliley Act (GLBA) (1999): Also known as the Financial Services Modernization Act, GLBA safeguards the privacy of your financial information. Financial institutions must disclose their information-sharing practices and implement safeguards for sensitive data.
Children's Online Privacy Protection Act (COPPA) (1998) protects the privacy of children under 13 by regulating the online collection of personal information from them. Websites and online services must obtain verifiable parental consent before collecting, using, or disclosing personal information from a child under 13.
Driver's Privacy Protection Act (DPPA) (1994) restricts the disclosure and use of personal information obtained from state motor vehicle records. It limits the use of this information for specific purposes, such as law enforcement activities or vehicle safety recalls.
Video Privacy Protection Act (VPPA) (1988) prohibits the disclosure of individuals' video rental or sale records without their consent. This law aims to safeguard people's viewing habits and protect their privacy.
The Cable Communications Policy Act of 1984 includes provisions for protecting cable television subscribers' privacy. It restricts the disclosure of personally identifiable information without authorization.
Fair Credit Reporting Act (FCRA) (1970) regulates consumer credit information collection, dissemination, and use. It ensures fairness, accuracy, and privacy in credit reporting by giving consumers the right to access and dispute their credit reports.
Telephone Consumer Protection Act (TCPA) (1991) combats unwanted calls by imposing restrictions on unsolicited telemarketing calls, automated dialing systems, and text messages sent to mobile phones without consent.
Controlling the Assault of Non-Solicited Pornography and Marketing Act of 2003 (CAN-SPAM Act) establishes rules for commercial email, requiring senders to provide opt-out mechanisms and identify their messages as advertisements.
Family Educational Rights and Privacy Act (FERPA) (1974) protects the privacy of students' educational records. It grants students and their parents the right to inspect and amend these records while restricting their disclosure without consent.
State-Level Action
Many states are taking matters into their own hands with comprehensive data privacy laws. California, Virginia, and Colorado are leading the charge, with more states following suit. These laws often grant consumers rights to access, delete, and opt out of the sale of their personal information. Here are some of the critical state laws to consider:
California Consumer Privacy Act (CCPA) (2018) was a landmark piece of legislation establishing a new baseline for consumer data privacy rights in the US. It grants California residents the right to:
- Know what personal information is being collected about them.
- Know whether their personal information is sold or disclosed and to whom.
- Say no to the sale of their personal information.
- Access their data.
- Request a business to delete any personal information about them.
- Not be discriminated against for exercising their privacy rights.
Colorado Privacy Act (2021): Similar to the CCPA, it provides consumers with rights to manage their data and imposes obligations on businesses for data protection.
Connecticut Personal Data Privacy and Online Monitoring Act (2022) specifies consumer rights regarding personal data, online monitoring, and data privacy.
Delaware Personal Data Privacy Act (2023) outlines consumer rights and requirements for personal data protection.
Florida Digital Bill of Rights (2023) focuses on entities generating significant revenue from online advertising, outlining consumer privacy rights.
Indiana Consumer Data Protection Act (2023) details consumer rights and requirements for data protection.
Iowa Consumer Data Protection Act (2023) describes consumer rights and requirements for data protection.
Montana Consumer Data Privacy Act (2023) applies to entities conducting business in Montana, outlining consumer data protection requirements.
New Hampshire Privacy Act (2024): This act applies to entities conducting business in New Hampshire, outlining consumer data protection requirements.
New Jersey Data Privacy Act (2024): This act applies to entities conducting business in New Jersey, outlining consumer data protection requirements.
Oregon Consumer Privacy Act (2023): This act details consumer rights and rules for data protection.
Tennessee Information Protection Act (2023) governs data protection and breach reporting.
Texas Data Privacy and Security Act (2023) describes consumer rights and data protection requirements for businesses.
Utah Consumer Privacy Act (2022) provides consumer rights and emphasizes data protection assessments and security measures.
Virginia Consumer Data Protection Act (2021) grants consumers rights to access, correct, delete, and opt out of their data processing.
Beyond US Borders: The Global Reach of Data Privacy
Data doesn't respect borders. The EU's General Data Protection Regulation (GDPR) is a robust international regulation that applies to any organization handling the data of EU residents. Understanding the GDPR's requirements for consent, data security, and data subject rights is essential for businesses operating globally.
Your Path to Compliance
Conquering the data privacy maze requires vigilance and a proactive approach. Here are some critical steps:
Map the Maze
Identify which federal and state laws apply to your business and understand their specific requirements. Conduct a comprehensive data inventory to understand what personal information you collect, store, and use.
Empower Your Customers
Develop clear and concise data privacy policies that outline your data collection practices and how you safeguard information. Make these policies readily available to your customers.
Embrace Transparency
Give your customers control over their data by providing mechanisms to access, delete, and opt out of data sharing. Be upfront about how you use their data and respect their choices.
Invest in Security Measures
Implement robust security measures to protect customer data from unauthorized access, disclosure, or destruction.
Stay Agile
The data privacy landscape is constantly evolving. Regularly review and update your policies and procedures to comply with emerging regulations. Appoint a team within your organization to stay abreast of these changes.
Wrapping Up
The data privacy landscape is complex and constantly evolving, but it doesn't have to be overwhelming. By understanding the key regulations, taking a proactive approach, and building a culture of compliance, you can emerge as a more vital, trusted organization. In today's data-driven world, prioritizing data privacy isn't just good practice – it's essential for building lasting customer relationships and achieving long-term success.
Data has undeniably become the new gold in the swiftly evolving digital transformation landscape. Organizations across the globe are mining this precious resource, aiming to extract actionable insights that can drive innovation, enhance customer experiences, and sharpen competitive edges. However, the journey to unlock the true value of data is fraught with challenges, often likened to navigating a complex labyrinth where every turn could lead to new discoveries or unforeseen obstacles. This journey necessitates a robust data infrastructure, a skilled ensemble of data engineers, analysts, and scientists, and a meticulous data consumption management process. Yet, as data operations teams forge ahead, making strides in harnessing the power of data, they frequently encounter a paradoxical scenario: the more progress they make, the more the demand for data escalates, leading to a cycle of growth pains and inefficiencies.
The Bottleneck: Data Governance as a Time Sink
One of the most significant bottlenecks in this cycle is the considerable amount of time and resources devoted to data governance tasks. Traditionally, data control and protection responsibility has been shouldered by data engineers, data architects and Database Administrators (DBAs). On the surface, this seems logical – these individuals maneuver data from one repository to another and possess the necessary expertise in SQL coding, a skill most tools require to grant and restrict access. But is this alignment of responsibilities the most efficient use of their time and talents?
The answer, increasingly, is no.
While data engineers, DBAs, and data architects are undoubtedly skilled, their real value lies in their ability to design complex data pipelines, craft intricate algorithms, and build sophisticated data models. Relegating them to mundane data governance tasks underutilizes their potential and diverts their focus from activities that could yield far greater strategic value.
Imagine the scenario: A data scientist, brimming with the potential to unlock groundbreaking customer insights through advanced machine learning techniques, finds themselves bogged down in the mire of access control requests, data masking procedures, and security audit downloads.
This misallocation of expertise significantly hinders the ability of data teams to extract the true potential from the organization's data reserves.
The Solution: Embracing Data Governance Automation
Enter the paradigm shift: data governance automation. This transformative approach empowers organizations to delegate the routine tasks of data governance and security to dedicated teams equipped with no-code control and protection solutions.
Solutions like ALTR offer a platform that empowers data teams to quickly and easily check off complex data governance tasks, including:
Implementing data access policies
Leverage automated, tag-based column- and row-level access controls on PII/PHI/PCI data.
Dynamic data masking
Protect sensitive data with column- and row-based access policies and dynamic data masking, and scale policy creation with attribute-based and tag-based access control.
Generating audit trails
Maintain a comprehensive record of data access and usage patterns, facilitating security audits and regulatory compliance.
Activity monitoring
Receive real-time monitoring of data activity, detection of policy anomalies, and alerts and notifications.
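To make the masking idea concrete, here is a minimal sketch of tag-based dynamic masking. The column tags, roles, and masking value are all invented for the example; a platform like ALTR configures this behavior without code:

```python
# Columns carry sensitivity tags; roles map to the tags they may see in the clear.
COLUMN_TAGS = {"name": None, "email": "pii", "ssn": "pii"}
ROLE_CLEARANCE = {"admin": {"pii"}, "analyst": set()}

def mask_row(row: dict, role: str) -> dict:
    """Return the row with tagged columns masked unless the role is cleared."""
    cleared = ROLE_CLEARANCE.get(role, set())
    out = {}
    for col, value in row.items():
        tag = COLUMN_TAGS.get(col)
        out[col] = value if tag is None or tag in cleared else "****"
    return out

row = {"name": "Jane Doe", "email": "jane@example.com", "ssn": "123-45-6789"}
print(mask_row(row, "analyst"))  # email and ssn come back as "****"
print(mask_row(row, "admin"))    # full row in the clear
```

Because the policy keys off tags rather than individual columns, adding a new sensitive column only requires tagging it, not rewriting the policy.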
Freed from the shackles of routine data governance tasks, data teams can pivot towards more strategic and value-driven initiatives. Here are some of the compelling opportunities that could unfold:
Advanced Data Analytics and Insights Generation
With more time at their disposal, data teams can delve deeper into data, employing advanced analytics techniques and AI models to uncover previously elusive insights. This could lead to breakthrough innovations, more personalized customer experiences, and data-driven decision-making across the organization.
Data Democratization and Literacy Programs
Data teams can spearhead initiatives to democratize data access, enabling a broader base of users to engage with data directly. Organizations can cultivate a data-driven culture where insights fuel every department's decision-making processes by implementing intuitive, self-service analytics platforms and conducting data literacy workshops.
Data Infrastructure Optimization
Attention can be turned towards optimizing the data infrastructure for scalability, performance, and cost-efficiency. This includes adopting cloud-native services, containerization, and serverless architectures that can dynamically scale to meet the fluctuating demands of data workloads.
Innovative Data Products and Services
With the foundational tasks of data governance automated, data teams can focus on developing new data products and services. This could range from predictive analytics tools for internal use to data-driven applications that enhance customer engagement or open new revenue streams.
Collaborative Data Ecosystems
Finally, data teams could invest time in building collaborative ecosystems and forging partnerships with other organizations, academia, and open-source communities. These ecosystems can foster innovation, accelerate the adoption of best practices, and enhance the organization's capabilities through shared knowledge and resources.
Wrapping Up
Automating data governance tasks presents a golden opportunity for data teams to realign their focus toward activities that maximize the strategic value of data. By embracing this shift, organizations can alleviate the growing pains associated with data management and pave the way for a future where data becomes the linchpin of innovation, growth, and competitive advantage. The question then is not whether data teams should adopt data governance automation but how quickly they can do so to unlock their full potential.