
The anon extension

Protecting sensitive data in Postgres databases

The anon extension (PostgreSQL Anonymizer) provides data masking and anonymization capabilities for Postgres databases. It helps protect personally identifiable information (PII) and other sensitive data, facilitating compliance with regulations such as the GDPR.

note

This extension comes from the PostgreSQL Anonymizer open source project (postgresql_anonymizer). This is distinct from other tools such as pg_anon. The extension is installed using CREATE EXTENSION anon.

Looking for a practical guide?

For complete step-by-step workflows on anonymizing data in Neon branches, including manual procedures and GitHub Actions automation, see data anonymization.

Enable the extension

note

This extension is currently experimental and may change in future releases.

When using the Neon Console or API for anonymization workflows, the extension is enabled automatically. It can also be enabled manually using SQL commands.

Enable via SQL

When working with SQL-based workflows (such as using psql or other SQL clients), enable the anon extension in your Neon database by following these steps:

  1. Connect to your Neon database using either the Neon SQL Editor or an SQL client like psql

  2. Enable experimental extensions:

    SET neon.allow_unstable_extensions='true';

  3. Install the extension:

    CREATE EXTENSION IF NOT EXISTS anon;
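
To confirm the extension is installed, you can check the standard Postgres catalog:

    SELECT extname, extversion
    FROM pg_extension
    WHERE extname = 'anon';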

tip

When using the Neon Console or API to create branches, the extension is enabled automatically. See the data anonymization workflow guide for details.

Masking rules

Masking rules define which data to mask and how to mask it, using plain SQL. Rules are declared with SECURITY LABEL commands and stored in the database schema itself, which implements the privacy by design principle.
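
As a minimal sketch, assuming a hypothetical users table with first_name and phone columns, a masking rule attaches a masking function (or a static value) to a column:

    -- Hypothetical table and columns, shown for illustration only
    SECURITY LABEL FOR anon ON COLUMN users.first_name
      IS 'MASKED WITH FUNCTION anon.fake_first_name()';

    SECURITY LABEL FOR anon ON COLUMN users.phone
      IS 'MASKED WITH VALUE ''CONFIDENTIAL''';

Declaring a rule does not change any data by itself; the columns are only rewritten when a masking pass runs (see static masking below).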

Masking functions

PostgreSQL Anonymizer provides built-in functions for different anonymization requirements, including but not limited to:

| Function type | Description | Example |
| --- | --- | --- |
| Faking | Generate realistic data | anon.fake_first_name() and anon.lorem_ipsum() |
| Pseudonymization | Create consistent and reversible fake data | anon.pseudo_email(seed) |
| Randomization | Generate random values | anon.random_int_between(10, 100) and anon.random_in_enum(enum_column) |
| Partial scrambling | Hide portions of strings | anon.partial(ip_address, 8, 'XXX.XXX', 0) changes 192.168.1.100 to 192.168.XXX.XXX |
| Nullification | Replace with static values or NULL | MASKED WITH VALUE 'CONFIDENTIAL' |
| Noise addition | Alter numerical values while maintaining distribution | anon.noise(salary, 0.1) adds +/- 10% noise to the salary column |
| Generalization | Replace specific values with broader categories | anon.generalize_int4range(age, 10) changes 54 to [50,60) |
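
These are ordinary SQL functions, so you can call them directly in a session to preview their output. The values in the comments below are illustrative, and on some extension versions the faking functions require loading the fake data set first with anon.init():

    -- SELECT anon.init();  -- load fake data sets if your version requires it
    SELECT anon.fake_first_name();                          -- e.g. 'Sandra'
    SELECT anon.random_int_between(10, 100);                -- e.g. 42
    SELECT anon.partial('192.168.1.100', 8, 'XXX.XXX', 0);  -- 192.168.XXX.XXX
    SELECT anon.generalize_int4range(54, 10);               -- [50,60)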

Static masking

Static masking permanently modifies the original data in your tables. This approach is useful for creating anonymized copies of data when:

  • Migrating production data to development branches
  • Creating sanitized datasets for testing
  • Archiving data with sensitive information removed
  • Distributing data to third parties
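
A minimal static-masking sketch, assuming masking rules like the ones above have already been declared: anon.anonymize_database() rewrites every column that has a rule, while anon.anonymize_table() limits the pass to a single table.

    -- Load fake data sets first if your extension version requires it
    -- SELECT anon.init();

    -- Permanently rewrite all columns that have a masking rule
    SELECT anon.anonymize_database();

    -- Or restrict the pass to one table
    SELECT anon.anonymize_table('users');

Because the rewrite is permanent, run it only on branches or copies that are meant to hold anonymized data, never on the production branch.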

Branch operations and static masking

When using Neon's branch features with static masking:

  • Creating a child branch copies all data as-is from the parent
  • Resetting a branch from the parent replaces all branch data with the parent's current state
  • In both cases, any previous anonymization is lost and must be reapplied
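
A rough sketch of re-applying anonymization after a reset, assuming the parent branch does not carry the extension or the masking rules (the users rule is the hypothetical example from above):

    -- Re-enable the extension on the freshly reset branch
    SET neon.allow_unstable_extensions='true';
    CREATE EXTENSION IF NOT EXISTS anon;

    -- Re-declare the masking rules, then re-run the static masking pass
    SECURITY LABEL FOR anon ON COLUMN users.first_name
      IS 'MASKED WITH FUNCTION anon.fake_first_name()';
    SELECT anon.anonymize_database();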

Practical examples

For complete implementation examples showing how to apply these masking functions in real workflows, see the data anonymization guide, which covers:

  • Creating and anonymizing development branches
  • Applying different masking strategies to protect sensitive data
  • Automating anonymization with GitHub Actions
  • Best practices and safety tips

Limitations

  • Neon currently supports only static masking with this extension
  • With static masking, branch reset operations restore original data, requiring anonymization to be run again
  • Additional pg_catalog functions cannot be declared as TRUSTED in Neon's implementation

Conclusion

This extension provides a toolkit for protecting sensitive data in Postgres databases. By defining appropriate masking rules, you can create anonymized datasets that maintain usability while protecting individual privacy.


Need help?

Join our Discord Server to ask questions or see what others are doing with Neon. For paid plan support options, see Support.
