Anonymize PostgreSQL Database

This guide will show you how to anonymize an existing PostgreSQL database.

1) Import Data Model

a) Import Tables for Existing Projects

Click on Import Table to open the Import Tables dialog.

Import existing Table Schemas

b) Import Tables for New Projects

After saving your new project and loading data from an existing PostgreSQL database, the Import dialog appears automatically.

Import Tables for New Projects

Import Schema Dialog

The Import dialog window allows you to browse your PostgreSQL databases and tables. Here are the steps to follow:

  • Browse Databases: Double-click on a database to view its available tables.

  • Load Schema: Click on a table to load its schema, including columns and foreign keys (if available), as well as additional SQL definitions.

Selecting Tables

  • Select Tables: Select the tables you want to anonymize or work with by checking the boxes next to their nodes.

  • Import Tables: Click on Import into Data Model to import the selected tables into your data model.

Selecting the desired tables

2) Defining Fakers and Configurations

After importing the schemas into your DataFakery project, define the desired fakers and configurations to proceed with the anonymization process.

Project ready for Anonymization

In this example, we create two fakers for the columns Login ID and Job Title. Additionally, we add the suffix $$ to clearly mark the anonymized data.

Add Fakers to the Project

Click here for further information on setting up fakers and generators to anonymize your data.

Generators / Fakers
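The suffix idea above can be sketched in a few lines of Python. This is only an illustration of what such fakers might produce; DataFakery's actual generators are configured in the UI, and the value formats below are assumptions:

```python
import random
import string

# Illustrative pool of job titles (assumed, not taken from DataFakery).
JOB_TITLES = ["Engineer", "Analyst", "Manager", "Consultant"]

def fake_login_id(length=8, suffix="$$"):
    """Generate a random login ID and append the suffix that marks it as anonymized."""
    body = "".join(random.choices(string.ascii_lowercase + string.digits, k=length))
    return body + suffix

def fake_job_title(suffix="$$"):
    """Pick a random job title and append the anonymization suffix."""
    return random.choice(JOB_TITLES) + suffix

print(fake_login_id())   # e.g. "k3x9qw2p$$"
print(fake_job_title())  # e.g. "Analyst$$"
```

The $$ suffix makes anonymized values easy to spot when reviewing the exported data later.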

3) Export the Anonymized Data

The Export section in the ribbon toolbar provides various options for exporting your data. Here are the functionalities available:

Export Options
  1. Generate Scripts

    • Click this option to generate scripts for your data. This is useful for creating SQL scripts or other types of data manipulation scripts based on your current dataset and configurations.

  2. Export to Database

    • Use this option to export your data directly to a database. This is helpful for seamlessly integrating your generated or transformed data into your database systems.

  3. Docker Containers

    • Click this option to manage and configure Docker containers for your database. This allows you to run your databases in isolated environments.

  4. Export to File

    • This option allows you to export your data to a file. This can be used for creating backups, sharing datasets, or importing the data into other applications.

Use these export options to manage and utilize your data effectively, whether you need scripts, database integration, Docker container management, or file exports.

Export to T-SQL-Files

In this example, we choose Generate Scripts, which generates *.sql files in the desired folder. Since we only anonymized and need the Employee table, we select it exclusively.

Export Options

To export your data to T-SQL files, follow these steps:

  1. Generate Scripts: Click on the Generate Scripts icon in the ribbon toolbar.

  2. Select Folder: Choose the desired folder where the *.sql files will be saved.

  3. Select Table: Since we only need the anonymized Employee table, select Employee from the list of available tables.

  4. Export: Proceed with the script generation. The selected table's data will be exported to the specified folder as a T-SQL file.

This process ensures that only the necessary data, in this case, the Employee table, is exported, facilitating efficient data handling and usage.
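The shape of the generated script can be sketched as follows. This is a minimal Python illustration of rendering anonymized rows as T-SQL INSERT statements; the table layout, column names, and values are assumptions, not DataFakery's actual output format:

```python
# Assumed anonymized rows for the Employee table (LoginID, JobTitle).
rows = [
    ("k3x9qw2p$$", "Engineer$$"),
    ("a7m1zt4c$$", "Analyst$$"),
]

def build_insert_script(table, columns, rows):
    """Render rows as T-SQL INSERT statements, one per row, escaping single quotes."""
    lines = []
    for row in rows:
        values = ", ".join("'{}'".format(v.replace("'", "''")) for v in row)
        lines.append(f"INSERT INTO {table} ({', '.join(columns)}) VALUES ({values});")
    return "\n".join(lines)

script = build_insert_script("Employee", ["LoginID", "JobTitle"], rows)

# Write the script to a *.sql file, as Generate Scripts does for the chosen folder.
with open("Employee.sql", "w") as f:
    f.write(script)

print(script)
```

Each row becomes one INSERT statement, so the resulting file can be replayed against any target database.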

Finished

The preview pane shows the content of the generated SQL script file, allowing you to review the changes and data insertions.

  • Carefully review the generated script to ensure it meets your requirements. Check for correct anonymization and data integrity.

  • Once satisfied with the evaluation, you can proceed to execute the script in your SQL environment to apply the changes to your database.
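The review step can be partly automated. As a sketch, the snippet below scans a generated script for quoted values that are missing the $$ marker from our faker setup; the sample script content and the $$ convention are assumptions from this example:

```python
import re

# Sample content of a generated script (assumed for illustration).
script = "INSERT INTO Employee (LoginID, JobTitle) VALUES ('k3x9qw2p$$', 'Analyst$$');"

# Extract every quoted value and flag any that lack the anonymization suffix.
values = re.findall(r"'([^']*)'", script)
unmarked = [v for v in values if not v.endswith("$$")]

print("all values anonymized" if not unmarked else f"check these: {unmarked}")
```

A check like this catches columns that were exported without a faker before the script is executed against the database.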
