Migrate to PostHog Cloud


Prior to starting a historical data migration, ensure you do the following:

  1. Create a project on our US or EU Cloud.
  2. Sign up to a paid product analytics plan on the billing page (historical imports are free, but this unlocks the necessary features).
  3. Raise an in-app support request with the Data pipelines topic detailing where you are sending events from, how, the total volume, and the speed. For example, "we are migrating 30M events from a self-hosted instance to EU Cloud using the migration scripts at 10k events per minute."
  4. Wait for the OK from our team before starting the migration process to ensure that it completes successfully and is not rate limited.
  5. Set the historical_migration option to true when capturing events in the migration.
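For step 5, the shape of the request matters if you send events to the batch API yourself. A minimal sketch of building such a payload, assuming you call the /batch/ endpoint directly (build_batch_payload is a hypothetical helper; field names follow PostHog's batch event format):

```python
import json

# Hypothetical helper: assemble a /batch/ payload for a historical import.
# Setting historical_migration to true tells PostHog to treat these events
# as a backfill rather than live traffic.
def build_batch_payload(api_key, events):
    return {
        "api_key": api_key,
        "historical_migration": True,
        "batch": [
            {
                "event": e["event"],
                "distinct_id": e["distinct_id"],
                # Keep the original event time, not the time of the import.
                "timestamp": e["timestamp"],
                "properties": e.get("properties", {}),
            }
            for e in events
        ],
    }

payload = build_batch_payload(
    "phc_example_key",
    [{"event": "$pageview", "distinct_id": "user-1",
      "timestamp": "2023-06-18T13:00:00Z"}],
)
print(json.dumps(payload, indent=2))
```

If you use the Python SDK instead, the equivalent is enabling its historical migration option when constructing the client.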

This guide is relevant to users wanting to migrate both:

  1. From a self-hosted PostHog instance to PostHog Cloud.
  2. Between PostHog Cloud regions (e.g. US -> EU Cloud).

Requirements

  1. An existing project, either on PostHog Cloud or on a self-hosted instance running at least 1.30.0. For upgrade instructions, take a look at this guide.
  2. A new PostHog Cloud project hosted in the region of your choice.

Approach

This migration has 3 steps:

  1. Migrate your metadata (projects, dashboards, insights, actions, cohorts, feature flags, experiments, annotations).

  2. Migrate your events. This also creates the necessary person, person distinct ID, and related records.

  3. Switch tracking in your product to start sending events to the new project (and set up replication from the old project if needed).

Migrate your metadata

To migrate metadata like projects, dashboards, insights, actions, feature flags, and more, use the PostHog migrate metadata script. This requires:

  1. Installing TypeScript and ts-node. You can do this by running npm install -g typescript ts-node in your terminal.
  2. Your old instance personal API key with read access to the project.
  3. Your new cloud instance personal API key with write access to the project, which you can get from your project settings.

Note: This process has the following caveats:

  1. Every object's "created by" information will appear as if it was created by the user who created the personal API key.
  2. Every object's "created at" information will appear as if it was created at the time you ran this script.

To run the script:

  1. Clone the repo and cd into it:
    Terminal
    git clone https://github.com/PostHog/posthog-migrate-meta
    cd posthog-migrate-meta
  2. Install the dependencies by running yarn.
  3. Run the script:
    Terminal
    ts-node index.ts --source [posthog instance you want to migrate from] --sourcekey [personal API key for that instance] --destination [posthog instance you want to migrate to] --destinationkey [personal API key for destination instance]

For more information on the options see the migrate metadata repo's readme.
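Before running the script against production data, it can help to sanity-check that each personal API key authenticates against the right instance. A rough sketch using only the standard library (the /api/projects/ endpoint and Bearer-token header follow PostHog's API convention; build_projects_request is a hypothetical helper, and you would pass the request to urllib.request.urlopen to actually send it):

```python
import urllib.request

# Hypothetical pre-flight check: build an authenticated request to list
# projects, so you can confirm a personal API key works before migrating.
def build_projects_request(host, personal_api_key):
    return urllib.request.Request(
        f"{host.rstrip('/')}/api/projects/",
        headers={"Authorization": f"Bearer {personal_api_key}"},
    )

req = build_projects_request("https://us.posthog.com", "phx_example_key")
print(req.full_url)
```

Run this once with the source key and source host, and once with the destination key and destination host, before kicking off the full migration.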

Migrate your events to Cloud

Before you start, disable any transformation and filtering apps in the destination Cloud project (e.g. GeoIP). Keeping these enabled may change the events you are migrating.

For more details about historical migrations, see our migration docs.

Migrating events from self-hosted to Cloud

To migrate your events, you can read data directly from your ClickHouse cluster and ingest the data with the Python library using our self-hosted migration tool.

First, clone the repo and install the requirements.

Terminal
git clone https://github.com/PostHog/posthog-migration-tools
cd posthog-migration-tools
pip3 install -r requirements.txt

Next, run the migration script with your ClickHouse details, PostHog details, start date, end date, and fetch limit.

Terminal
python3 ./migrate.py \
--clickhouse-url https://some.clickhouse.cluster:8443 \
--clickhouse-user default \
--clickhouse-password some-password \
--clickhouse-database posthog \
--team-id 1234 \
--posthog-url https://us.posthog.com \
--posthog-api-token "abx123" \
--start-date 2023-06-18T13:00:00Z \
--end-date 2023-06-18T13:10:00Z \
--fetch-limit 10000

This script prints a "cursor" if the migration fails. You can resume from where it left off by adding the --cursor argument to the command above.

Terminal
python3 ./migrate.py \
--clickhouse-url https://some.clickhouse.cluster:8443 \
--clickhouse-user default \
--clickhouse-password some-password \
--clickhouse-database posthog \
--team-id 1234 \
--posthog-url https://us.posthog.com \
--posthog-api-token "abx123" \
--start-date 2023-06-18T13:00:00Z \
--end-date 2023-06-18T13:10:00Z \
--fetch-limit 10000 \
--cursor the-cursor-value-from-the-output

Notes:

  • This script adds a $lib property of posthog-python, overriding any $lib property already set.
  • If the script fails for some reason, just run it again with the latest cursor. Some transient issues are solved by re-running the script.

Migrating events between Cloud instances (e.g. US -> EU Cloud)

You must raise a support ticket in-app with the Data pipelines topic for the PostHog team to do this migration for you. This option is only available to customers on the team or enterprise plan as it requires significant engineering time.

Switching tracking in your product

Now that your events have been migrated, the next step is to switch over tracking within your product so that new events are sent to your new PostHog Cloud instance.

Note: To make sure your person properties get the latest values, don't switch over tracking until historical events have been migrated.

  1. Re-enable any apps that you disabled earlier (e.g. GeoIP).

  2. Begin swapping out your project API key and instance address in the product or site. Once done, events using the new API key and host will go to your new Cloud instance.

Migrating your custom transformations or destinations

For custom transformations:

  1. Check if we already have a transformation that does what you need (fastest option). You can see the list of transformations here.

  2. Move this logic to your app before you send the event (also fast).

  3. If you can make your app generalizable enough that others can benefit, submit your app to the store. To do this, see the build docs.

For custom destinations:

  1. Check to see if we already have a destination or batch export that does what you need (fastest option). You can see the list of destinations and batch exports here.

  2. Convert your app to work as a webhook (also fast). These are currently in beta. See the details here.

  3. If you can make your app generalizable enough that others can benefit, submit your app to the store. To do this, see the build docs.

If the options above don't work and you were previously paying a substantial amount self-hosting, then email us at sales@posthog.com with a link to the public GitHub repo and we can see if it's appropriate as a private cloud app.
