What is the best approach to replicate data from PostgreSQL to DynamoDB?

I’m currently working on a web site which implements role-based access control to serve private pages to its end users.

Whether a user is allowed to access a given page is based on which roles are assigned to that user.

Management of users and role assignment is done through an administrative UI by web site admins.
User and role data are currently stored in a Postgres database on the admin infrastructure (not reachable from the web site application serving our end users).

Storing users and roles in Postgres allows the user/role admin UI to leverage RDBMS capabilities like pagination, sorting, and filtering by any indexed field, as well as join queries like “Which users is this role assigned to?”.

In contrast to that, the web site running on our frontend infrastructure needs to perform only one single query pattern: “Does the current user have the role required to access the requested page?”
Also, as opposed to the admin UI, the web site must support a high number of requests per day.

Given the two requirements above (a simple query pattern and the need to scale), it seems reasonable to keep a copy of the user/role data in DynamoDB tables to serve the frontend queries.
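For illustration, here is a minimal sketch of what that could look like, assuming a hypothetical `user_roles` DynamoDB table with partition key `user_id` and sort key `role` (one item per role assignment); the table and attribute names are placeholders, not something I've settled on:

```python
import boto3

# Hypothetical table: partition key "user_id", sort key "role".
# Modelling one item per (user, role) assignment turns the frontend
# check into a single key lookup with no scan or filter expression.
dynamodb = boto3.resource("dynamodb")
table = dynamodb.Table("user_roles")  # assumed table name

def user_has_role(user_id: str, role: str) -> bool:
    """Return True if an item exists for this (user_id, role) pair."""
    response = table.get_item(Key={"user_id": user_id, "role": role})
    return "Item" in response
```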

But that would require keeping the DynamoDB tables constantly synchronized with the user/role data in Postgres as users are created, updated, and deleted and roles are assigned to users.

That said, my question is: what would be a fast, simple, and fault-tolerant approach to perform Postgres-to-DynamoDB replication?

I thought about the user/role backend code emitting events like “user-created”, “user-deleted”, or “role-assigned” to an SNS topic right after performing a successful update query on Postgres. Those events would in turn trigger one or more Lambda functions that would perform the respective action on DynamoDB.
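To make that concrete, here is a minimal sketch of such a Lambda consumer, assuming the hypothetical `user_roles` table from above and an invented message shape of `{"type": ..., "user_id": ..., "role": ...}`; the event types and payload fields are assumptions, not an established schema:

```python
import json

import boto3
from boto3.dynamodb.conditions import Key

dynamodb = boto3.resource("dynamodb")
table = dynamodb.Table("user_roles")  # assumed table name

def handler(event, context):
    # SNS invokes the function with one or more records; each record
    # carries the published message as a JSON string.
    for record in event["Records"]:
        message = json.loads(record["Sns"]["Message"])
        event_type = message["type"]

        if event_type == "role-assigned":
            # put_item overwrites an identical item, so a redelivered
            # event is harmless (SNS delivery is at-least-once).
            table.put_item(Item={"user_id": message["user_id"],
                                 "role": message["role"]})
        elif event_type == "role-revoked":
            table.delete_item(Key={"user_id": message["user_id"],
                                   "role": message["role"]})
        elif event_type == "user-deleted":
            # Remove every role item for the user; querying the
            # partition key returns all of them.
            items = table.query(
                KeyConditionExpression=Key("user_id").eq(message["user_id"])
            )["Items"]
            for item in items:
                table.delete_item(Key={"user_id": item["user_id"],
                                       "role": item["role"]})
```

Keeping each handler idempotent seems to matter here, since SNS-to-Lambda delivery is at least once; a dead-letter queue on the Lambda could catch events that repeatedly fail to apply.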

Thanks!
