puru Nichesoft

Migration of MongoDB 4.0 to Oracle DB on cloud

Updated: May 9, 2023

Project Description:

This project involved migrating a MongoDB 4.0 database to an on-premises Oracle 19c database. The data was moved from MongoDB to Oracle through a data integration (ETL) process and migrated with 100% accuracy between source and target. Go-live was completed with minimal downtime, within the scheduled maintenance window defined by the customer.

Project Details:

  • Number of databases on MongoDB Cluster: 20

  • Data size with compression: 2 TB

  • Data size without compression: 4 TB

  • Collections: 400+

Project Duration:

4 months

Team size:

6 members

Tools used:

NicheSoft customized tools and scripts/automation, MongoDB Change Streams for real-time data migration, data comparison tools and scripts, and ETL tools.
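A minimal sketch of how such a data comparison script might work, assuming records from both sides have been fetched into memory and keyed by `_id`. The function and field names are illustrative, not NicheSoft's actual tooling:

```python
import hashlib

def row_checksum(row):
    """Checksum one record. Fields are sorted first so that key order
    (which MongoDB does not guarantee) cannot affect the result."""
    canonical = "|".join(f"{k}={row[k]}" for k in sorted(row))
    return hashlib.sha256(canonical.encode("utf-8")).hexdigest()

def compare_datasets(source_rows, target_rows, key="_id"):
    """Return the keys whose checksums differ between source and target,
    including rows that exist on only one side."""
    src = {r[key]: row_checksum(r) for r in source_rows}
    tgt = {r[key]: row_checksum(r) for r in target_rows}
    mismatched = {k for k in src.keys() | tgt.keys() if src.get(k) != tgt.get(k)}
    return sorted(mismatched)
```

In practice the rows would be streamed in batches from MongoDB and Oracle rather than loaded whole, but the checksum-and-diff idea is the same.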


Migration strategy:

  • Initial Requirement Gathering

  • Analyzing the statistics for each database, collection, field

  • Identifying the missing fields from the row/document

  • Defining the data type for each field

  • Data Modeling - Preparing ER diagrams for target

  • Normalizing the collections

  • Identifying the required indexes

  • Developing custom scripts to migrate & validate historical data

  • Developing custom scripts & setting up tools to enable real-time streaming

  • Setting up the target (Oracle) RDBMS for Development, QA, Performance, and Staging

  • Migrating historical data

  • Validating the source & target data

  • QA certification & sign off

  • Performance validation, certification & sign off

  • Migrating Stage and Production

  • Monitoring the performance

  • Enabling reverse migration for Rollback

  • Stage & production certification & sign off

  • Releasing/removing the resources
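The early analysis steps above (per-collection statistics, identifying missing fields, defining data types) can be sketched as a simple profiler run over a sample of documents; the document shapes here are illustrative:

```python
from collections import defaultdict

def profile_collection(documents):
    """For a sample of documents from one collection, report how often
    each field appears and which value types were observed. Coverage
    below 1.0 flags a field that is missing from some documents, and
    multiple observed types flag a column needing type reconciliation."""
    total = len(documents)
    presence = defaultdict(int)
    types = defaultdict(set)
    for doc in documents:
        for field, value in doc.items():
            presence[field] += 1
            types[field].add(type(value).__name__)
    return {
        field: {
            "coverage": presence[field] / total,
            "types": sorted(types[field]),
        }
        for field in presence
    }
```

Output like `{"age": {"coverage": 0.66, "types": ["int", "str"]}}` directly feeds the data-modeling step: it tells you the Oracle column must be nullable and that string values need conversion before load.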

A parallel environment was created for Development, QA, Staging, and Production. Real-time streaming of data was enabled for the Staging and Production environments. This reduced the go-live downtime and allowed parallel-environment testing (covering functionality, performance, and data accuracy) with high confidence, which in turn gave the customer a high degree of confidence to go live on the new environment. Monitoring was enabled and validated for different scenarios before the go-live activity.
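One plausible way to wire the real-time streaming is to map each MongoDB change stream event to an Oracle statement plus bind values. The table name, bind names, and the plain `INSERT` (a real migration would use an Oracle `MERGE` for upserts) are all simplifications for illustration:

```python
def change_event_to_sql(event, table):
    """Translate a MongoDB change stream event into an Oracle DML
    statement template plus bind values. Only the operation types
    relevant to this kind of migration are handled."""
    op = event["operationType"]
    key = event["documentKey"]["_id"]
    if op == "delete":
        return f"DELETE FROM {table} WHERE id = :id", {"id": key}
    if op in ("insert", "replace", "update"):
        # For "update" events, fullDocument is only present when the
        # stream is opened with full_document="updateLookup".
        doc = event["fullDocument"]
        cols = sorted(doc)
        sql = (f"INSERT INTO {table} ({', '.join(cols)}) "
               f"VALUES ({', '.join(':' + c for c in cols)})")
        return sql, dict(doc)
    raise ValueError(f"unhandled operation type: {op}")

# In production this would consume a live stream, roughly:
#   with client["mydb"]["orders"].watch(full_document="updateLookup") as stream:
#       for event in stream:
#           sql, binds = change_event_to_sql(event, "ORDERS")
#           cursor.execute(sql, binds)  # via an Oracle driver such as oracledb
```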


Data migration challenges handled:

  • Handling missing fields/columns (dynamic schema)

  • Migrating application users & credentials stored in collections

  • Data modelling

  • Normalization of tables

  • Creating relationships between tables

  • Native compression

  • Data expiration

  • Migrating GridFS data

  • Data encryption/decryption

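The normalization and relationship challenges above can be illustrated with a hypothetical `orders` collection that embeds its line items: splitting one denormalized document into a parent row plus child rows linked by a foreign key is the core move when moving from MongoDB to relational tables.

```python
def normalize_order(doc):
    """Split one denormalized order document into a parent row and
    child rows. The child rows carry order_id as a foreign key back
    to the parent, which becomes a relationship in Oracle."""
    parent = {k: v for k, v in doc.items() if k != "items"}
    children = [
        {"order_id": doc["_id"], "line_no": i + 1, **item}
        for i, item in enumerate(doc.get("items", []))
    ]
    return parent, children
```

The parent row loads into an `ORDERS` table and the children into an `ORDER_ITEMS` table with a foreign-key constraint on `order_id` (table names here are hypothetical).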


