AWS Data Pipeline is a web service that you can use to automate the movement and transformation of data. A common streaming pattern involves ingesting source data to create the initial datasets in a pipeline. Note: the AWS Data Pipeline service is in maintenance mode, and no new features or Region expansions are planned. To learn more, and to find out how to migrate your existing workloads, see Migrating workloads from AWS Data Pipeline.

During an import operation, the Data Pump Import utility uses these files to locate each database object in the dump file set. Import can also be used to load a target database directly from a source database, with no intervening dump files; this allows the export and import operations to run concurrently, minimizing total elapsed time.
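As a sketch of the direct, dump-file-free import described above: Oracle Data Pump supports a network-mode import via the `NETWORK_LINK` parameter, which reads objects from the source database over a database link and loads them straight into the target. The link name, schema, directory, and connect string below are hypothetical placeholders, not values from this document:

```
# net_import.par -- hypothetical parameter file for a network-mode import.
# source_link is a database link created on the target that points at the source DB.
NETWORK_LINK=source_link
SCHEMAS=hr
DIRECTORY=dp_dir
LOGFILE=net_imp.log
```

It would then be invoked on the target as something like `impdp hr@target_db PARFILE=net_import.par`. Because no dump file set is written, the extract on the source and the load on the target proceed as one concurrent operation.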