AWS Oracle DBA Notes

Uploaded by

naani kumar

EC2 to RDS Migration Notes

EC2 (any server you have full OS access to) supports all migration methods:

= PDB clone
= unplug & plug PDB
= RMAN

RDS: you can only access the database from outside

= you cannot access the server on which RDS is hosted !! :(

= you can only access the one PDB created inside the CDB (via the admin/master user)
= you cannot create any new PDBs (no SYS access to the CDB root)
= access is limited to the PDB; there is no access to the underlying storage
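These limits are easy to see from the RDS master user (the user name admin here is an assumption); a sketch:

```sql
-- Connected as the RDS master user, e.g. sqlplus admin@akpdb

-- List the privileges the master user actually has
SELECT privilege FROM session_privs ORDER BY privilege;

-- CDB-level operations are not possible: there is no SYS access to the
-- CDB root, so e.g. CREATE PLUGGABLE DATABASE fails with ORA-01031
```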

EC2 >> Apps >> Oracle >> App-specific schemas (own tables & other objects)

= DBA user —> administers the DB

= App user —> owns the app data

= Data Pump schema / table migration (dump file)

= problem: how to put the dump file on the RDS server when we don't have access to the RDS server itself?

= S3 bucket —> or AWS Snowball for very large transfers (extra cost)

= sqlplus (EC2) >> database link to RDS >> DBMS_FILE_TRANSFER

= if you don't have access to the RDS server's OS, where will you run impdp? On the EC2 box:

= impdp ...@rds (run from EC2, connecting over the network)
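Both the DBMS_FILE_TRANSFER and the impdp-over-the-network approaches rely on a database link from the EC2 database to the RDS PDB. A minimal sketch, assuming the link is named AKPDB (matching the destination_database used in the PUT_FILE call below), the RDS master user is admin, and a placeholder endpoint:

```sql
-- On the EC2 source database. User, password, and RDS endpoint
-- below are placeholders; adjust to your environment.
CREATE DATABASE LINK akpdb
  CONNECT TO admin IDENTIFIED BY "placeholder_pw"
  USING '(DESCRIPTION=
            (ADDRESS=(PROTOCOL=TCP)(HOST=myrds.xxxx.us-east-1.rds.amazonaws.com)(PORT=1521))
            (CONNECT_DATA=(SERVICE_NAME=AKPDB)))';

-- Verify the link before using it with DBMS_FILE_TRANSFER
SELECT sysdate FROM dual@akpdb;
```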

RDS = good for small scale databases


EC2 = good for large scale databases

Migration can go in either direction:

EC2 (physical) >> RDS

RDS >> EC2

One EC2 & one RDS >> migrate a SCHEMA inside a PDB

= Create a new user TOM <all privileges> in FIN_PDB

= make sure TNS entries are set for the specific PDB

= TOM owns 1 or 2 tables, plus VIEWs on those tables

= task: move the TOM schema from EC2 to RDS
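The setup above might look like this on the EC2 side (password, table names, and columns are placeholders; "<all privileges>" is shown as a DBA grant, though narrower grants are safer in practice):

```sql
-- In FIN_PDB on EC2: create the schema that will be migrated
CREATE USER tom IDENTIFIED BY "placeholder_pw";
GRANT DBA TO tom;  -- stands in for "<all privileges>" from the note

-- One or two tables plus a view on them, as described
CREATE TABLE tom.emp  (id NUMBER PRIMARY KEY, name  VARCHAR2(50));
CREATE TABLE tom.dept (id NUMBER PRIMARY KEY, dname VARCHAR2(50));
CREATE VIEW tom.emp_v AS SELECT id, name FROM tom.emp;
```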

----------------------------------------

= Export TOM Schema from EC2

expdp directory=data_pump_dir dumpfile=tom.dmp logfile=tom_export.log schemas='TOM'

= Push dump file on RDS server

BEGIN
  DBMS_FILE_TRANSFER.PUT_FILE(
    source_directory_object      => 'DATA_PUMP_DIR',  -- directory object names are upper case
    source_file_name             => 'tom.dmp',
    destination_directory_object => 'DATA_PUMP_DIR',
    destination_file_name        => 'tom.dmp',
    destination_database         => 'AKPDB');         -- database link to the RDS PDB
END;
/

= Perform the import (run impdp from EC2, connecting to RDS over TNS)

Import as-is (source schema TOM >> target schema TOM):

impdp admin@akpdb DIRECTORY=DATA_PUMP_DIR dumpfile=tom.dmp logfile=tom_import.log

Import with a schema remap (source TOM >> target SCOTT):

impdp admin@rdspdb DIRECTORY=DATA_PUMP_DIR dumpfile=scott.dmp logfile=import.log remap_schema='TOM:SCOTT'

= clean up: remove the dump file from RDS after the import

exec utl_file.fremove('DATA_PUMP_DIR','tom.dmp');

DBSID >> TNS ENTRY >> Database Link

AKPDB >> AKPDB >> AKPDB

FIN_PDB >> FIN_PDB >> FIN_PDB

= db link naming convention: <owner>_<server>_<DB>

GE_DO_HR_PDB

GE_AWS_HR_PDB

GE_IBM_HR_PDB

FIN_PDB >> MATT_AWS_FIN_PDB >> MATT_AWS_FIN_PDB

FIN_PDB >> PIUS_AWS_FIN_PDB >> PIUS_AWS_FIN_PDB
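Following the <owner>_<server>_<DB> convention, Matt's link to the AWS-hosted FIN_PDB might be created like this (user and password are placeholders, and MATT_AWS_FIN_PDB is assumed to already exist as a TNS alias):

```sql
-- db link name matches the TNS entry, per the convention above
CREATE DATABASE LINK matt_aws_fin_pdb
  CONNECT TO matt IDENTIFIED BY "placeholder_pw"
  USING 'MATT_AWS_FIN_PDB';

SELECT sysdate FROM dual@matt_aws_fin_pdb;
```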

PHYSICAL VM TO EC2 Migration Notes

MIGRATION pre-checks:

- Can I connect to the source via PuTTY?
- Can I connect to the target via PuTTY?
- Can source & target ping each other?

Flow (Activity)

Flow —> like booking an Uber (a fixed sequence of steps)

Port forwarding is NOT DONE BY THE DBA - it is the NETWORK TEAM's job

INTERNET >> Router >> VM (bridged adapter) (LAPTOP)

INTERNET = public IP + PORT (6600)

Router = forwards it to VM port 1521

INTERNET >> Router >> router >> router

183.82.112.100 >> Router (6600) >> VM (1521)

ARUN_VM_PDB1 =
  (DESCRIPTION =
    (ADDRESS_LIST =
      (ADDRESS = (PROTOCOL = TCP)(HOST = 183.82.112.100)(PORT = 6600))
    )
    (CONNECT_DATA =
      (SERVICE_NAME = ORA12C)
    )
  )

Buy public IP from ISP >> Router (4400) >> VM 1521

LISTENER =
  (DESCRIPTION_LIST =
    (DESCRIPTION =
      (ADDRESS = (PROTOCOL = TCP)(HOST = 192.168.1.81)(PORT = 1521))
    )
  )

SID_LIST_LISTENER =
  (SID_LIST =
    (SID_DESC =
      (ORACLE_HOME = /u01/app/oracle/product/19.3/db_home)
      (SID_NAME = PDB1)
    )
  )


Your system >> Source: unplug the PDB into a .pdb archive & scp it to your system

Your system >> scp the archive file to the target >> plug it in

Source (VM) >> WinSCP >> Your system (staging server) >> WinSCP >> Target /tmp
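The unplug/plug flow above can be sketched in SQL (the PDB name pdb1 and the /tmp paths are assumptions):

```sql
-- On the source CDB, as SYS:
ALTER PLUGGABLE DATABASE pdb1 CLOSE IMMEDIATE;
ALTER PLUGGABLE DATABASE pdb1 UNPLUG INTO '/tmp/pdb1.pdb';  -- .pdb = single archive file

-- scp /tmp/pdb1.pdb to your system (staging), then to the target's /tmp

-- On the target EC2 CDB, as SYS:
CREATE PLUGGABLE DATABASE pdb1 USING '/tmp/pdb1.pdb';
ALTER PLUGGABLE DATABASE pdb1 OPEN;
```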

NON-CDB >> no direct unplug/plug possible << EC2 (CDB)

= RMAN backup
= full backup
= how to convert a non-CDB into a PDB?
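A non-CDB cannot be plugged in directly; the usual route is to describe it with DBMS_PDB, plug it in using the generated manifest, and then convert the dictionary. A sketch, with all paths and the PDB name assumed:

```sql
-- On the non-CDB: open read only and generate the XML manifest
SHUTDOWN IMMEDIATE
STARTUP MOUNT
ALTER DATABASE OPEN READ ONLY;
EXEC DBMS_PDB.DESCRIBE(pdb_descr_file => '/tmp/noncdb.xml');
SHUTDOWN IMMEDIATE

-- On the target EC2 CDB: plug in using the manifest (paths are placeholders)
CREATE PLUGGABLE DATABASE fin_pdb USING '/tmp/noncdb.xml'
  COPY FILE_NAME_CONVERT = ('/u01/oradata/NONCDB/', '/u01/oradata/CDB/fin_pdb/');

-- Convert the data dictionary and open the new PDB
ALTER SESSION SET CONTAINER = fin_pdb;
@$ORACLE_HOME/rdbms/admin/noncdb_to_pdb.sql
ALTER PLUGGABLE DATABASE fin_pdb OPEN;
```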
