Tuesday, January 4, 2011

Database refresh process

Hi,
A database refresh can be done using various techniques, but the main idea is always the same: bring fresh production data into your Development, Test, or Performance database environment so that developers and QAs can use that data, make the needed changes, and test as required. Below are the steps I followed.

REFRESH PROCESS 1:

In refresh process 1, I used the exp/imp and expdp/impdp utilities to transfer data from the source to the target database.

Steps we followed for the refresh:
Step 1: Check whether the tablespaces exist, and check their sizes, in the source and destination databases for the tablespaces mentioned below:
Source: STARQA_GDC    Destination: orcl (my desktop database)
STAR02D, STAR02I, STAR01D & STAR01I
SELECT F.TABLESPACE_NAME,
       TO_CHAR ((T.TOTAL_SPACE - F.FREE_SPACE), '999,999') "USEDMB",
       TO_CHAR (F.FREE_SPACE, '999,999') "FREEMB",
       TO_CHAR (T.TOTAL_SPACE, '999,999') "TOTALMB",
       TO_CHAR (ROUND ((F.FREE_SPACE/T.TOTAL_SPACE)*100), '999') || ' %' FREE
FROM   (SELECT TABLESPACE_NAME,
               ROUND (SUM (BLOCKS * (SELECT VALUE/1024
                                     FROM   V$PARAMETER
                                     WHERE  NAME = 'db_block_size')/1024)) FREE_SPACE
        FROM   DBA_FREE_SPACE
        GROUP  BY TABLESPACE_NAME) F,
       (SELECT TABLESPACE_NAME,
               ROUND (SUM (BYTES/1048576)) TOTAL_SPACE
        FROM   DBA_DATA_FILES
        GROUP  BY TABLESPACE_NAME) T
WHERE  F.TABLESPACE_NAME = T.TABLESPACE_NAME;


Check whether these tablespaces exist; if so, check their sizes, add space if required, and send me the output of the above query run in orcl.

SOURCE DATABASE:STARQA_GDC
--------------------
TABLESPACE_NAME USEDMB FREEMB TOTALMB FREE
------------------------------ -------- -------- -------- ------
STAR01D 18,055 14,673 32,728 45 %
STAR01I 2,067 933 3,000 31 %
STAR02D 32,706 2,004 34,710 6 %
STAR02I 3,003 1,497 4,500 33 %

DESTINATION DATABASE:ORCL
-----------------------
TABLESPACE_NAME USEDMB FREEMB TOTALMB FREE
------------------------------ -------- -------- -------- ------
STAR01D 7,729 11,371 19,100 60 %
STAR01I 1,898 143 2,041 7 %
STAR02D 23,813 1,136 24,949 5 %
STAR02I 2,969 159 3,128 5 %
Compare the USEDMB column in the source database (STARQA_GDC) against the TOTALMB column in the destination database (ORCL) carefully and verify the space needed for the refresh. If the space in a particular tablespace is insufficient, increase the size of that tablespace so that the import operation, and hence the database refresh, succeeds.
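As a rough sketch of that space check, using the figures from the tables above (the numbers are hard-coded here purely for illustration and must be updated from the query output for each refresh):

```shell
#!/bin/sh
# Flag any tablespace where the source's used space exceeds the
# destination's total size, meaning a resize is needed before import.
# Figures are in MB, taken from the sample query output above.
check_space() {
  name=$1; src_used=$2; dst_total=$3
  if [ "$src_used" -gt "$dst_total" ]; then
    echo "$name: grow destination tablespace by at least $((src_used - dst_total)) MB"
  else
    echo "$name: OK"
  fi
}
check_space STAR01D 18055 19100
check_space STAR01I 2067  2041
check_space STAR02D 32706 24949
check_space STAR02I 3003  3128
```

With the sample figures, STAR01I and STAR02D come out short and would need to be extended before Step 6.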

Step 2: Drop the users whose data needs to be refreshed.
Use the script below for this; it is located at /work/Rafijobs/Dropstarusers.sql.
DROP USER STARTXN CASCADE;
DROP USER STARREP CASCADE;
DROP USER STARREPAPP CASCADE;
DROP USER STARMIG CASCADE;
DROP USER STARTXNAPP CASCADE;
Run the script Dropstarusers.sql at the SQL prompt:
SQL>@Dropstarusers.sql

Step 3: Create the users whose data needs to be refreshed.
Use the script located at /work/Rafijobs/createstarusers.sql, or generate the script from the TOAD tool, and create the users with the required privileges and grants.
We need to create the five users STARTXN, STARREP, STARREPAPP, STARMIG & STARTXNAPP whose data needs to be refreshed.

Step 4: Copy the export dump files to the destination location using the WinSCP utility.

Step 5: Create the directory for the Data Pump import:
create directory IMP_DP2_JUN as 'D:\GDC_21_JUNE_BKP';
grant read,write on directory imp_dp2_jun to public;
(or, since the import is done as the SYSTEM user, the narrower grant:)
grant read,write on directory imp_dp2_jun to system;

Step 6: Import the dump files on the desktop.
The import commands for the 5 users are below; the Data Pump import is run against the dump files received from the GDC team, after unzipping them:
impdp system/database directory=IMP_DP2_JUN dumpfile=STARMIG_210610.dmp logfile=STARMIG_210610.log
impdp system/database directory=IMP_DP2_JUN dumpfile=STARREP_210610.dmp logfile=STARREP_210610.log
impdp system/database directory=IMP_DP2_JUN dumpfile=STARREPAPP_210610.dmp logfile=STARREPAPP_210610.log
impdp system/database directory=IMP_DP2_JUN dumpfile=STARTXN_210610.dmp logfile=STARTXN_210610.log
impdp system/database directory=IMP_DP2_JUN dumpfile=STARTXNAPP_210610.dmp logfile=STARTXNAPP_210610.log
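The five impdp runs above follow one pattern, so a small loop can generate them. A minimal sketch, where build_imp_cmd only prints each command so it can be reviewed before actually running it:

```shell
#!/bin/sh
# Generate the five impdp commands from the schema list instead of
# typing each one. DATE_TAG matches the dump-file suffix used above.
DATE_TAG=210610
build_imp_cmd() {
  echo "impdp system/database directory=IMP_DP2_JUN dumpfile=${1}_${DATE_TAG}.dmp logfile=${1}_${DATE_TAG}.log"
}
for schema in STARMIG STARREP STARREPAPP STARTXN STARTXNAPP; do
  build_imp_cmd "$schema"    # pipe the output to sh (or use eval) to run
done
```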

Step 7: Verify the log files for each import.
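One quick way to verify the logs is to count the ORA- error lines in each; a clean refresh should report zero per log. A minimal sketch (the sample log written under /tmp is purely illustrative; point check_log at the real import logs):

```shell
#!/bin/sh
# Report the number of ORA- error lines in an import log file.
check_log() {
  echo "$1: $(grep -c 'ORA-' "$1") ORA- line(s)"
}
# Illustrative sample log; replace with the real logfiles from Step 6.
printf 'Master table loaded\nORA-39082: created with compilation warnings\n' > /tmp/sample_imp.log
check_log /tmp/sample_imp.log
```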

Step 8: Export on the target box:
starmig:
exp system/database@orcl.world file='/work/Rafi/exp_orcl_starmig_23jun10.dmp' log=/work/Rafi/exp_orcl_starmig_23jun10.log owner=starmig statistics=none
vi imp_starmig.sh   => add the exp command above to this script
To run it in the background without interruption:
nohup sh imp_starmig.sh > a.out &
startxn:
exp system/database@orcl.world file='/work/Rafi/exp_orcl_startxn_23jun10.dmp' log=/work/Rafi/exp_orcl_startxn_23jun10.log owner=startxn statistics=none
nohup sh imp_startxn.sh > b.out &
startxnapp:
exp system/database@orcl.world file='/work/Rafi/exp_orcl_startxnapp_23jun10.dmp' log=/work/Rafi/exp_orcl_startxnapp_23jun10.log owner=startxnapp statistics=none
nohup sh imp_startxnapp.sh > c.out &
starrep:
exp system/database@orcl.world file='/work/Rafi/exp_orcl_starrep_25jun10.dmp' log=/work/Rafi/exp_orcl_starrep_25jun10.log owner=starrep statistics=none
nohup sh imp_starrep25.sh > r.out &
starrepapp:
exp system/database@orcl.world file='/work/Rafi/exp_orcl_starrepapp_23jun10.dmp' log=/work/Rafi/exp_orcl_starrepapp_23jun10.log owner=starrepapp statistics=none
nohup sh imp_starrepapp.sh > e.out &
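The wrap-in-a-script-and-nohup pattern above, sketched end to end with a placeholder job (the /tmp paths and the echo line are illustrative stand-ins for the real exp command), so the mechanics are visible: put the command in a script, detach it with nohup so a dropped terminal does not kill the export, and collect stdout in a file:

```shell
#!/bin/sh
# Write the job into a script (here, a placeholder for the exp command).
cat > /tmp/imp_starmig.sh <<'EOF'
# In the real script this line is the exp command shown above.
echo "export of starmig finished"
EOF
# Run it detached; stdout and stderr are collected in /tmp/a.out.
nohup sh /tmp/imp_starmig.sh > /tmp/a.out 2>&1 &
wait $!            # the real run would simply be left in the background
cat /tmp/a.out
```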
Step 9: Import on the 36 box:
starmig:
imp system/star@STARDEV file='/work/Rafi/exp_orcl_starmig_23jun10.dmp' log=imp_starmig_stardev23jun10.log fromuser=starmig touser=starmig
nohup sh imp_starmigDEV.sh > f.out &
startxn:
imp system/star@STARDEV file='/work/Rafi/exp_orcl_startxn_23jun10.dmp' log=imp_startxn_stardev23jun10.log fromuser=startxn touser=startxn
nohup sh imp_startxnDEV.sh > g.out &
startxnapp:
imp system/star@STARDEV file='/work/Rafi/exp_orcl_startxnapp_23jun10.dmp' log=imp_startxnapp_stardev23jun10.log fromuser=startxnapp touser=startxnapp
nohup sh imp_startxnappDEV.sh > h.out &
starrep:
imp system/star@STARDEV file='/work/Rafi/exp_orcl_starrep_25jun10.dmp' log=imp_starrep_stardev25jun10.log fromuser=starrep touser=starrep
nohup sh imp_starrepDEV.sh > i.out &
starrepapp:
imp system/star@STARDEV file='/work/Rafi/exp_orcl_starrepapp_23jun10.dmp' log=imp_starrepapp_stardev23jun10.log fromuser=starrepapp touser=starrepapp
nohup sh imp_starrepappDEV.sh > j.out &
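The five imp runs in this step also follow one pattern; a minimal sketch that reproduces the commands, with the per-schema date tags mapped explicitly (note starrep uses 25jun10 while the rest use 23jun10):

```shell
#!/bin/sh
# Build the STARDEV imp command for one schema and its dump-date tag.
build_dev_imp() {
  schema=$1; tag=$2
  echo "imp system/star@STARDEV file='/work/Rafi/exp_orcl_${schema}_${tag}.dmp' log=imp_${schema}_stardev${tag}.log fromuser=${schema} touser=${schema}"
}
build_dev_imp starmig    23jun10
build_dev_imp startxn    23jun10
build_dev_imp startxnapp 23jun10
build_dev_imp starrep    25jun10
build_dev_imp starrepapp 23jun10
```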

Step 10: Verify the objects and tables that were imported.

Connect to SQL*Plus as each user:
1)startxn:
SQL>Select count(*) from user_objects;
SQL>Select count(*) from user_tables;
Compare this with the startxn schema in the STARQA_GDC database.
2)Starrep:
SQL>Select count(*) from user_objects;
SQL>Select count(*) from user_tables;
Compare this with the starrep schema in the STARQA_GDC database.
3)Startxnapp:
SQL>Select count(*) from user_objects;
SQL>Select count(*) from user_tables;
Compare this with the startxnapp schema in the STARQA_GDC database.
4)Starrepapp:
SQL>Select count(*) from user_objects;
SQL>Select count(*) from user_tables;

Compare this with the starrepapp schema in the STARQA_GDC database.
5)Starmig:
SQL>Select count(*) from user_objects;
SQL>Select count(*) from user_tables;
Compare this with the starmig schema in the STARQA_GDC database.
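The five checks above can also be generated in one pass instead of five manual logins; a minimal sketch, where gen_check only prints the sqlplus invocation (the password placeholder is an assumption and must be filled in before piping the output to sh):

```shell
#!/bin/sh
# Print the sqlplus command that runs both object counts for one user.
gen_check() {
  printf 'sqlplus -s %s/<password>@orcl <<SQL\nselect count(*) from user_objects;\nselect count(*) from user_tables;\nSQL\n' "$1"
}
for user in startxn starrep startxnapp starrepapp starmig; do
  gen_check "$user"
done
```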

Step 11: Compare the schemas if there is a difference in the object counts.
Open TOAD sessions against the source and destination databases separately.
If there is a schema difference, use TOAD to compare the schemas and find the objects and tables that did not get imported: in TOAD, go to the DBA tab, click Compare Schemas, and select the objects to compare. This lists the objects that are missing in the particular destination schema.

Step 12: Use the exp utility interactively to export the missing objects.

REFRESH PROCESS 2:

In refresh process 2, the indirect method is avoided thanks to the VERSION=10.2 parameter: when the export is done with this parameter, you can import directly, without losing any data, using the expdp and impdp utilities.

Steps for the refresh from source (11.1.0.7) to target (11.1.0.6):

Step 1: Check whether the tablespaces exist, and check their sizes, in the source and target databases for the tablespaces mentioned below:
Source: STARQA_GDC    Target: STARTST
STAR02D, STAR02I, STAR01D & STAR01I
SELECT F.TABLESPACE_NAME,
       TO_CHAR ((T.TOTAL_SPACE - F.FREE_SPACE), '999,999') "USEDMB",
       TO_CHAR (F.FREE_SPACE, '999,999') "FREEMB",
       TO_CHAR (T.TOTAL_SPACE, '999,999') "TOTALMB",
       TO_CHAR (ROUND ((F.FREE_SPACE/T.TOTAL_SPACE)*100), '999') || ' %' FREE
FROM   (SELECT TABLESPACE_NAME,
               ROUND (SUM (BLOCKS * (SELECT VALUE/1024
                                     FROM   V$PARAMETER
                                     WHERE  NAME = 'db_block_size')/1024)) FREE_SPACE
        FROM   DBA_FREE_SPACE
        GROUP  BY TABLESPACE_NAME) F,
       (SELECT TABLESPACE_NAME,
               ROUND (SUM (BYTES/1048576)) TOTAL_SPACE
        FROM   DBA_DATA_FILES
        GROUP  BY TABLESPACE_NAME) T
WHERE  F.TABLESPACE_NAME = T.TABLESPACE_NAME;


Check whether these tablespaces exist; if so, check their sizes, add space if required, and send me the output of the above query run in STARTST.

SOURCE DATABASE:STARQA_GDC
--------------------
TABLESPACE_NAME USEDMB FREEMB TOTALMB FREE
------------------------------ -------- -------- -------- ------
STAR01D 18,055 14,673 32,728 45 %
STAR01I 2,067 933 3,000 31 %
STAR02D 32,706 2,004 34,710 6 %
STAR02I 3,003 1,497 4,500 33 %

DESTINATION DATABASE:STARTST
-----------------------
TABLESPACE_NAME USEDMB FREEMB TOTALMB FREE
------------------------------ -------- -------- -------- ------
STAR01D 7,729 11,371 19,100 60 %
STAR01I 1,898 143 2,041 7 %
STAR02D 23,813 1,136 24,949 5 %
STAR02I 2,969 159 3,128 5 %
Compare the USEDMB column in the source database (STARQA_GDC) against the TOTALMB column in the target database (STARTST) carefully and verify the space needed for the refresh. If the space in a particular tablespace is insufficient, increase the size of that tablespace so that the import operation, and hence the database refresh, succeeds.

Step 2: Drop the users whose data needs to be refreshed.
Use the script below for this; it is located at /work/Rafijobs/Dropstarusers.sql.
DROP USER STARTXN CASCADE;
DROP USER STARREP CASCADE;
DROP USER STARREPAPP CASCADE;
DROP USER STARMIG CASCADE;
DROP USER STARTXNAPP CASCADE;
Run the script Dropstarusers.sql at the SQL prompt:
SQL>@Dropstarusers.sql

Step 3: Create the users whose data needs to be refreshed.
Use the script located at /work/Rafijobs/createstarusers.sql, or generate the script from the TOAD tool, and create the users with the required privileges and grants.
We need to create the five users STARTXN, STARREP, STARREPAPP, STARMIG & STARTXNAPP whose data needs to be refreshed.

Step 4: Copy the export dump files to the destination location using WinSCP.

Note: Step 5 is only possible when the expdp export is done with VERSION=10.2. Since the source database version is 11.1.0.7 and the target database version is 11.1.0.6, importing from a higher to a lower version is only possible when the export is done with the VERSION=10.2 parameter. (For expdp, refer to refresh method 1.)
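As a hedged sketch of what that source-side export could look like (the directory object EXP_DP_JUN, the credentials, and the file names are assumptions for illustration only; the VERSION=10.2 parameter is the point):

```shell
#!/bin/sh
# Print an expdp command per schema with VERSION=10.2 so the 11.1.0.7
# dump can be imported into 11.1.0.6. EXP_DP_JUN is a hypothetical
# directory object; review the output before running it.
build_exp_cmd() {
  echo "expdp system/database schemas=${1} directory=EXP_DP_JUN dumpfile=${1}_210610.dmp logfile=exp_${1}_210610.log version=10.2"
}
for schema in STARMIG STARREP STARREPAPP STARTXN STARTXNAPP; do
  build_exp_cmd "$schema"
done
```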

Step 5: Create the directory for the Data Pump import:
create directory IMP_DP2_JUN as 'D:\GDC_21_JUNE_BKP';
grant read,write on directory imp_dp2_jun to public;
(or, since the import is done as the SYSTEM user, the narrower grant:)
grant read,write on directory imp_dp2_jun to system;

The imports are then run as SYSTEM with VERSION=10.2:
impdp system/database directory=IMP_DP2_JUN dumpfile=STARMIG_210610.dmp logfile=STARMIG_210610.log version=10.2
impdp system/database directory=IMP_DP2_JUN dumpfile=STARREP_210610.dmp logfile=STARREP_210610.log version=10.2
impdp system/database directory=IMP_DP2_JUN dumpfile=STARREPAPP_210610.dmp logfile=STARREPAPP_210610.log version=10.2
impdp system/database directory=IMP_DP2_JUN dumpfile=STARTXN_210610.dmp logfile=STARTXN_210610.log version=10.2
impdp system/database directory=IMP_DP2_JUN dumpfile=STARTXNAPP_210610.dmp logfile=STARTXNAPP_210610.log version=10.2

Step 6: Verify the log files for each import.

Step 7: Verify the objects and tables that were imported.
Connect to SQL*Plus as each user:
1)startxn:
SQL>Select count(*) from user_objects;
SQL>Select count(*) from user_tables;
Compare this with the startxn schema in the STARQA_GDC database.
2)Starrep:
SQL>Select count(*) from user_objects;
SQL>Select count(*) from user_tables;
Compare this with the starrep schema in the STARQA_GDC database.
3)Startxnapp:
SQL>Select count(*) from user_objects;
SQL>Select count(*) from user_tables;
Compare this with the startxnapp schema in the STARQA_GDC database.
4)Starrepapp:
SQL>Select count(*) from user_objects;
SQL>Select count(*) from user_tables;

Compare this with the starrepapp schema in the STARQA_GDC database.
5)Starmig:
SQL>Select count(*) from user_objects;
SQL>Select count(*) from user_tables;
Compare this with the starmig schema in the STARQA_GDC database.

Step 8: Compare the schemas if there is a difference in the object counts.
Open TOAD sessions against the source and destination databases separately.
If there is a schema difference, use TOAD to compare the schemas and find the objects and tables that did not get imported: in TOAD, go to the DBA tab, click Compare Schemas, and select the objects to compare. This lists the objects that are missing in the particular destination schema.

Hope this helps.


Best regards,

Rafi.
