Streams Toolkit
This document describes a simple setup of Oracle Streams in multi-master mode to be used in a “two live data center scenario”. However, the setup can be used in a traditional one-way streams setup as well.
Two methods of conflict resolution are also addressed: a pre-defined update handler and a custom delete handler. Because conflict resolution is highly specific to the application's functionality, the two methods are included only as examples.
The document goes through a step-by-step setup of one table in multi-master replication mode; the same process can then be used to add more tables. The Streams setup is performed on a table-by-table basis for maximum flexibility. Schema-level and database-level Streams are not addressed, although both methods exist and can be useful depending on the requirements.
In addition, this setup produces a Streams environment with the capture process running on the source database, which may or may not be applicable depending on the functional requirements.
For the purpose of this document, one database will be called US(TEST) and the other UK(EVPNUSDA).
1) Global database name for US(TEST) : test.ims.att.com
Host name for US(TEST) : evpnusda.ims.att.com
Table name : SDRSUSER.MENU
2) Global database name for UK(EVPNUSDA) : dbevpn01.ims.att.com
Host name for UK(EVPNUSDA) : evpnusda.ims.att.com
Table name : SDRSUSER.MENU
1. Set up the STREAMS database user.
This user and its associated tablespace must exist. Several objects, such as the transaction queues, are created in this schema. It is also a good idea to allocate plenty of space so that a backlog of transactions can build up in case of a network or database issue.
The create_user.sql script is responsible for creating the user and assigning it the proper STREAMS database privileges. It assumes that a tablespace called streams_tbs has already been created.
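Creating that tablespace is outside the toolkit scripts. A minimal sketch is shown below; the datafile path and sizes are assumptions and should be adjusted to the expected transaction backlog.
-- Sketch only: datafile location and sizes are assumptions, adjust per environment
CREATE TABLESPACE streams_tbs
DATAFILE '/u01/oradata/streams_tbs01.dbf' SIZE 512M
AUTOEXTEND ON NEXT 128M MAXSIZE 4G;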
CREATE USER strmadmin
IDENTIFIED BY strmadmin
DEFAULT TABLESPACE streams_tbs
TEMPORARY TABLESPACE temp
QUOTA UNLIMITED ON streams_tbs;
GRANT CONNECT, RESOURCE, DBA TO strmadmin;
BEGIN
DBMS_STREAMS_AUTH.GRANT_ADMIN_PRIVILEGE(
grantee => 'STRMADMIN',
grant_privileges => true);
END;
/
GRANT SELECT_CATALOG_ROLE TO STRMADMIN;
GRANT SELECT ANY DICTIONARY TO STRMADMIN;
This needs to be done on both database nodes.
2. Create capture and apply streams queues.
A capture queue will contain log-mined data (Logical Change Records, or LCRs) that has been captured, and the apply queue will contain LCRs to be applied.
Oracle recommends that the same queue is not used for capture and apply. This toolkit names the queues STREAMS_CAPTURE and STREAMS_APPLY for the purpose of clarity.
The create_queue.sql script creates the queues using the DBMS_STREAMS_ADM package and needs to be run on both databases.
3. Create database links
In this setup of Streams, with the capture process running on the source database, database links are necessary for the LCR propagation process. If propagation were done by shipping redo logs instead, this step would not be needed.
It is important to give each database link the same name as the global name of the corresponding remote database. The documentation states that global_names must be set to TRUE, but according to a Metalink note this is a suggestion rather than a strict requirement; it essentially exists to ensure clarity in a complex environment with multiple database nodes.
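To check the current setting and, if desired, enforce it, something along these lines can be run in each database (SCOPE=BOTH assumes an spfile is in use):
-- Current global name of the database
SELECT * FROM global_name;
-- Current value of the global_names parameter
SHOW PARAMETER global_names
-- Enforce global naming for database links (assumes an spfile)
ALTER SYSTEM SET global_names = TRUE SCOPE = BOTH;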
The create_link.sql script creates the remote database link. It takes one argument, the remote database global name, which must be TNS resolvable.
This needs to be set up in both databases.
AT US(TEST)
CONNECT strmadmin/strmadmin
drop database link dbevpn01.ims.att.com;
CREATE PUBLIC DATABASE LINK dbevpn01.ims.att.com
CONNECT TO strmadmin
IDENTIFIED BY strmadmin
USING 'dbevpn01.ims.att.com';
AT UK(EVPNUSDA)
CREATE PUBLIC DATABASE LINK TEST.IMS.ATT.COM
CONNECT TO STRMADMIN
IDENTIFIED BY STRMADMIN
USING 'TEST.IMS.ATT.COM';
4. Set up the queues
Queues are needed to store changes and to propagate changes. Essentially, changes are mined from the Oracle log files and entered into queues.
The create_queue.sql script needs to be run in both databases to set up the queues.
EXEC DBMS_STREAMS_ADM.SET_UP_QUEUE(QUEUE_TABLE => 'STREAMS_CAPTURE', QUEUE_NAME=> 'STRMADMIN.STREAMS_CAPTURE');
EXEC DBMS_STREAMS_ADM.SET_UP_QUEUE(QUEUE_TABLE => 'STREAMS_APPLY', QUEUE_NAME => 'STRMADMIN.STREAMS_APPLY');
5. Supplemental logging
Although 10g adds some supplemental logging on its own, it is best to do this explicitly. Supplemental logging allows the destination apply process to apply the changes correctly.
The add_logging.sql script is used to perform this action.
alter table SDRSUSER.MENU add supplemental log data (primary key, unique) columns;
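To confirm that the supplemental log groups were added, the data dictionary can be queried; a quick check might look like this:
-- List supplemental log groups defined on the replicated table
SELECT log_group_name, log_group_type, always
FROM dba_log_groups
WHERE owner = 'SDRSUSER' AND table_name = 'MENU';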
6. Capture process and Capture Rules
The capture process is responsible for mining the log files and storing the changes in the queues created earlier. This script needs to be run once for each replicated table.
The add_tables.sql script is used to perform this action.
At US(TEST) & UK(EVPNUSDA)
connect strmadmin/strmadmin
BEGIN
DBMS_STREAMS_ADM.ADD_TABLE_RULES(
table_name => 'SDRSUSER.MENU',
streams_type => 'capture',
streams_name => 'capture_stream',
queue_name => 'strmadmin.streams_capture',
include_dml => true,
include_ddl => true,
inclusion_rule => true);
END;
/
7. Propagation Process
The propagation process is responsible for sending the captured changes to the destination database. As with the capture process, the propagation rules need to be created in both databases.
US(TEST) database: add_propagation_rule_us.sql
connect strmadmin/strmadmin
BEGIN
-- DBMS_STREAMS_ADM.REMOVE_RULE(
--   rule_name    => NULL,
--   streams_name => 'US_TO_UK',
--   streams_type => 'PROPAGATION');
DBMS_STREAMS_ADM.ADD_TABLE_PROPAGATION_RULES(
table_name => 'SDRSUSER.MENU',
streams_name => 'TEST_TO_USDA',
source_queue_name => 'strmadmin.streams_capture',
destination_queue_name => 'strmadmin.streams_apply@DBEVPN01.IMS.ATT.COM',
include_dml => true,
include_ddl => true,
source_database => 'TEST.IMS.ATT.COM',
inclusion_rule => true);
END;
/
UK(EVPNUSDA) database:
connect strmadmin/strmadmin
BEGIN
DBMS_STREAMS_ADM.ADD_TABLE_PROPAGATION_RULES(
table_name => 'SDRSUSER.MENU',
streams_name => 'USDA_TO_TEST',
source_queue_name => 'strmadmin.streams_capture',
destination_queue_name => 'strmadmin.streams_apply@TEST.IMS.ATT.COM',
include_dml => true,
include_ddl => true,
source_database => 'DBEVPN01.IMS.ATT.COM',
inclusion_rule => true);
END;
/
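Once both propagation rules are in place, the configuration can be sanity-checked from DBA_PROPAGATION on each database, for example:
-- Confirm the propagation exists and points at the remote apply queue
SELECT propagation_name, source_queue_name, destination_queue_name, destination_dblink
FROM dba_propagation;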
8. SCN
The instantiation SCN needs to be set. Streams uses it to determine which changes have already been replicated and which have not: when changes mined from the log files are reviewed, the instantiation SCN decides whether or not a change qualifies to be applied.
In the US(TEST) database:
connect strmadmin/strmadmin
DECLARE
source_scn NUMBER;
BEGIN
source_scn := DBMS_FLASHBACK.GET_SYSTEM_CHANGE_NUMBER();
DBMS_APPLY_ADM.SET_TABLE_INSTANTIATION_SCN@DBEVPN01.IMS.ATT.COM (
source_object_name => 'sdrsuser.menu',
source_database_name => 'TEST.IMS.ATT.COM',
instantiation_scn => source_scn);
END;
/
In the UK(EVPNUSDA) database:
connect strmadmin/strmadmin
DECLARE
source_scn NUMBER;
BEGIN
source_scn := DBMS_FLASHBACK.GET_SYSTEM_CHANGE_NUMBER();
DBMS_APPLY_ADM.SET_TABLE_INSTANTIATION_SCN@TEST.IMS.ATT.COM (
source_object_name => 'SDRSUSER.MENU',
source_database_name => 'DBEVPN01.IMS.ATT.COM',
instantiation_scn => source_scn);
END;
/
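The recorded instantiation SCNs can then be verified at each destination; a quick look at DBA_APPLY_INSTANTIATED_OBJECTS is enough:
-- Instantiation SCNs recorded for replicated objects at this destination
SELECT source_database, source_object_owner, source_object_name, instantiation_scn
FROM dba_apply_instantiated_objects;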
9. Configure apply process
The create_apply.sql script creates the apply process. The apply process is an Oracle background process that is responsible for applying the changes. This script needs to be run in both databases and takes the source database as input; for example, at US(TEST) the source database is DBEVPN01.IMS.ATT.COM.
Create_apply at TEST
connect strmadmin/strmadmin
BEGIN
DBMS_STREAMS_ADM.ADD_TABLE_RULES(
table_name => 'SDRSUSER.MENU',
streams_type => 'apply',
streams_name => 'apply_stream',
queue_name => 'strmadmin.streams_apply',
include_dml => true,
include_ddl => true,
source_database => 'DBEVPN01.IMS.ATT.COM',
inclusion_rule => true);
END;
/
Create_apply at EVPNUSDA
connect strmadmin/strmadmin
BEGIN
DBMS_STREAMS_ADM.ADD_TABLE_RULES(
table_name => 'SDRSUSER.MENU',
streams_type => 'apply',
streams_name => 'apply_stream',
queue_name => 'strmadmin.streams_apply',
include_dml => true,
include_ddl => true,
source_database => 'TEST.IMS.ATT.COM',
inclusion_rule => true);
END;
/
10. Start the capture process
To start the capture process:
start_capture.sql
connect strmadmin/strmadmin
BEGIN
DBMS_CAPTURE_ADM.START_CAPTURE(
capture_name =>'capture_stream');
END;
/
Run in both databases.
11. Start the apply process
To start the apply process:
start_apply.sql
connect strmadmin/strmadmin
BEGIN
DBMS_APPLY_ADM.SET_PARAMETER(
apply_name => 'apply_stream',
parameter => 'disable_on_error',
value => 'n');
END;
/
BEGIN
DBMS_APPLY_ADM.START_APPLY(
apply_name => 'apply_stream');
END;
/
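Because disable_on_error is set to 'n', the apply process keeps running when an LCR fails and the failed transaction is placed in the error queue instead. A few queries that can be run on either database to verify that capture and apply are up and to check for apply errors:
-- Status of the capture and apply processes
SELECT capture_name, status FROM dba_capture;
SELECT apply_name, status FROM dba_apply;
-- Transactions that failed to apply are parked in the error queue
SELECT apply_name, source_database, error_message FROM dba_apply_error;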
remove.sql – can be used to remove the Streams configuration in a database (a rough sketch is shown below).
conflict.sql – example of conflict resolution (an illustrative update conflict handler is also sketched below).
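remove.sql is not reproduced here. As a rough sketch, the entire Streams configuration on a database can be dropped with DBMS_STREAMS_ADM.REMOVE_STREAMS_CONFIGURATION; note that this removes all capture, propagation and apply components on that database, so use it with care.
connect strmadmin/strmadmin
BEGIN
-- Drops the Streams configuration (capture, propagation and apply processes and their rules) on this database
DBMS_STREAMS_ADM.REMOVE_STREAMS_CONFIGURATION();
END;
/
conflict.sql is also not reproduced here. As an illustration of the pre-defined update handler mentioned in the introduction, Oracle's built-in MAXIMUM method can be registered with DBMS_APPLY_ADM.SET_UPDATE_CONFLICT_HANDLER. The resolution column LAST_UPDATED below is an assumed placeholder and must be replaced with a real column of SDRSUSER.MENU; a custom delete handler would typically be registered separately with DBMS_APPLY_ADM.SET_DML_HANDLER pointing at a user-written PL/SQL procedure. This would be run on each database where an apply process runs.
connect strmadmin/strmadmin
DECLARE
cols DBMS_UTILITY.NAME_ARRAY;
BEGIN
-- Columns the handler compares and overwrites (placeholder column for illustration)
cols(1) := 'LAST_UPDATED';
-- On an update conflict, keep the row version with the higher LAST_UPDATED value
DBMS_APPLY_ADM.SET_UPDATE_CONFLICT_HANDLER(
object_name => 'SDRSUSER.MENU',
method_name => 'MAXIMUM',
resolution_column => 'LAST_UPDATED',
column_list => cols);
END;
/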