Discussion:
Oracle multi Tb refresh for UAT from Prod
"Sanjay Mishra" (Redacted sender "smishra_97" for DMARC)
2018-11-16 16:21:33 UTC
Can someone share the process used in your experience/organization where you have several multi-TB databases and need to refresh UAT frequently for performance testing? I am not looking so much at masking the data, which is sometimes required, but at dropping the prod users and adding the test users and application schemas back. I would appreciate it if someone could share any script used to sync users/passwords, which can be the main challenge, as other steps such as registration with OEM and the RMAN catalog can be handled easily.
Tx
Sanjay
Upendra nerilla
2018-11-16 16:57:51 UTC
Are you looking at any 3rd-party products? If so, you may want to look into a product called Delphix.
It appears to be a candidate for what you are looking for. It can do data masking as well as refresh environments quickly.
Tim Gorman is on the list and works for Delphix; reach out to him for any technical details.


"Sanjay Mishra" (Redacted sender "smishra_97" for DMARC)
2018-11-16 17:50:15 UTC
Upendra
I cannot have UAT on NFS with Delphix, as UAT has to be on the same platform architecture as Prod, with the same configuration. We are using Delphix for the Dev environment, and these prod databases don't have any data-masking requirement, which would have made the process more complex. Currently we are doing expdp for a set or subset of tables, but now I am looking at a full DB copy for some critical new interfaces, where multiple UAT environments have to be tested with prod-equivalent data for all application schemas.
Tx
Sanjay

"Sanjay Mishra" (Redacted sender "smishra_97" for DMARC)
2018-11-16 17:56:04 UTC
Tim
Thanks for sharing the scripts and providing all the detailed information. This is a very good starting point, and I can add any extra steps like the ones you mentioned for dropping prod users, or any masking.
Thanks again for your time.
Sanjay
Chris Taylor
2018-11-16 22:08:29 UTC
Sanjay,

What is your storage device under your primary db? (or even your standby
db?)

We snap-clone our Prod (or Standby) db on Pure Storage, but the same
technology exists for EMC etc.

Snap clones are seriously fast and appear to be full-size clones to the
user/DBA, but take only a fraction of the space on the storage device.

Chris
"Sanjay Mishra" (Redacted sender "smishra_97" for DMARC)
2018-11-16 23:07:57 UTC
Chris
I was also checking into the same thing but have never used it. In the environment where I need help, both Prod and UAT are on Exadata (V6, different OVMs), so I was looking online and found some details; I would really appreciate it if you could share your experience and reference docs. I can run a test with an ASM diskgroup, as I have plenty of space.
I was not able to get more details on the following:
1. What is the impact on the Prod environment while collecting the changed blocks? We are all on the multitenant architecture, so PDBs have to be refreshed using clones, and we are still on 12.1. So I am not sure whether the PDB needs to be made read-only, as is done for a PDB clone in 12.1.
2. What are the storage requirements, e.g. for the test master or the clones?
3. Can we use the standby for any of this setup? We also have a dedicated Exadata DR. I read some details while exploring the best solution, as multiple such refreshes will be needed on a regular basis in the coming months.
So please share your experience with the Exadata ASM environment and any reference docs.
Tx
Sanjay



Mladen Gogala
2018-11-17 07:35:27 UTC
Hi Chris!

You are aware that by using snapshots you will burden your production
disks? A snapshot is an array of pointers. The algorithm for snapshots is
optimized CoW ("Copy on Write" - not the main ingredient in a burger).
When the source database changes, the blocks are copied to a spare
location so that the SAN can keep the snapshot consistent. If the data
doesn't change, you are reading your original data through a pointer.
Most of the good SAN equipment also has a "clone" capability (SnapVault,
SRDF, HUR). In particular, Pure also has the ability to create a full
volume from a snapshot.

Regards
--
Mladen Gogala
Database Consultant
Tel: (347) 321-1217
Chris Taylor
2018-11-17 08:39:24 UTC
Mladen,

We use point-in-time snapshots of a protection group. A protection group
is the set of Prod volumes presented to the production server.

We refresh 6 databases from Prod every 2 weeks, where Prod is currently
~80 TB. We do not keep the snapshots current (in sync, as you say) with
the production volumes. When we want a refresh, we basically snapshot the
data at that point in time by snapping the protection group, blowing away
the existing dev/qa/staging volumes and re-presenting them as fresh copies.

The only overhead on the storage is the size of the snapshots (the deltas
generated in dev/qa/staging).

Our change data rate on Prod approaches 0.5 TB / hour during nightly
processes and we generate (on average) 5 TB (with peaks of 6 TB) of
archivelogs per 24 hours.

Our Pure M70 array maintains latencies at 0.4-0.5 ms and we see very little
impact on performance (I'm not on my work computer, so I can't give you
exact numbers for utilization and IOPS).

Thanks,
Chris
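For anyone trying to picture the sequence Chris describes, here is a dry-run sketch. The command names follow Pure's FlashArray CLI (`purepgroup snap`, `purevol copy`) as I understand it, but the group/volume/host names are made up and nothing is executed - the script only prints what it would run, so check your array's documentation before adapting it.

```shell
#!/bin/sh
# Dry-run sketch of a protection-group snapshot refresh (Pure-style CLI).
# Nothing is executed; each step is echoed so the sequence is visible.

PGROUP=prod-ora-pg                      # hypothetical protection group
SUFFIX=uat-refresh                      # snapshot suffix for this refresh

run() { echo "WOULD RUN: $*"; }         # replace the body with "$@" to execute

# 1. Take a point-in-time snapshot of the whole protection group,
#    so all Prod volumes are captured at a consistent instant.
run purepgroup snap --suffix "$SUFFIX" "$PGROUP"

# 2. Overwrite each UAT volume from the matching per-volume snapshot,
#    "blowing away" the old copies and re-presenting fresh ones.
for vol in datavol1 datavol2 redovol1; do
    run purevol copy --overwrite "$PGROUP.$SUFFIX.$vol" "uat-$vol"
done

# 3. Rescan and mount on the UAT host, then start the clone instance.
run ssh uat-host /usr/local/bin/rescan_and_mount.sh
```

The only space consumed on the array afterwards is the delta between the snapshot and the live Prod volumes, which is what makes the bi-weekly cadence affordable.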
n***@gmail.com
2018-11-17 10:36:55 UTC
CoW is one of *two* alternative options for creating snapshots; Pure, IBM
and no doubt other vendors use Redirect on Write, which doesn't impose the
same write penalty. See for example
https://storageswiss.com/2016/04/01/snapshot-101-copy-on-write-vs-redirect-on-write/
--
Niall Litchfield
Oracle DBA
http://www.orawin.info
Mladen Gogala
2018-11-18 13:15:18 UTC
Hi Niall,

I know that. CoW triples your I/O rate on writes, while redirect-on-write
remaps the original LUN to the new blocks. That is what I was referring
to as "optimized CoW". Everybody knows what CoW is, but not everybody is
familiar with redirect-on-write. A little inaccuracy for the sake of
clarity - same as the Niels Bohr model of the atom.

Regards
Tim Gorman
2018-11-16 17:14:17 UTC
Sanjay,

Here are two shell-scripts that I use when refreshing Delphix "virtual
databases" for Oracle based on two common requirements...

1. Preserve non-prod database and application account passwords across
refresh
2. Preserve non-prod database link definitions across refresh


Some background:  when you initially provision a database clone, you
must invariably change account passwords (so that production passwords
aren't exposed in non-production) and change the definition of database
links (so that production databases aren't corrupted by non-prod
activities).  For a variety of reasons, this might be a manual process,
although many folks have it automated.

Regardless, when the database clone is refreshed later, it might be
necessary to repeat these (and other) changes, and things can get messy.
For example, after the initial cloning, DBAs might set account passwords
to non-prod defaults, but developers and/or testers might then change
those non-prod default values for many reasons. So what is really needed
is not to re-execute the same procedures performed after the initial
cloning, but simply to preserve what existed prior to the refresh
operation and automatically re-apply those settings after the refresh is
complete.

So, the attached shell-script "ora_vdb_prerefresh.sh" is intended to be
called from a Delphix "pre-refresh" hook. "Hooks" are programmatic
callouts, similar to database triggers or "user exits". This script
saves off existing database account passwords by generating a SQL*Plus
script, and then it saves off database link definitions using a DataPump
export.

Then, the attached shell-script "ora_vdb_postrefresh.sh" is intended to
be called from a Delphix "post-refresh" hook.  This script checks to see
if a SQL*Plus script was generated and, if so, executes it to re-apply
all account passwords.  Then, if a DataPump export file exists, the
script calls DataPump import to re-apply the database link definitions.

You mentioned "dropping prod user and adding Test user and application
schema back", so that may or may not be covered by the existing logic.
Certainly, adding an entire schema back can be performed with another
set of calls to DataPump.

Please note that the attached scripts have ".txt" file-extensions to
avoid freaking out email filters, and of course these extensions are
intended to be removed on saving.

Disclaimers: Please realize that these scripts are merely starting
points or templates, not complete solutions. They work fine in my lab
environment and at a couple of my customers, but they won't necessarily
do exactly what you want, or work for you at all. If you use them, you'll
need to take ownership of them and adapt them to your environment; there
is no warranty - use at your own risk.

Hope this helps...

-Tim
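The attachments themselves are not preserved in this archive, but the core trick Tim describes - capturing each account's password hash before the refresh and replaying it afterwards with `ALTER USER ... IDENTIFIED BY VALUES` - can be sketched in a few lines of shell. Everything below is illustrative (the function name is hypothetical and the hashes are fabricated); in a real pre-refresh hook the user:hash pairs would come from a SYSDBA query, e.g. against SYS.USER$.

```shell
#!/bin/sh
# Sketch of the "preserve passwords" idea from a pre-refresh hook.
# In the real hook, the user:hash pairs would come from a SYSDBA query;
# here we feed sample (fake) pairs in on stdin and emit the SQL*Plus
# script that the post-refresh hook would run to re-apply them.

generate_password_script() {
    # read "username:hash" lines on stdin, write ALTER USER statements;
    # any ':' inside the hash itself is preserved in the last field
    while IFS=: read -r user hash; do
        [ -n "$user" ] || continue
        printf "alter user %s identified by values '%s';\n" "$user" "$hash"
    done
}

# Demo with fabricated hashes (NOT real password hashes):
printf '%s\n' 'APPUSER:S:ABCD1234' 'TESTER1:S:FFEE9988' | generate_password_script
# prints:
#   alter user APPUSER identified by values 'S:ABCD1234';
#   alter user TESTER1 identified by values 'S:FFEE9988';
```

The post-refresh hook then only has to check whether the generated file exists and run it through SQL*Plus, exactly as Tim outlines.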


Mark W. Farnham
2018-11-16 19:34:24 UTC
This is extremely valuable, both the text explanation and the scripts. Getting this right saves huge effort and errors.



I skimmed it and didn't see the non-DB equivalent of link preservation for printer names. If you have printer names embedded in the database, just as you have different link definitions, beware accidentally shipping duplicate "build to order" print jobs off to the production manufacturing lines. That's only hilarious in concept.



mwf



Mladen Gogala
2018-11-17 07:21:03 UTC
Hi Sanjay,

As usual, there are only two methods: RMAN and SAN copy. I am a
consultant and I don't have a database that I can call my own. I gave up
the comfortable life of a DBA - always on call, reviewing projects only
to have his advice overruled by a development manager more worried about
deadlines than performance - and sold my SQL*Soul to consulting more
than 6 years ago. I have been enjoying the power of the dark side ever
since.

When you use the term "multi TB", how much is "multi"? 10 Gb Ethernet
can, under ideal conditions, copy/restore using RMAN at a speed of about
3.5 TB per hour. With some decent equipment which supports SRDF or HUR
you can be even faster. For the database nightmares with sizes of 100+
TB, you will need some special equipment for copying over Ethernet. This
is an example:

This is a Mellanox 100 Gb Ethernet adapter. You will also need an
accompanying router and cables, as well as a network engineer who knows
how to deal with all that. And there aren't many of those.

https://www.cisco.com/c/en/us/solutions/service-provider/100-gigabit-solution/index.html

I have seen this only once, and it can achieve up to 32 TB per hour.
Quite different from the 9600 baud modem attached to a VAX 4200 with
which I started exploring networking, using the DECNET protocol.
Basically, a VLDB cannot function on a small machine; you really, really
need specialized hardware, either SAN or network, to do that. Something
tells me that 100 Gb Ethernet will become the standard before I retire.
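As a sanity check on those figures (simple arithmetic only; decimal units, protocol overhead ignored), the wire-speed ceiling of a link in TB/hour is gigabits-per-second / 8 * 3600 / 1000:

```shell
#!/bin/sh
# Theoretical ceiling of a network link in (decimal) TB per hour.
tb_per_hour() {
    awk -v gbps="$1" 'BEGIN { printf "%.1f\n", gbps / 8 * 3600 / 1000 }'
}

tb_per_hour 10     # prints 4.5  -> so 3.5 TB/h over 10 GbE is ~78% of wire speed
tb_per_hour 100    # prints 45.0 -> the 32 TB/h seen in practice is well under this
```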

Regards


--
http://www.freelists.org/webpage/oracle-l
"Sanjay Mishra" (Redacted sender "smishra_97" for DMARC)
2018-11-19 17:11:37 UTC
Mladen
Thanks for your update. The Prod databases where I am looking for a solution range from 10-55 TB. Prod and UAT are on their own Exadata clusters, so the network is not an issue, as we use the fast connection to the tape library.
The requirement is to minimize the refresh time as much as possible; the RMAN restore was taking a minimum of 4-5 hr, which leaves some UAT environments unavailable. I am thinking of snap clones as another alternative to test: once a test master is created, a snap clone can be created from it in no more than a few minutes. It will definitely need much more disk space, e.g. keeping one test master for the existing snap clones and another copy for the new test master, since removing the existing test master would remove its snap clones and hence the UAT.

So I am checking two options:
1. RMAN restore, calculating the time for the biggest environment
2. The snap-clone process, calculating space and timing
Then we might use a mixed approach, RMAN or snap clone, based on each application's SLA requirement.
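For the RMAN option, a back-of-envelope timing at Mladen's quoted ~3.5 TB/hour rate (an assumption; actual Exadata/tape throughput will differ) shows why the larger databases are the ones that push toward snap clones:

```shell
#!/bin/sh
# Rough RMAN restore-time estimate: hours = size_tb / rate_tb_per_hour.
restore_hours() {
    awk -v tb="$1" -v rate="$2" 'BEGIN { printf "%.1f\n", tb / rate }'
}

restore_hours 10 3.5    # smallest DB: prints 2.9  (hours)
restore_hours 55 3.5    # largest DB:  prints 15.7 (hours)
```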
Tx
Sanjay
