Tuesday, 15 January 2019

Why is so much data being written into DBQL?

 

Thousands of queries are submitted per minute, and logging every one of them produces a large volume of data. To avoid the performance impact of this logging on the database, we need to be proactive and selective about the data we want to log. That way we save space, spool, and processing resources.

Note: DBQL logging can be tuned, but it should not be switched off entirely. It is a pillar for administrators and a lifesaver during disasters 😀

If space and performance are a concern, the tips below can help:
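As a starting point, selective logging can be set up with query logging rules. The sketch below is illustrative only; the user name is a placeholder and the exact options available depend on your Teradata release:

```sql
-- Sketch: log individual rows only for queries using more than
-- 1 CPU second (THRESHOLD is in hundredths of a second when CPUTIME
-- is specified); cheaper queries are rolled up into summary rows.
BEGIN QUERY LOGGING LIMIT THRESHOLD=100 CPUTIME ON ALL;

-- For a specific heavy batch user (etl_user is a placeholder),
-- capture full SQL text and object usage for deeper analysis.
BEGIN QUERY LOGGING WITH SQL, OBJECTS ON etl_user;
```

A user-specific rule takes precedence over the ALL rule, so detailed logging can be limited to the accounts where it actually pays off.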

How to speed up Teradata Fastexport

About Teradata Fastexport
Teradata Fastexport is a powerful tool for exporting large volumes of data to a file. Its default execution flow is the following:
1.   Applies locks on the affected tables
2.   Executes the SELECT statement
3.   Places the result in a SPOOL
4.   Releases the locks
5.   Exports the data from the SPOOL
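The flow above is driven by a Fastexport script. The following is a minimal sketch; the logon string, database, table, and file names are all placeholders:

```sql
.LOGTABLE mydb.fexp_restartlog;          /* restart log table (placeholder) */
.LOGON tdpid/myuser,mypassword;          /* connection details (placeholder) */
.BEGIN EXPORT SESSIONS 8;                /* number of parallel sessions */
.EXPORT OUTFILE export_data.txt
        MODE RECORD FORMAT TEXT;         /* target file and format */
SELECT * FROM mydb.mytable;              /* this SELECT is spooled first */
.END EXPORT;                             /* export runs from the spool */
.LOGOFF;
```

Note that the data only starts flowing to the file after the SELECT has been fully spooled and the locks released, which is why large exports can appear idle for a long time before the file grows.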