sbarai Beginner
Joined: 22 May 2006 Posts: 4 Topics: 2
Posted: Mon Sep 08, 2008 5:06 am    Post subject: Deleting huge amount of data
I am trying to delete a huge amount of data from a DB2 table. I am able to delete up to about 30k rows in a single DELETE statement. When I try to delete more than 30k rows, I get a resource-unavailable error, SQLCODE -904. I am committing after each successful execution of the DELETE.
Please let me know how to delete a huge amount of data using a DELETE query.
jsharon1248 Intermediate
Joined: 08 Aug 2007 Posts: 291 Topics: 2 Location: Chicago
Posted: Mon Sep 08, 2008 8:04 am
You need to determine a reasonable unit of work and issue COMMITs. What tool are you using to delete the rows?
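For example, if the rows to be purged can be split on a key range, each statement stays a manageable unit of work and can be committed on its own (table, column, and range values below are made-up placeholders):

Code:
  DELETE FROM MYSCHEMA.MYTAB                    -- placeholder table name
   WHERE PURGE_DT < '2007-01-01'                -- your purge condition
     AND ACCT_NO BETWEEN 0000001 AND 0999999;   -- one slice of the key range
  COMMIT;

  -- next unit of work: the following slice of the key range
  DELETE FROM MYSCHEMA.MYTAB
   WHERE PURGE_DT < '2007-01-01'
     AND ACCT_NO BETWEEN 1000000 AND 1999999;
  COMMIT;

Repeat with the next range until no qualifying rows remain; if a single slice still fails with -904, make the ranges smaller.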
dbzTHEdinosauer Supermod
Joined: 20 Oct 2006 Posts: 1411 Topics: 26 Location: germany
Posted: Mon Sep 08, 2008 10:04 am
Why don't you dump (unload) the rows you want to keep, drop and recreate the table, and then load?
If you are using SPUFI (foreground or background) you need to limit the number of rows deleted.
Unload, drop, recreate, load - if you are talking about millions of rows.
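A rough sketch of the utility statements involved (database, tablespace, table, and column names are placeholders; the usual UNLOAD/LOAD JCL goes around them, and dropping the table also means re-creating indexes and authorizations and rebinding dependent packages):

Code:
  UNLOAD TABLESPACE MYDB.MYTS
    FROM TABLE MYSCHEMA.MYTAB
    WHEN (PURGE_DT >= '2007-01-01')    -- keep only the rows you still need

  -- after DROP TABLE / CREATE TABLE (and its indexes):
  LOAD DATA INDDN SYSREC RESUME NO
    INTO TABLE MYSCHEMA.MYTAB

UNLOAD also writes a matching LOAD statement with the field specifications to the PUNCHDDN data set, which saves coding the field list by hand.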
_________________
Dick Brenholtz
American living in Varel, Germany
CZerfas Intermediate
Joined: 31 Jan 2003 Posts: 211 Topics: 8
Posted: Tue Sep 09, 2008 4:13 am
You can even omit the drop and recreate part if your table is the only table in the tablespace (with a huge number of rows, this should be the case anyway). In that case you can load the data you want to rescue with RESUME NO REPLACE.
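A sketch of what that reload might look like (placeholder names again; REPLACE resets the entire tablespace, which is why the table has to be the only one in it):

Code:
  -- reload only the rows that were unloaded with the WHEN condition
  LOAD DATA INDDN SYSREC RESUME NO REPLACE
    INTO TABLE MYSCHEMA.MYTAB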
regards
Christian