04-05-2005 01:02 AM
We have a process that loads huge amounts of data into Oracle. I'm talking about millions of rows every day. The problem is that the tables are now huge, so we had to build several indexes so we can fetch the data, but now every insert takes longer because the indexes have to be maintained on each insert, and some of the data is dropped because the insert speed into the database is much lower than the speed at which we receive the data (from outside).
We're totally lost. We thought about breaking the tables into smaller tables: since we only want the data for 30 days, we could build 30 tables, one for each day, so each day's inserts would go into a different table. Another thought is, instead of inserting the data into Oracle directly, to write it to files and then have another process load it into the database (I'm not sure that would improve anything).
Any ideas will be much appreciated.
04-05-2005 01:10 AM
Re: Large amount of data loading into ORACLE database
We did this years ago in another life and our load process went from 22 hours to less than 5 minutes.
We made the DB unavailable to other users and applications, but it was well worth it!
live free or die
harry d brown jr
04-05-2005 01:13 AM
Re: Large amount of data loading into ORACLE database
That's not good enough, because the data is loaded 24/7 and we must be able to fetch the data at the same time...
04-05-2005 01:19 AM
Re: Large amount of data loading into ORACLE database
04-05-2005 01:25 AM
Re: Large amount of data loading into ORACLE database
Turning off archive logging requires the database to be bounced.
At the end of the upload, you can bounce the database again and turn on archive logging.
There is an obvious recovery risk in this scenario.
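For reference, a minimal sketch of that bounce sequence in SQL*Plus (the commands are standard Oracle; assuming a DBA account, and that the recovery risk noted above is acceptable):

```sql
-- Disable archive logging before the bulk load (requires a bounce):
SHUTDOWN IMMEDIATE;
STARTUP MOUNT;
ALTER DATABASE NOARCHIVELOG;
ALTER DATABASE OPEN;

-- ... run the load ...

-- Re-enable archive logging afterwards (second bounce):
SHUTDOWN IMMEDIATE;
STARTUP MOUNT;
ALTER DATABASE ARCHIVELOG;
ALTER DATABASE OPEN;
-- Take a full backup here: redo generated while in NOARCHIVELOG
-- cannot be used for media recovery.
```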
SEP
Owner of ISN Corporation
http://isnamerica.com
http://hpuxconsulting.com
Sponsor: http://hpux.ws
Twitter: http://twitter.com/hpuxlinux
Founder http://newdatacloud.com
04-05-2005 01:27 AM
Re: Large amount of data loading into ORACLE database
04-05-2005 02:09 AM
Re: Large amount of data loading into ORACLE database
You could use partitioned indexes (on date columns), but that's only going to help queries, not loads, and only if the queries you run actually filter by date.
Your idea of keeping 30 days of live data in daily tables is probably the most solid I've heard (although also the one with the most work).
You may want to keep the current day's table name static and roll the previous days' data into another table partitioned by date. That way you've only got two tables to manage, current and old, instead of one for each day. Have another job run every day that removes data older than 30 days from the "old" table. You would probably want to put these two tables in a separate tablespace with uniform extent allocation to limit fragmentation.
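A hypothetical sketch of that current/old scheme (all table, column, and partition names are invented for illustration; tablespace and storage clauses omitted for brevity):

```sql
-- "Current" table: receives all of today's inserts under a static name.
CREATE TABLE events_current (
  event_time  DATE,
  payload     VARCHAR2(200)
);

-- "Old" table: range-partitioned by day, holds the trailing 30 days.
CREATE TABLE events_old (
  event_time  DATE,
  payload     VARCHAR2(200)
)
PARTITION BY RANGE (event_time) (
  PARTITION p20050404 VALUES LESS THAN (TO_DATE('2005-04-05', 'YYYY-MM-DD')),
  PARTITION p20050405 VALUES LESS THAN (TO_DATE('2005-04-06', 'YYYY-MM-DD'))
);

-- Nightly job: roll the finished day into the old table, then drop the
-- partition that has aged past 30 days (cheap compared to DELETE).
INSERT /*+ APPEND */ INTO events_old SELECT * FROM events_current;
TRUNCATE TABLE events_current;
ALTER TABLE events_old DROP PARTITION p20050404;
```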
Another one that I've done is drop indexes before loads and re-create using parallel threads afterwards. But, since you're loading all day - that's not going to fly either.
Have you run a statspack report? Are you spending a lot of time waiting on redo log buffers? I've seen cases where increasing that buffer until the waits taper off makes a big difference. Also, are the redo logs on RAID 0/1 drives? How about the data drives you're writing to? They really should be. Are there lots of sorts to disk? You may be able to increase sort_area_size to improve throughput (if it's far too small now, it will make a difference). Also, have you tried greatly increasing the buffer cache? If only a small percentage of the insert I/O has to go all the way down to the drives, throughput can increase greatly. How big is your SGA?
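A quick way to look for the redo-related waits mentioned above, without a full statspack run, is to query the cumulative wait statistics (a sketch; these are standard Oracle wait-event and view names):

```sql
-- Cumulative waits since instance startup. High 'log buffer space' waits
-- suggest the log buffer is too small for the insert rate; high
-- 'log file sync' / 'log file parallel write' point at slow redo log disks.
SELECT event, total_waits, time_waited
  FROM v$system_event
 WHERE event IN ('log buffer space',
                 'log file sync',
                 'log file parallel write')
 ORDER BY time_waited DESC;
```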
04-05-2005 03:48 AM
Re: Large amount of data loading into ORACLE database
04-05-2005 05:05 AM
Re: Large amount of data loading into ORACLE database
Not too sure about your application details, so I may be making some assumptions.
If you want to insert a huge number of rows into an existing table, sqlldr in direct mode would be the fastest way of doing it.
The constraint is that the indexes will be rebuilt at the end.
If you don't care about logging the changes, you can use the UNRECOVERABLE option to speed up the operation.
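A sketch of such a direct-path load (file, table, and column names are hypothetical):

```sql
-- load.ctl: SQL*Loader control file for a direct-path, unlogged append.
-- Invoked as:  sqlldr userid=scott/tiger control=load.ctl direct=true
UNRECOVERABLE
LOAD DATA
INFILE 'events.dat'
APPEND
INTO TABLE events_current
FIELDS TERMINATED BY ','
(event_time DATE 'YYYY-MM-DD HH24:MI:SS',
 payload)
```

UNRECOVERABLE only applies to direct-path loads; as with NOARCHIVELOG, the loaded data is not protected by redo until the next backup.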
If you break the table into smaller tables, would you use the partitioning option?
If the data is not updated (other than by this load), you may use the compression option (from Oracle9i onward). You can also use compressed indexes.
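For example (9i syntax; object names are hypothetical, and table compression only takes effect for direct-path / bulk operations):

```sql
-- Table compression, applied while copying existing data:
CREATE TABLE events_old_c COMPRESS AS SELECT * FROM events_old;

-- Key-compressed index (available since 8i):
CREATE INDEX ix_events_time ON events_old (event_time) COMPRESS;
```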
Regards
Jean-Luc
04-05-2005 07:41 AM
Re: Large amount of data loading into ORACLE database
I have a table with 150 million rows, without partitioning, and we load a lot of rows every day. This table has no performance issues.
So, to help with your process, there are a couple of things we need to know:
1. What version of Oracle are you using?
2. How many rows do you have in the table, and how many rows do you insert every day? Are these rows inserted throughout the day or all at once?
3. How is your database configured: rule-based or cost-based (CHOOSE) optimizer?
4. How is the tablespace that contains the table configured: locally or dictionary managed?
5. Is there any row migration? What about index fragmentation?