Community Home > Servers and Operating Systems > Operating Systems > Operating System - HP-UX > Re: How to back up the files larger than 2G.
06-01-2003 06:45 PM
How to back up the files larger than 2G.
I have a cron job that backs up my database files completely every day. But as the database has grown, the file size now exceeds 2GB, so I can no longer back it up with the tar command. The following is the database backup script:
#Script for backup database
echo "********backup start ****************"
date
/arrUsr/qad/qad73/stop.act72c01
cd /arrUsr/pddb
tar cv ./act72c01.*
cd /arrayDs/pddb
tar rv ./act72c01.*
tar rv ./mfghelp.*
date
/arrUsr/qad/qad73/start.act72c01
/arrUsr/qad/qad73/stop.Nsds
cd /arrayDs/pddb
tar rv ./nsds.*
/arrUsr/qad/qad73/start.Nsds
cd /arrUsr/qad
tar rv ./qad73
cd /arrayDs
tar rv ./users
date
echo "*****backup end ***************************************"
And the filesystem holding the database files is already a largefiles filesystem. My question is: how do I change my backup script so that it can back up files larger than 2GB? Who can help me?
06-01-2003 07:23 PM
Re: How to back up the files larger than 2G.
If you are dealing with HP servers, I would suggest you use 'fbackup/frecover', which supports files >2GB.
You may replace tar with something like:
fbackup -f /dev/rmt/0m -i /data -e /data/log
06-01-2003 07:26 PM
Re: How to back up the files larger than 2G.
Download the GNU version of tar, which supports files > 2GB:
http://hpux.cs.utah.edu/hppd/hpux/Gnu/tar-1.13.25/
If you are using HP-UX 11.11, get this patch:
PHCO_26423
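With GNU tar installed, the original script only needs its tar invocations pointed at the GNU binary. A minimal sketch, with assumed paths and file names for illustration (on HP-UX the GNU binary often lands in /usr/local/bin):

```shell
# Sketch: point the script's tar calls at GNU tar for large-file support.
# The TAR path and demo file names are assumptions, not from the post.
TAR=${TAR:-tar}            # e.g. /usr/local/bin/tar on HP-UX

mkdir -p /tmp/gtar_demo/pddb
echo "demo data" > /tmp/gtar_demo/pddb/act72c01.d1

cd /tmp/gtar_demo/pddb
$TAR cvf /tmp/gtar_demo/backup.tar ./act72c01.*   # create archive
$TAR tvf /tmp/gtar_demo/backup.tar                # list contents to verify
```

In the original script, that means replacing each bare `tar cv` / `tar rv` with `$TAR cvf <archive>` / `$TAR rvf <archive>` after setting TAR to the GNU binary's path.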
06-01-2003 07:34 PM
Re: How to back up the files larger than 2G.
fbackup also lets you specify a graph file, which can list the specific directories or files to back up.
# man fbackup
for more information.
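A graph file is a plain text file with one entry per line: `i` includes a path, `e` excludes one. A hypothetical graph covering the directories in the original script might look like:

```
i /arrUsr/pddb
i /arrayDs/pddb
i /arrUsr/qad/qad73
e /arrUsr/pddb/tmp
```

Saved as, say, /etc/backup.graph (the file name and the excluded tmp directory are assumptions), it would be passed with `-g`: fbackup -f /dev/rmt/0m -g /etc/backup.graph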
06-01-2003 08:31 PM
Re: How to back up the files larger than 2G.
I would prefer to make the changes at the Oracle database level and split the backup files themselves.
Anyway, two possible solutions are:
Solution 1:
===========
Export to a pipe and have compress and split read the pipe. The result is a couple of 500 MB compressed files that constitute the export.
We use the csh script below for a full export, then test the integrity of the export by doing a full import with show=y.
This gives a file with all of my source code and DDL.
#!/bin/csh -vx
# Export the full database to a named pipe; gzip and split read from
# the pipe so no single file on disk ever exceeds $MAXSIZE.
setenv UID /
setenv FN exp.`date +%j_%Y`.dmp
setenv PIPE /tmp/exp_tmp_ora817.dmp
setenv MAXSIZE 500m
setenv EXPORT_WHAT "full=y COMPRESS=n"
echo $FN
cd /nfs/ORA-BACKUP/export_mydb_ora817
ls -l
# Clean up leftovers from the previous run.
rm expbkup.log export.test exp.*.dmp* $PIPE
mknod $PIPE p
date > expbkup.log
# Reader side: compress the pipe and split into $MAXSIZE pieces.
( gzip < $PIPE ) | split -b $MAXSIZE - $FN. &
#split -b $MAXSIZE $PIPE $FN. &
# Writer side: export into the pipe.
exp userid=$UID buffer=20000000 file=$PIPE $EXPORT_WHAT >>& expbkup.log
date >> expbkup.log
date > export.test
# Verify: reassemble the pieces in order, decompress back into the
# pipe, and run a show-only import against it.
cat `echo $FN.* | sort` | gunzip > $PIPE &
#cat `echo $FN.* | sort` > $PIPE &
imp userid=sys/change_on_install file=$PIPE show=y full=y >>& export.test
date >> export.test
tail expbkup.log
tail export.test
ls -l
rm -f $PIPE
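The pipe + gzip + split pattern in the script above works with any large data stream, not just Oracle exports. A minimal, portable sketch of the same round trip (directory, chunk size, and file names are illustrative, with `seq` standing in for the export writer):

```shell
#!/bin/sh
# Portable sketch of the pipe + gzip + split pattern above.
set -e
DIR=$(mktemp -d)
PIPE=$DIR/exp.pipe
mkfifo $PIPE

# Reader side: compress whatever comes down the pipe and split it
# into fixed-size pieces (1 KB here; 500m in the original script).
( gzip < $PIPE | split -b 1k - $DIR/dump.gz. ) &

# Writer side: stands in for 'exp ... file=$PIPE'.
seq 1 1000 > $PIPE
wait

# Restore: reassemble the pieces in order and decompress.
cat $DIR/dump.gz.* | gunzip > $DIR/restored
seq 1 1000 | cmp - $DIR/restored && echo "round trip OK"
```

The split suffixes (.aa, .ab, ...) sort lexicographically, which is why a plain sorted glob reassembles the pieces in the correct order.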
Solution 2:
===========
Use
exp files=(file1.dmp,file2.dmp,file3.dmp,....) filesize=N
and export will create many dump files, each at most N bytes in size.
Hopefully, this is not your primary Oracle database backup strategy; that should be RMAN.
Hope this helps!
regards
Yogeeraj
06-02-2003 11:48 AM
Re: How to back up the files larger than 2G.
Keep the scripts you are currently using, but buy Microlite's BackupEDGE. I think it is less than $500.00, and it will save you much grief in the long run. If you ever run into a failed checksum, it can read through it. This is a great product. From corrupted archives, I have recovered over 99% of a corrupted database and was able to rebuild the indexes to return the system to service.
Substitute edge for tar in the working scripts and you are in business.
Tim
06-02-2003 12:36 PM
Re: How to back up the files larger than 2G.
That being said, you should at least look into compiling GNU tar. You can't beat the price. It will handle files larger than 2GB and smaller than 16GB. However, if your DBAs are like my DBAs, they set their .dbf files to 33GB, and almost every utility is useless. We use NetBackup here, which splits the files into 2GB segments and then tars them.
Chris
06-02-2003 01:10 PM
Re: How to back up the files larger than 2G.
As mentioned, fbackup can handle any file size, has error recovery (for tape problems), and, with a script, can drive a tape changer. However, you should establish the value of the data you are saving. Is it worth a million dollars? Then a cheap backup program is like a cheap insurance policy: it may not work when it is really needed.
Another commercial-grade backup tool is HiComp's HiBack, with the added advantage that it handles other flavors of Unix, the HP 3000, and PC operating systems.
Bill Hassell, sysadmin
06-02-2003 05:55 PM
Re: How to back up the files larger than 2G.
http://www2.itrc.hp.com/service/cki/patchDocDisplay.do?patchId=PHCO_26423
Rgds...Geoff
06-03-2003 05:16 PM
Re: How to back up the files larger than 2G.
But if you pipe the stdout of the tar command to gzip, won't the resulting tar.gz file be less than 2GB?