Copy and compress on the fly
05-05-2005 07:33 AM
I have read several forum responses about copying and compressing files on the fly in an effort to find a solution to my problem, but I have yet to come across a clear answer. Here is my issue: I have Oracle dbf files that are 8GB in size. Currently I do a hot backup of my database with the command "host cp /prod/index/index*.dbf /prod/work/backup"
The key here is that I use wildcards to specify more than one file. I then compress the dbf files in the directory /prod/work/backup. Is there a way to do the copy and the compress on the fly, in one step, using wildcards in the filename?
Thanks
Solved!
05-05-2005 07:39 AM
Re: Copy and compress on the fly
for file in /prod/index/index*.dbf
do
cp "$file" /dest/.
done
05-05-2005 07:47 AM
Re: Copy and compress on the fly
05-05-2005 07:50 AM
Re: Copy and compress on the fly
Put the list of files (or wildcard patterns) in /etc/dbfback.dat, then:
while read -r filename
do
cp $filename /backups/dbf
done < /etc/dbfback.dat
This lets you maintain the list of files as you add new databases. It can include wildcards as well.
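A minimal sketch of the list-file idea with the compression folded in, so no separate compress pass is needed afterwards. The list file /etc/dbfback.dat and target /backups/dbf are from the post above; the function name dbfgz is just for illustration:

```shell
# dbfgz: read one wildcard pattern per line from a list file ($1),
# expand it, and gzip each matching file into the directory $2.
dbfgz() {
    while read -r pattern
    do
        for file in $pattern        # $pattern left unquoted so wildcards expand
        do
            gzip -c "$file" > "$2/$(basename "$file").gz"
        done
    done < "$1"
}

# e.g.  dbfgz /etc/dbfback.dat /backups/dbf
```

gzip -c writes to stdout and leaves the source file untouched, which is what you want during a hot backup.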
SEP
Owner of ISN Corporation
http://isnamerica.com
http://hpuxconsulting.com
Sponsor: http://hpux.ws
Twitter: http://twitter.com/hpuxlinux
Founder http://newdatacloud.com
05-05-2005 07:53 AM
05-05-2005 08:18 AM
Re: Copy and compress on the fly
cd /prod/index
for file in index*.dbf; do
dd if=./"$file" ibs=1024k | gzip > /prod/work/backup/"$file".gz
done
HTH.
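For completeness, getting a file back out of one of those .gz backups is the reverse pipe; gunzip -c (also spelled gzcat on HP-UX) decompresses to stdout and leaves the .gz in place. The helper name restore_dbf is just for illustration:

```shell
# restore_dbf: decompress a .gz backup ($1) to a plain file ($2),
# leaving the compressed copy untouched.
restore_dbf() {
    gunzip -c "$1" > "$2"
}

# e.g.  restore_dbf /prod/work/backup/index12345.dbf.gz /prod/index/index12345.dbf
```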
05-05-2005 08:23 AM
Re: Copy and compress on the fly
# for longfile in $(ls /prod/index/index*.dbf)
> do
> file=$(basename $longfile)
> cat $longfile | gzip - > /prod/work/backup/$file.gz
> done
I would not use "compress", as it is usually less efficient than gzip, and all those -c < switches get me confused...
But I think a slightly better and more elegant way would be:
# cd /prod/index
# tar cf - index*.dbf | gzip - > /prod/work/backup/backup.$(date +%Y-%m-%d).tar.gz
or, if you prefer fbackup (I do):
# cd /prod/index
# fbackup -f - -i index*.dbf | gzip - > /prod/work/backup/backup.$(date +%Y-%m-%d).fbk.gz
The beauty of the above is that to extract a single file you do not need to unzip, move, etc.; simply do the following:
# gzcat /prod/work/backup/backup.2005-04-01.fbk.gz | frecover -f - -i index12345.dbf
or to get the index of what is in the fbk file
# gzcat /prod/work/backup/backup.2005-04-01.fbk.gz | frecover -I /tmp/index -f -
You can do a similar thing with tar too.
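For example, the tar equivalent of the frecover extraction above pulls a single member out of the gzipped archive without unpacking the rest (the archive and file names match the earlier examples; the helper name untar_one is just for illustration, and gunzip -c can stand in for gzcat):

```shell
# untar_one: extract only the named member ($2) from a gzipped tar ($1)
untar_one() {
    gunzip -c "$1" | tar xf - "$2"
}

# e.g.  untar_one /prod/work/backup/backup.2005-04-01.tar.gz index12345.dbf
#
# and to get the index of what is in the archive:
#   gunzip -c /prod/work/backup/backup.2005-04-01.tar.gz | tar tf -
```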
Regards
Tim