02-22-2003 09:01 PM
Truncating a large file into smaller files
I have a file (not a filesystem) that is consuming 1 GB. Is there a command in UNIX which allows me to "break" this 1 GB file into smaller files?
Or, if a script is needed, could someone kindly show me how it's done?
Thanks
02-22-2003 09:10 PM
Re: Truncating a large file into smaller files
I guess split is the best tool here and should serve your purpose.
Rajeev
02-22-2003 09:40 PM
Re: Truncating a large file into smaller files
Manoj Srivastava
02-22-2003 10:02 PM
Re: Truncating a large file into smaller files
Could you provide some examples of how to use split?
And as for the head and tail commands, how do I use them to split a file into smaller files?
Thanks
02-22-2003 10:29 PM
Re: Truncating a large file into smaller files
split will break a large file into smaller pieces. The number of files can be very large (over 600 separate files with the defaults). First: how many lines are in the file? Use the wc command to report the line count, as in:
wc -l /some_bigfile
The default is 1,000 lines per file, so with the defaults split will turn a 650,000-line file into 650 separate files, each 1,000 lines long:
split /some_bigfile
If you need fewer files, specify a larger line count, as in:
split -l 10000 /some_bigfile
and you'll get about 65 larger files. Note that split has a maximum line length (see the man page).
tail and head will be very cumbersome, as tail has a 20 KB internal buffer that cannot be changed.
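For reference, a reassembly sketch (the rebuilt filename is just an example): split names its output pieces xaa, xab, xac, ... by default, and the shell expands x?? in that same sorted order, so the pieces can be glued back together with:
cat x?? > /some_bigfile.rebuilt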
Bill Hassell, sysadmin
02-23-2003 08:16 AM
Re: Truncating a large file into smaller files
NAME
csplit - context split
SYNOPSIS
csplit [-s] [-k] [-f prefix] [-n number] file arg1 [...argn]
DESCRIPTION
csplit reads file, separates it into n+1 sections as defined by the
arguments arg1 ... argn, and places the results in separate files.
The maximum number of arguments (arg1 through argn) allowed is 99
unless the -n number option is used to allow for more output file
names. If the -f prefix option is specified, the resulting filenames
are prefix00 through prefixNN where NN is the two-digit value of n
using a leading zero if n is less than 10. If the -f prefix option is
not specified, the default filenames xx00 through xxNN are used.
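As a usage sketch (the file name and pattern below are illustrative, not from the man page): to break bigfile apart at every line starting with "CHAPTER", repeating the match up to 99 times:
csplit -k -f part bigfile '/^CHAPTER/' '{99}'
This writes part00, part01, and so on; -k keeps the pieces already written even if fewer than 99 matches are found.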
Col.
02-23-2003 11:28 AM
Re: Truncating a large file into smaller files
If this file is a binary file, i.e. not an ASCII text file, then you will have "issues" using split. If you have an application that needs to read and write this file, then you probably can't break it up into smaller pieces at all. And if you are trying to break it up to manage where/how the disk space is used, it's a lot easier to just manage one filesystem.
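If you do have to split a binary file, splitting by byte count rather than by lines sidesteps the line-oriented problems, assuming your split supports the POSIX -b option (check the man page on your release). The names here are illustrative:
split -b 1024k bigfile part_
cat part_* > bigfile.rebuilt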
live free or die
harry
02-24-2003 01:41 AM
Re: Truncating a large file into smaller files
There is a lot of good info here, but I think you will have a space problem splitting it: until csplit (or whichever tool you prefer) finishes, you will need 2x the space, or even slightly more, to hold the old file, the new pieces, and any work files.
What kind of file is it? Check with:
file filename
Here is a simple script that splits a larger file into 60-line pieces and prints them:
file=$1
# total number of lines in the input file
let length=`wc -l < $file`
let start=1
let x=0
q=""
while [ "$start" -le "$length" ]
do
  let x=$x+1
  # extract the next 60-line chunk, starting at line $start
  tail -n +$start $file | head -n 60 > /tmp/printfax$x
  let start=$start+60
  q="$q /tmp/printfax$x"
done
# print all the pieces, then clean up
lp -dbelgo161 $q
/bin/rm /tmp/printfax*
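To run it, assuming the script is saved as splitprint.sh (the name is illustrative; note that the printer belgo161 is hardcoded on the lp line):
sh splitprint.sh /path/to/bigfile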
Steve Steel