Long text file truncate
12-13-2007 09:26 AM
Is there any way to easily copy only the first 100000 lines to another file?
(Editing the file with TPU and trying to delete lines from 100000 onwards fails with the message "Too many records".)
Thank you
12-13-2007 09:49 AM
Re: Long text file truncate
Open the input file and the output file, read a line and write it, N times over, then stop? I assume that this could be done in C, DCL, Fortran, or practically anything else you have.
12-13-2007 09:57 AM
Re: Long text file truncate
try this:
$ EDIT/EDT/NOCOMMAND big-file
*WRITE filename 1:100000
*QUIT
This should write the first 100000 lines of big-file to filename.
Volker.
12-13-2007 10:03 AM
Re: Long text file truncate
http://vms.process.com/scripts/fileserv/fileserv.com?EXTRACT
12-13-2007 10:39 AM
Re: Long text file truncate
$search my_big_file.txt "something"/match=eqv/limit=100000/output=my_small_file.txt
Bill
12-13-2007 11:00 AM
Re: Long text file truncate
$ perl -pe "last if $. > 100000" big.dat > small.dat
$ gawk /out=small.dat "(NR > 100000){exit} {print $0}" big.dat
Hein.
12-14-2007 06:43 AM
Re: Long text file truncate
set file/attr=EBK:15600 file
set file/truncate file
if the file has average lines of 80 characters
(100000 lines × 80 bytes / 512 bytes per block ≈ 15600 blocks).
Delay the /TRUNCATE until you have verified the new size is sufficient.
12-14-2007 09:05 AM
Re: Long text file truncate
Or even...
$ gawk /out=small.dat "(NR > 100000){exit} 1" big.dat
Joseph> If the purpose is to get rid of the space occupied by the 14 GB garbage, I would be brave and do:
>> set file/attr=EBK:15600 file
>> set file/truncate file
Right. But you _should_ try to also set the FFB, although it will work regardless.
>> if the file has average lines of 80 characters
>> (100000 × 80 / 512 ≈ 15600)
That would be a great, IO free, first cut and likely good enough.
If you want to do it exactly then you can use DUMP/RECORD=(COUNT=1,START=100001) and use the RFA for the exact byte offset.
$ pipe dum/re=(co=1,st=10) tmp.tmp | perl -ne "if (/^Re.*A\((\d+),(\d+),(\d+)/) {printf qq(set file/att=(ebk:%d,ffb:%d)\n),$1+0x1000*$2,$3}"
This will output something like:
set file/att=(ebk:1,ffb:18)
You can then apply that to the file (after first saving the original values of f$file("tmp.tmp","EOF") and f$file("tmp.tmp","FFB")).
fwiw,
Hein.
12-14-2007 12:00 PM
Re: Long text file truncate
On the other hand, if a runaway process is filling my disk with a 15 GB logfile, I probably won't care to cut a record somewhere after the 100000th :-)