Operating System - Tru64 Unix


 
SOLVED
nipun_2
Regular Advisor

a Pearl script that reads through all sub directories

Hi,
I have not scripted in Pearl before. I am aware that Pearl is a great tool for dealing with files.

I need to get the list of all the files (not directories) in the current directory as well as the sub directories.

Is there a pearl script/command readily available to do this sort of thing?

What I am looking for is:

execute

pearl_script <Directory Name>

The output will be a text file containing all the files in the 'Directory Name' as well as all its sub directories.

I have NEVER touched Pearl, so if you have a command that does this with the correct parameters, please provide some details so that I can use it.

Any information you can provide would be helpful.

3 REPLIES
Steven Schweda
Honored Contributor

Re: a Pearl script that reads through all sub directories

Perl, not pearl.

Why perl?

find -type f

or, perhaps:

( cd ; find . -type f)

depending on exactly what you'd like to see.

> output will be a text file

Redirect stdout to a file.

find -type f > out_file
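
If you later want the same listing from Perl itself, a roughly equivalent one-liner using the core File::Find module (shown here only as a sketch) could be:

perl -MFile::Find -le 'find(sub { print $File::Find::name if -f }, ".")' > out_file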
nipun_2
Regular Advisor

Re: a Pearl script that reads through all sub directories

Hi Steven,
Thanks for the response and the correction!

The reason for Perl is that I need to use it as a script for a Matlab program. This way I am OS independent, as Matlab can run Perl scripts internally on Linux, Unix, and Windows platforms.

Nipun
Hein van den Heuvel
Honored Contributor
Solution

Re: a Pearl script that reads through all sub directories

Hi there,

I hear you on the multiplatform thing.
I have a collection of perl scripts to help with SAP benchmarks and the exact same scripts all work on HPUX, Tru64, Linux, Windoze.

Now with perl there is always more than one way to do something.
For your question you want to read up on
- glob
- readdir
- module: File::Find;

------------------ partial find example ---
# Loop through the arguments, which are all tops of directory trees,
# looking for non-zero-length files called .err or .elg.

use warnings;
use strict;
use File::Find;

my ($arg, $i);
my (%found);
while ( $arg = shift @ARGV ) {
    print ++$i, " $arg\n";
    chomp $arg;
    foreach (glob ($arg)) {
        print ++$i, " - \'$_\'\n";
        find(\&wanted, $_) if -d;
    }
}

sub wanted {
    my ($type, $size, $file, $spec);
    if (/\.(err|elg)/i && -f $_ && ($size = -s)) {
        $type = $1;
        $spec = $File::Find::name;
        $file = $_;
        :


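To tie it back to the original question, a minimal self-contained File::Find sketch that writes every plain file under one directory into a text file might look like this (the script and output file names are just examples):

use warnings;
use strict;
use File::Find;

# Top directory to scan: first argument, or the current directory.
my $top = shift @ARGV || '.';

# Output file name is only an example; use whatever your Matlab code expects.
open(my $out, '>', 'file_list.txt') or die "cannot write file_list.txt: $!";

# Print every plain file (not directories) under $top, one path per line.
find(sub { print $out "$File::Find::name\n" if -f $_; }, $top);

close($out);

Saved as, say, list_files.pl, it would be run as: perl list_files.pl /some/directory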

-------------------- READDIR --------
:
opendir(DIR,$source) || &error(__LINE__,"opendir");
@entries = grep(!/^\.\.?$/,readdir(DIR));   # skip the . and .. entries
closedir(DIR) || &error(__LINE__,"closedir");
:

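readdir only lists a single directory, so to cover the sub directories as well you recurse by hand; a rough sketch of that approach (all names here are illustrative) could be:

use warnings;
use strict;

# Recursively print plain files under a directory using opendir/readdir.
sub list_files {
    my ($dir) = @_;
    opendir(my $dh, $dir) or die "opendir $dir: $!";
    my @entries = grep { !/^\.\.?$/ } readdir($dh);   # skip . and ..
    closedir($dh);
    foreach my $entry (@entries) {
        my $path = "$dir/$entry";
        if (-d $path) {
            list_files($path);       # descend into a sub directory
        } elsif (-f $path) {
            print "$path\n";         # plain file: report it
        }
    }
}

list_files(shift @ARGV || '.');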

------------------ simple glob --------

while ($files = shift @ARGV) {
    while ($file = glob ($files)) {
        :


---------------- simpler glob -----

perl -le 'print while (<*.csv>)'

Hein.