Community Home > Servers and Operating Systems > Operating Systems > Operating System - HP-UX > Re: mtime issue
08-03-2009 12:24 PM
I want to list the files with the extension .arc that are older than 1 day. I have used the command
find . -type f -name "*.arc" -mtime +0 -print
It shows the files, but also those in subfolders. I don't want to list .arc files from subfolders -- only the files in the current directory that are more than one day old.
Please guide me with the proper format.
regards
himacs
Solved! Go to Solution.
08-03-2009 12:52 PM
Re: mtime issue
08-03-2009 01:00 PM
Re: mtime issue
Since you don't want to descend into directories that are subordinate to the parent you request, I think I would use Perl:
# cat ./dayoldfiles
#!/usr/bin/perl
use warnings;
use strict;
my ( $dir, $dh, $name );
$dir = $ARGV[0] ? shift : '.';
chdir $dir or die "Can't 'cd' to '$dir': $!\n";
opendir( $dh, '.' ) or die "Can't open '$dir': $!\n";    # open the directory we just entered (safe even when $dir is relative)
while ( defined( $name = readdir($dh) ) ) {              # defined(): a file named "0" must not end the loop
next if $name =~ /^\.\.?$/;
next unless -f $name;
next unless ( -M $name > 1 );
printf "%s%s%s\n", $dir, $dir eq '/' ? '' : '/', $name;
}
1;
...run as:
# ./dayoldfiles /path
or for your current working directory, simply:
# ./dayoldfiles
Regards!
...JRF...
08-03-2009 01:07 PM
Re: mtime issue
Oops, I missed the fact (at least by your example) that you want to find only files with the '.arc' extension. Given that, we'll add this line:
next if $name =~ m{\.arc};
...or in total:
# cat ./dayoldfiles
#!/usr/bin/perl
use warnings;
use strict;
my ( $dir, $dh, $name );
$dir = $ARGV[0] ? shift : '.';
chdir $dir or die "Can't 'cd' to '$dir': $!\n";
opendir( $dh, '.' ) or die "Can't open '$dir': $!\n";    # open the directory we just entered (safe even when $dir is relative)
while ( defined( $name = readdir($dh) ) ) {              # defined(): a file named "0" must not end the loop
next if $name =~ m{^\.\.?$};
next unless -f $name;
next if $name =~ m{\.arc};
next unless ( -M $name > 1 );
printf "%s%s%s\n", $dir, $dir eq '/' ? '' : '/', $name;
}
1;
Regards!
...JRF...
08-03-2009 01:21 PM
Re: mtime issue
find . -type f -name "*.arc" -mtime +0 -print | sed -n '/^.\/.*\/.*/!p'
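The sed filter above drops any path containing a second slash, i.e. anything inside a subdirectory. If you'd rather stop find from descending in the first place, a portable alternative (no GNU extensions, so it should also work with stock HP-UX find) is to prune every subdirectory explicitly; a sketch:

```shell
# Prune every subdirectory of '.', so only files directly in the
# starting directory are tested against -name/-mtime.
find . \( -type d ! -name . -prune \) -o \
       \( -type f -name "*.arc" -mtime +0 -print \)
```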
08-03-2009 01:22 PM
Re: mtime issue
OK, I'm an idiot. You _want_ files with the '.arc' extension, so change line 11 in my last post:
From:
next if $name =~ m{\.arc};
To:
next unless $name =~ m{\.arc};
/* NO POINTS FOR THIS CORRECTION, PLEASE */
Regards!
...JRF...
08-03-2009 01:52 PM
Re: mtime issue
GNU "find" has a "-maxdepth levels" option which makes it easy to limit the directory depth of the search. There's also "-prune", but I usually find it confusing.
A Forum search for keywords like "find prune maxdepth" should turn up some examples.
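For reference, with GNU find the suggestion above applied to the original search would look like this:

```shell
# GNU find only: -maxdepth 1 restricts the search to the starting
# directory itself, so subfolders are never entered.
find . -maxdepth 1 -type f -name "*.arc" -mtime +0 -print
```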
08-03-2009 02:25 PM
Re: mtime issue
Thanks for your time.
I will update the status soon.
regards
himacs
08-03-2009 02:28 PM
Solution
What's wrong with the solutions offered? You apparently don't want a Perl script and you apparently don't want to investigate GNU find [with its enhanced options].
Do you want a solution to your problem or do you want only a _specific_ tool to solve it? Does a screwdriver suffice to drive nails?
...JRF...
08-03-2009 02:48 PM
Re: mtime issue
It's nothing like that; it's not that I don't want to investigate. In fact, my problem was solved by using the one-line command
find /data04/pcard_arch/*.arc -type f -name "*.arc" -mtime +0 -print | sed -n '/^.\/.*\/.*/!p'
As for the Perl script, I'm not familiar with scripting, so I'm checking your script before executing it, since this is a live server.
Thanks for your wonderful and timely support. I expect the same in the future too. Expect more posts from me.
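Worth noting: in that one-liner the shell glob /data04/pcard_arch/*.arc already expands only to files directly in that directory, so find never recurses and the sed filter has nothing left to remove. A simpler equivalent sketch (the poster's path; note the glob is passed literally, and find complains, if no .arc files exist):

```shell
# The glob hands find the matching files directly, so -name and the
# sed filter are redundant; -mtime +0 keeps files more than 24h old.
find /data04/pcard_arch/*.arc -type f -mtime +0 -print
```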