SOLVED
brian_31
Super Advisor

Unix Question

Team:

I have some files containing "null characters" (an octal display shows them as \0). The questions
that I have are:

1. What causes them?
2. How can I delete them?

Thanks

Brian.

6 REPLIES
A. Clay Stephenson
Acclaimed Contributor

Re: Unix Question

The ASCII NUL is most commonly used as a flag to indicate the end of a string. For example, a string variable might hold up to 80 characters, but the presence of a NUL before that indicates the actual end of the string. Any characters (up to the maximum length, 80 in this case) past the NUL are ignored. NUL could also be part (or all) of a zero value; for example, 0 0 0 0 would be a 32-bit integer zero value, and 8 NUL's could be a double-precision floating point zero value. Unless you know something about the nature of these files, don't remove/replace the NUL's.

Here is a technique to remove the NUL's:

tr -d "\000" < infile > outfile

As an example, it would be state of the art stupid to apply the above command to /etc/lvmtab which typically contains a number of NUL's (all normal).
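A quick way to see what the command above actually does is to fabricate a small file with embedded NULs and inspect it with od before and after; the filenames here are illustrative:

```shell
# Fabricate a file containing NULs (infile/outfile are illustrative names)
printf 'abc\0def\0\n' > infile

# Octal display shows the \0 bytes, just as in the original question
od -c infile

# Delete every NUL; infile itself is left untouched
tr -d '\000' < infile > outfile

# The \0 bytes are gone from the output
od -c outfile
```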
If it ain't broke, I can fix that.
Rodney Hills
Honored Contributor

Re: Unix Question

Are these files you created using an editor, or an application you ran, or have they been there for a long time?

What is the full directory path?
Are these text files or binary files?

Why are you concerned about the null characters?

A little more detail is required...

-- Rod Hills
There be dragons...
brian_31
Super Advisor

Re: Unix Question

Hi Clay:

Could you explain what the command does, please?

Thanks

Brian
Hein van den Heuvel
Honored Contributor

Re: Unix Question

>> Hi Clay:
>> Could you explain what the command does, please?

He could, but I'm sure he'd prefer you would just hit the man page for 'tr': man tr


[tr 'translates' one character to another; the -d option changes that to a delete instead of a translate.]
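The difference between translating and deleting is easy to see with a one-liner each (the sample strings are illustrative):

```shell
# Translate: map each lowercase letter to its uppercase counterpart
printf 'hello\n' | tr 'a-z' 'A-Z'   # HELLO

# Delete: with -d, the listed characters are removed instead
printf 'hello\n' | tr -d 'l'        # heo
```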

fwiw,
Hein.


A. Clay Stephenson
Acclaimed Contributor
Solution

Re: Unix Question

Play with it and see after doing a man tr. It's a safe command for removing all the NUL's from a file because it does not alter the input file (stdin). tr acts as a filter, so run the command and look at the resulting stdout file; the NUL's should be gone --- whether they should be or not. Only when you copy the output file over the original have you done (possible) damage.
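That filter workflow can be sketched end to end; the input file is fabricated here for illustration, and cmp is just one way to confirm the output differs before deciding whether to copy it back:

```shell
# Illustrative input file containing NULs
printf 'data\0more\0\n' > infile

# Filter: reads stdin, writes stdout; infile is never modified
tr -d '\000' < infile > outfile

# Confirm the two files differ, then inspect the result
cmp -s infile outfile || echo "NULs were removed"
od -c outfile

# Only this last step can do damage; skip it unless you are sure
# cp outfile infile
```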

If it ain't broke, I can fix that.
John Poff
Honored Contributor

Re: Unix Question

Hi Brian,

It depends on what application is writing the files, but it sounds like it might be a bug. What type of files are they, and what are some of the filenames?

One way that you can delete them is to use the find command. Something like this might work:

find . -name "*somestring*" -exec rm {} \;

Where "*somestring*" is some string pattern of non-null characters that matches the filename.

I would try the 'find' without the rm first, just to be safe and to make certain that it just finds the files you are interested in:

find . -name "*somestring*"


If that doesn't work, you could try doing it by inode number. Do an 'ls -li' to see the inode number of all the files. Look for the files with the bad filenames, and then pass the inode number to the find command:

find . -inum 1234
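Once the inode number is confirmed, the same find can carry out the removal with -exec, which is handy when the filename itself is unprintable; the scratch directory and filename below are illustrative:

```shell
# Scratch directory and file, standing in for a file with a bad name
mkdir -p scratch && cd scratch
touch victim

# ls -i prints the inode number in the first column
inum=$(ls -i victim | awk '{print $1}')

# Remove the file by inode rather than by name
find . -inum "$inum" -exec rm {} \;
```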

JP