We backed up one of our directories last night and got a total of 40GB, even though it should be around 80GB. A du -sk on that directory now returns varying results, anywhere from 20GB up to 50GB; it is basically changing continuously. What could cause that?
The most likely answer is that some process is continually creating and removing scratch files. I would run lsof or fuser on that filesystem and see which processes are doing what.
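A minimal sketch of that check, assuming the directory lives on a filesystem mounted at /data (a hypothetical path; substitute your own mount point):

```shell
#!/bin/sh
# lsof with +L1 lists open files whose link count is zero, i.e. files
# that have been deleted but are still held open by some process:
lsof +L1 /data

# fuser -c reports the PIDs of processes using any file on that
# filesystem; -u adds the owning user name to each PID:
fuser -cu /data
```

Anything lsof reports with link count 0 is space that du will never see but df (and possibly your backup) still accounts for.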
Temporary files are a likely cause of the fluctuating size. Is/was the filesystem very active? The characteristics of the data in the files (e.g. sparse files), the backup method used (e.g. 'fbackup') and tape compression will all influence the amount of backup media (tape) used, too. You need to provide more specific information to get a more definitive answer.
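To illustrate the sparse-file point: ls reports the apparent size of a file, while du counts only the blocks actually allocated, and a backup tool may record either. A small sketch (assuming dd and du are available, and a filesystem that supports sparse files):

```shell
#!/bin/sh
# Create a file with a 10 MB apparent size but almost no allocated
# blocks, by seeking past the end without writing any data:
dd if=/dev/zero of=sparse.img bs=1024 count=0 seek=10240 2>/dev/null

ls -l sparse.img   # apparent size: ~10 MB
du -k sparse.img   # allocated size: close to 0 KB

rm sparse.img
```

A directory full of such files can legitimately back up to far less than its ls-reported size.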
You haven't supplied enough data yet. Yes, it could be a rogue daemon, but it could also have been a careless human mistake, or simply a smart human move: removing a file or directory that was no longer needed.
Some applications create a temporary file and immediately unlink() it. This leaves the file available to the owning process, but to no other. *You* don't see a file in the directory, but its space stays consumed until the owning process closes the file.
This is very likely the cause of your disk utilization dynamics.
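The open-then-unlink pattern described above can be reproduced from the shell; a minimal sketch (the scratch file name is arbitrary):

```shell
#!/bin/sh
scratch=$(mktemp ./scratch.XXXXXX)   # create a file in the current directory
exec 3>"$scratch"                    # hold it open on file descriptor 3
rm "$scratch"                        # remove the directory entry

ls -l "$scratch" 2>/dev/null || echo "no directory entry"
# prints "no directory entry": du on the directory no longer counts the
# file, yet df still shows its blocks as used until the fd is closed.
echo "still writable" >&3            # the process can still use the file
exec 3>&-                            # close fd 3; the space is released
```

While fd 3 is open, du and df disagree about the filesystem, which is exactly the kind of mismatch you are seeing.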
Nothing should have changed: we didn't delete anything, and activity is the same. We didn't change our backup method either, yet the backup shows a difference of about 40GB and 25,000 files compared to Wednesday night's run.