Check for large directories in linux (and check Apache logging!)

We had a problem where a server wasn’t allowing us to upload any more files through our web application’s interface. This turned out to be caused by an enormous “error.log.1.txt” in “/var/log/apache2/”, the result of setting our log level to warnings rather than errors. Thanks to a tip from Josh, I could run a command and quickly find directories over 1GB in size:

du -h / | grep '^[0-9.]*G'

This quickly turned up our 12GB log file.
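A couple of refinements make the output easier to scan on a busy server. This is a sketch assuming GNU du and sort (the `--max-depth` and `sort -h` options are GNU extensions):

```shell
# Same idea with a few tweaks (assumes GNU coreutils):
# - limit recursion depth so the listing stays readable
# - send permission errors from unreadable directories to /dev/null
# - sort the human-readable sizes so the biggest directories come last
du -h --max-depth=2 / 2>/dev/null | grep '^[0-9.]*G' | sort -h
```

The single quotes around the grep pattern stop the shell from treating `[0-9.]*G` as a filename glob before grep ever sees it.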

Allow file downloads larger than 50MB from SharePoint (fix error 0x800700DF)

I ran into a 0x800700DF error when trying to download a 100MB file from SharePoint using Windows Explorer. I tend to do all my file operations in SharePoint through the WebDAV interface, pointing Windows Explorer at \\portal.server.name\portal and logging in with an account that has permission to access the portal. The 0x800700DF error when attempting to copy large files is actually described as:

Error 0x800700DF: The file size exceeds the limit allowed and cannot be saved.

The fix involves changing a registry setting on your client machine to allow 4GB downloads (the maximum possible). From Microsoft Answers:

FileSizeLimitInBytes is set to 50000000, which limits your download, so just set it to the maximum! (This is client side, by the way, on Windows 7.)


  • Open regedit and navigate to HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Services\WebClient\Parameters
  • Right click on FileSizeLimitInBytes and click Modify
  • Click on Decimal
  • In the Value data box, type 4294967295, and then click OK. Note this sets the maximum you can download from WebDAV to 4GB at one time; I haven’t figured out how to make it unlimited, so if you want to download more you need to split it up.
  • Restart the WebClient service (or reboot) so the new limit takes effect.
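The same change can be scripted from an elevated Command Prompt instead of clicking through regedit. This is a sketch assuming the standard WebClient parameters key; double-check the path on your own machine before running it:

```shell
:: Raise the WebDAV download limit to its 4GB maximum (run as Administrator).
reg add "HKLM\SYSTEM\CurrentControlSet\Services\WebClient\Parameters" ^
    /v FileSizeLimitInBytes /t REG_DWORD /d 4294967295 /f

:: Restart the WebClient service so the new limit takes effect.
net stop WebClient && net start WebClient
```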