Installing a Network Shared HP LaserJet 4050 in Windows 7 x64

This is fairly specific but was difficult to find a fix for. The HP universal print driver cannot be used when installing a LaserJet 4050 that is not attached directly to your PC. The problem is down to the plug-and-play ID for the printer not being found, so the driver does not install correctly. Thanks to the guys on the HP Support Forums I found a solution.

Initially I tried just installing the universal print driver and adding a network printer (with the correct machine and printer name) but it couldn’t find the driver. I even manually selected the download folder for the driver “C:\HP Universal Print Driver\pcl5-x64-” but it wouldn’t have it.

Next I tried adding a local printer on LPT1 (with no physical printer present) and using the same driver. This didn’t work.

The solution is to add a local printer using LPT1, click the “Windows Update” button, wait a few minutes, then select Manufacturer “HP” and Printers “HP LaserJet 4050 Series PCL 5”. This installs a working driver, which is then used automatically when you add the network printer.

Quickly and Safely Move a Microsoft SQL Server Database (MDF and LDF Files) to a New Physical Location (Including Setting Read/Write Access)

To move a Microsoft SQL Server database to a new physical location you need to detach the database, copy the MDF (data) and LDF (log) files, reattach it and set permissions. In my case the drives holding each type of file had to be backed up separately, as per the site backup regulations. I was using SQL Server 2008 R2.

Microsoft’s recommended way of doing this is to use SQL Management Studio. Create a query and detach your database “mydb” by entering and running the following:

use master
sp_detach_db 'mydb'

Now copy the MDF and LDF files to their new location and reattach (locations and file names are for this example only):

use master
sp_attach_db 'mydb','E:\DATA\mydb.mdf','F:\DATA\mydb_log.ldf'

You can check the basic properties of the database (and that the file locations have been correctly set) by switching to it and listing its files:

use mydb
exec sp_helpfile

The final thing I needed to do was to set the file/folder permissions so that the database could go from read-only to read/write. I first set the folder permissions for my two new folders (“E:\DATA\” and “F:\DATA\” as above). To do this I needed to add the following user to the security settings with full control:


Setting the folder permissions didn’t actually propagate to the files (a server permissions error), so I set the permissions for the “E:\DATA\mydb.mdf” and “F:\DATA\mydb_log.ldf” files individually in the same way.

Now open up another query window and type the following to enable read/write access to the database:

use master
alter database mydb set read_write with no_wait

Now your database is set up exactly as it was previously, only the associated files have moved physical location.
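If you have to repeat this for several databases, the T-SQL above is easy to generate programmatically. A minimal Python sketch (it only builds the statements shown above; actually running them, e.g. via pyodbc, and copying the files in between is left to you):

```python
def build_move_statements(db, mdf_path, ldf_path):
    # Generate the detach / attach / read_write T-SQL sequence shown above.
    # The file copy has to happen between the detach and the attach.
    return [
        "use master",
        "sp_detach_db '%s'" % db,
        "sp_attach_db '%s','%s','%s'" % (db, mdf_path, ldf_path),
        "alter database %s set read_write with no_wait" % db,
    ]
```

For example, build_move_statements('mydb', r'E:\DATA\mydb.mdf', r'F:\DATA\mydb_log.ldf') reproduces the exact statements used above.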

“Unblock” an entire directory of files (rather than individually) when copying files between NTFS locations in Windows Server

We copied a few hundred files from a Windows Server 2003 machine to a Windows Server 2008 machine in order to migrate an ASP.NET web application. There were a few hoops to jump through: once the application was set up, I instantly hit:

Request for the permission of type ‘System.Web.AspNetHostingPermission, System, Version=, Culture=neutral, PublicKeyToken=b27b5d561934e089’ failed. (C:\inetpub\wwwroot\WebApp\web.config line 90)

This was related to a “<httpModules>” element in the “Web.config” file.

On searching for similar problems I found a blog post on MSDN which stated that this could be due to a DLL file that needed to be unblocked (right click, “Unblock” button) after copying from another location. Unfortunately, we couldn’t go through and unblock each individual file to try and get round this as there were so many.

The solution (thanks to superuser/StackExchange) is to zip up all the files into one compressed archive before transferring them. Then the NTFS flagging of “unsafe” files that need to be unblocked only includes the one file, simple!

This solved our System.Web.AspNetHostingPermission error straight away, implying that the blocking of files by NTFS for security reasons can affect ASP.NET migration.
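If zipping isn’t an option, the block itself is just an NTFS alternate data stream called Zone.Identifier, so it can be removed in bulk; on PowerShell 3.0 or later Get-ChildItem -Recurse | Unblock-File does this. A rough Python sketch of the same idea (only meaningful on Windows/NTFS; unblock_tree is my name, not a library function):

```python
import os

def zone_identifier_stream(path):
    # NTFS stores the "downloaded from the internet" flag in an
    # alternate data stream called Zone.Identifier attached to the file
    return path + ":Zone.Identifier"

def unblock_tree(root):
    """Delete the Zone.Identifier stream from every file under root
    (on other filesystems the stream never exists, so nothing happens)."""
    unblocked = []
    for dirpath, _dirs, files in os.walk(root):
        for name in files:
            stream = zone_identifier_stream(os.path.join(dirpath, name))
            try:
                os.remove(stream)  # removing the stream "unblocks" the file
                unblocked.append(stream)
            except OSError:
                pass  # no stream present, or the filesystem is not NTFS
    return unblocked
```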

Run .bat batch and .cmd files as scheduled tasks in Windows with a local user (avoid the “Could not start” error)

Running scheduled tasks as a local user means you can lock down user permissions and avoid giving broad admin rights to your local users. I had a .cmd batch file (.bat works as well) that needed to be run by a local user every day.

I created the local user with a password and added the scheduled task to run my .cmd file every day at 4am. When adding the scheduled task I put in the correct user details and password and then tried to run it, which failed with a “could not start” error.

The reason for this is that by default, new local users do not have read and execute permissions on “cmd.exe”, which is used by the Windows task scheduler to start .cmd and .bat files in scheduled tasks. The fix is to navigate to your “system32” directory (probably “c:\windows\system32”), right click on the “cmd.exe” application, go to the security tab and add your new local user with “Read & Execute” permissions.

Once the security settings for “cmd.exe” are set to allow your local user to run it, the task scheduler will now allow your .cmd/.bat scheduled task to run with that local user and everything will work fine.
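The security-tab steps can also be done from an elevated prompt with icacls. A small Python helper that just builds the equivalent command string (the user name here is a placeholder, substitute your own local user):

```python
def icacls_grant_cmd(path, user, perms="RX"):
    # Build the icacls command line equivalent to the security-tab
    # steps above: grant the user Read & Execute on the file.
    return 'icacls "%s" /grant %s:(%s)' % (path, user, perms)
```

For example, icacls_grant_cmd(r'c:\windows\system32\cmd.exe', 'taskuser') gives the command to grant a hypothetical "taskuser" account Read & Execute on cmd.exe.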

Quickly add large numbers of users to a computer or server using addusers.exe

addusers.exe is a great tool that is included in the Windows 2000 Resource Kit and is described by Microsoft as a “32-bit administrative utility that uses a comma-delimited text file to create, modify, and delete user accounts”. The main benefit is that you can use something like Excel or Notepad to create a list of users, passwords and other details and then instantly generate their user accounts. We had to do this recently for almost 100 user accounts and it took seconds once we had written the text file. Although this isn’t officially supported on Windows 2003 or later, it still worked for us.

You can download addusers.exe direct from Microsoft’s FTP server.

The command we used to import a list of users contained in “listofusers.txt” to our server “servername” using addusers.exe was:

addusers /C C:\listofusers.txt \\servername /P:LE

Where the text file listofusers.txt looked something like:

username1,User Fullname,p4ssword,
username2,User2 Fullname,p4ssword2,

You can do a lot more with addusers.exe but this is a very handy and quick way of managing the creation of large numbers of users.
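If your user details start life in a spreadsheet or database, the text file itself is easy to generate. A minimal Python sketch that writes rows in the comma-delimited format shown above (the trailing comma leaves the remaining optional fields empty):

```python
def addusers_line(username, fullname, password):
    # One row in the format of the listofusers.txt example above
    return "%s,%s,%s," % (username, fullname, password)

def write_listofusers(path, users):
    # users is a list of (username, fullname, password) tuples
    with open(path, "w") as f:
        for user in users:
            f.write(addusers_line(*user) + "\n")
```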

Easily migrate content between SharePoint servers using stsadm.exe and fix the HTTP 401.1 site collection administrator local login problem

I needed to migrate content from one install of Windows SharePoint Services 3.0 on Windows 2003 to another physical server and ran into an issue with checking my site locally before changing all the DNS records over. The actual export and import process is quite easy but can take a while if there are a lot of subsites or files within your SharePoint portal.

Migrating between SharePoint installs using stsadm.exe

The easiest way to export an entire SharePoint site is to use stsadm.exe, which is typically located in “C:\Program Files\Common Files\Microsoft Shared\web server extensions\12\BIN” and is available once you have installed SharePoint. I added this to the PATH variable so I could use it everywhere, to make things easier. Thankfully the documentation is good (thanks Microsoft) and you can find more explicit details on exporting and importing from TechNet. Note that this is similar for other versions of SharePoint, and more information on various methods of migration can be found on Microsoft Support, including moving databases directly. Chris O’Brien also has a good post about the various approaches.

The simple stsadm command I used to export my portal (including all files and subsites etc) was:

stsadm -o export -url -filename sitename.bak

This produced a lot of ~30MB files and a good log of everything that took place. All the Active Directory user permissions were also included in the export, which was one of my big worries in moving to a new server. Local users are more of a problem as they don’t exist on the new server and need to be recreated.

To import your site back into another SharePoint install, you have to first make sure there is an existing web application and associated site collection (on the root “/”) before copying over all your exported files to the new server. The first time I tried I assumed it would regenerate the site collection based on my export from scratch, but apparently not. The command I used to import was:

stsadm -o import -url -filename sitename.bak

Now even though you will probably have a lot of files from the export process, you do not need to specify them all, just the main one, in this case “sitename.bak”. After a while your new site will be populated with all the content from your export and is ready for testing before you go live and change your DNS records to point to the new server.

Testing your newly migrated site locally, avoiding HTTP 401.1

As I was using remote desktop to access my new server to run the stsadm.exe import command, I wanted to test the site locally by logging in with my site collection administrator details before changing the DNS over. To do this I set up a reference in the hosts file “C:\Windows\system32\drivers\etc\hosts” on the new server to point the site name to localhost, then tried to visit the site within my remote desktop session. This is where I hit an HTTP 401.1 login error due to a security setting built into Windows 2003, even though I tried logging in with the correct site collection administrator details.

This is apparently a security fix in Windows Server 2003 SP1 that stops reflection attacks; according to Microsoft, “authentication fails if the FQDN or the custom host header that you use does not match the local computer name”. The details on how to fix this are located at Microsoft Support, and I’ve noted the easiest way to fix it, by removing the loopback check entirely.

First you need to disable strict name checking by editing the registry on your server. Open the registry editor (run regedit.exe) and go to “HKEY_LOCAL_MACHINE\System\CurrentControlSet\Services\LanmanServer\Parameters”. Now click “Edit -> New -> DWORD Value”, name it “DisableStrictNameChecking” and then right click and set it to decimal “1”.

Now go to “HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\Lsa” and click “Edit -> New -> DWORD Value”, name it “DisableLoopbackCheck” and then right click and set it to decimal “1” as well.
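The same two values can be applied in one go by importing a .reg file (a sketch of exactly the settings described above; save as a .reg file and double-click, or import with regedit):

```reg
Windows Registry Editor Version 5.00

[HKEY_LOCAL_MACHINE\System\CurrentControlSet\Services\LanmanServer\Parameters]
"DisableStrictNameChecking"=dword:00000001

[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\Lsa]
"DisableLoopbackCheck"=dword:00000001
```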

You need to restart your server for the changes to take effect. Once you have, you should be able to log in to your local site with your site collection administrator details without hitting an HTTP 401.1 error. Now you can test out the site before changing the DNS records to point to the new server and removing your hosts file record.

Easily connect and use PHP with SharePoint lists using cURL, RSS and NTLM authentication

Connecting to SharePoint from PHP is actually not that difficult if you have the cURL extension installed on your web server. In the case of my XAMPP Windows development server I just made sure the following line in php.ini (c:\xampp\php\php.ini in my case) was uncommented before restarting Apache:

extension=php_curl.dll
In Ubuntu/Linux you can usually just install the packages for cURL and after restarting Apache it will become available. Just type the following on the command line:

sudo apt-get install curl libcurl3 libcurl3-dev php5-curl

Then restart Apache:

sudo /etc/init.d/apache2 restart

Now the following code comes from both Norbert Krupa’s comment on David’s IT Blog and a question about parsing HTML on StackOverflow. The important thing to note is that I needed to use cURL to authenticate my domain user when connecting to my secure Windows SharePoint Services 3.0 test site. Apparently you can get away without using cURL on sites that don’t need authentication, but the same cURL code listed below can be used with a blank username and password for the same effect.

The goal of this listing is to connect to SharePoint using a domain user (can also be a local user if SharePoint is set up that way) and retrieve the contents of a SharePoint list. The trick is to supply the RSS feed url, which allows PHP to parse the RSS feed and neatly list the contents of a SharePoint list. An advantage of using RSS feeds of SharePoint lists is that they are secured using the same method as the list itself and require no extra configuration on the SharePoint side of things. You can also set the RSS feed to only show a set number of items or days, which is useful for regularly updated lists.

// generic function to get the contents of an HTML node
function get_inner_html( $node ) {
    $innerHTML = '';
    $children = $node->childNodes;
    foreach ($children as $child) {
        $innerHTML .= $child->ownerDocument->saveXML( $child );
    }
    return $innerHTML;
}

// username and password to use
$usr = 'USERNAME';
$pwd = 'PASSWORD';
// URL to fetch, this is the address of the RSS feed (go into a list and click "Actions" -> "View RSS Feed" to get the url)
$url = "";
//Initialize a cURL session
$curl = curl_init();
//Return the transfer as a string of the return value of curl_exec() instead of outputting it out directly
curl_setopt($curl, CURLOPT_RETURNTRANSFER, true);
//Set URL to fetch
curl_setopt($curl, CURLOPT_URL, $url);
//Force HTTP version 1.1
curl_setopt($curl, CURLOPT_HTTP_VERSION, CURL_HTTP_VERSION_1_1);
//Use NTLM for HTTP authentication
curl_setopt($curl, CURLOPT_HTTPAUTH, CURLAUTH_NTLM);
//Username:password to use for the connection
curl_setopt($curl, CURLOPT_USERPWD, $usr . ':' . $pwd);
//Stop cURL from verifying the peer's certificate
curl_setopt($curl, CURLOPT_SSL_VERIFYPEER, false);
//Execute cURL session
$result = curl_exec($curl);
//Close cURL session
curl_close($curl);

$xml = simplexml_load_string($result);

// display results on screen
foreach($xml->channel->item as $Item){
    echo "<br/>($Item->title)";
    // the rest of the list columns are HTML inside the description
    $doc = new DOMDocument();
    $doc->loadHTML($Item->description);
    $ellies = $doc->getElementsByTagName('div');
    foreach ($ellies as $one_el) {
        if ($ih = get_inner_html($one_el))
            echo ", $ih";
    }
}
The SharePoint RSS feed is a little interesting: the “$Item->title” object is the main column in the list, but the rest of the list is encapsulated in <div> elements within “$Item->description”, hence the requirement to parse the HTML.

For a SharePoint list with 3 columns the output will look something like:

(Item 1 Title) , Other Column A: xxxx, Other Column B: yyyy
(Item 2 Title) , Other Column A: zzzz, Other Column B: kkkk

Now the potential for this is great as it allows us to securely synchronise SharePoint lists with external databases, use SharePoint for authenticating PHP applications, etc. We are going to use this to automatically pull users from a SharePoint list to populate a separate PHP application, whilst keeping user-identifiable data locked away on SharePoint.
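The same feed can be consumed from other languages too. A minimal Python sketch of the parsing half (the fetch itself would need an NTLM-capable HTTP client, e.g. the third-party requests-ntlm package; the sample feed below is made up to mirror the output format described above):

```python
import xml.etree.ElementTree as ET

# A made-up feed mirroring the structure of a SharePoint list RSS feed
SAMPLE_FEED = """<rss version="2.0"><channel>
<item>
  <title>Item 1 Title</title>
  <description>&lt;div&gt;Other Column A: xxxx&lt;/div&gt;&lt;div&gt;Other Column B: yyyy&lt;/div&gt;</description>
</item>
</channel></rss>"""

def parse_list_feed(rss_text):
    # Each item is (title, description-HTML); the description holds the
    # remaining list columns as <div> blocks, as noted above.
    root = ET.fromstring(rss_text)
    return [(item.findtext("title"), item.findtext("description"))
            for item in root.find("channel").findall("item")]
```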

Reduce the size of the winsxs directory in Windows 7 using DISM

Looking for ways to shrink the massive Windows 7 install? The winsxs folder was 6.7GB on my tiny 60GB SSD, way too big for something that seems to just hold service pack backups. Microsoft are AWFUL at explaining how to deal with this, numerous searches always come up with “Don’t touch it, it’ll break Windows!”. Thanks to some helpful people on Technet the solution is to use the Deployment Image Servicing and Management Tool (DISM). Just type the following in a command prompt:

dism /online /cleanup-image /spsuperseded

This instantly reduced my winsxs folder from 6.7GB to 4.5GB, which is a fair chunk of space on an SSD.

Send email alerts for low disk space in Windows using perfmon.exe and bmail.exe

I needed to set up email alerts for one of the servers so that when disk space dropped below a threshold it would email the team to deal with the issue. I know you can use full monitoring packages for this kind of thing (like Nagios etc) but in this case that kind of monitoring is overkill. I used the bmail.exe command line SMTP mailer, but you can use any command line mailer you want really.

First I created a simple .cmd script to send the warning email, making sure to include %1 so as to include any command line arguments produced by perfmon.exe. Apparently if you don’t have any command line arguments when you set up alerts, the script will not be called:

c:\emailfromcmd\bmail.exe -s mail.server.ip -t -f email@addresstosendfrom -h -a "Disk Space Warning on" -b "Less than 100GB free on Please free space to ensure weekly backups can continue. Free: %1 MB"

The direct path to the executable must be explicitly stated, otherwise it will not be called and no error/warning will be shown.

Now open perfmon.exe, expand “Performance Logs and Alerts” – “Alerts”. Right click and select “New Alert Settings…” and choose a name. The comment can be anything.

Now add a counter. I used “LogicalDisk” – “Free Megabytes” – “C:” to select free space on the C drive. Set “Alert when the value is:” to “Under” and the “Limit:” to whatever free space warning you need. I set the sample interval to 3 hours. Now select run as to be an administrator on the machine and set the password.

Click the action tab and select “Run this program:” and put in the path to your .cmd file, in my case “C:\emailfromcmd\emailteam.cmd”. Select “Measured value” in “Command Line Arguments…”. Now click OK; you may need to enter the password for the user with permission to run your .cmd.

Done, now your server will email you when drive space gets low.
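For comparison, the same check-and-email logic as a standalone Python sketch (the SMTP host and addresses are placeholders to substitute; bmail and perfmon remain the lighter-weight option on an older server):

```python
import shutil
import smtplib
from email.message import EmailMessage

THRESHOLD_MB = 100 * 1024  # alert below 100GB, as in the example above

def free_megabytes(path):
    # Same number that perfmon's "LogicalDisk / Free Megabytes" counter reports
    return shutil.disk_usage(path).free // (1024 * 1024)

def needs_alert(free_mb, threshold_mb=THRESHOLD_MB):
    return free_mb < threshold_mb

def send_alert(free_mb, smtp_host, sender, recipient):
    # smtp_host, sender and recipient are placeholders; substitute your own
    msg = EmailMessage()
    msg["Subject"] = "Disk Space Warning"
    msg["From"] = sender
    msg["To"] = recipient
    msg.set_content("Low disk space. Free: %d MB" % free_mb)
    with smtplib.SMTP(smtp_host) as server:
        server.send_message(msg)
```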

Activating Windows 7 Enterprise (MSDN, Bizspark etc)

Thanks to Paul Kiddie. Windows 7 expects a license server on the local network, which isn’t present when you install Windows 7 Enterprise on a home machine. As a result you can’t activate Windows (it fails with error 0x8007232B). You need to install the key using:

slmgr -ipk xxxxx-xxxxx-xxxxx-xxxxx-xxxxx (put your key here)

After this is successful you can activate Windows as normal.