Easily set up and automatically start Apache Tomcat 7 Java web server in Ubuntu Linux

Apache Tomcat is actually easier to set up than the standard Apache webserver, which is great news if you are working with Java-based web applications. All you need to do is download it and make sure it starts with whichever Linux distribution you are using. Deploying applications in the standard WAR format is really easy as well thanks to the simple web-based management interface.

In my case I wanted Tomcat to start with Ubuntu and sit on the default port 8080 so I could have it running alongside my standard Apache webserver for PHP. We were developing a Spring application and used Maven to build and compile to a single deployable WAR file. You must have Java installed and set up for this to work. To check you have Java set up type:

java -version

This should tell you what version of Java you have installed (hopefully Java 1.7). You also need to check that the "JAVA_HOME" variable is set by typing:

echo $JAVA_HOME

If you don't get something like "/usr/lib/jvm/jdk1.7.0_09", please install Java following my installation instructions in a previous post.

To install Apache Tomcat first of all I downloaded the latest copy of Tomcat 7 from mirrorservice.org using wget run from my home directory:

wget http://www.mirrorservice.org/sites/ftp.apache.org/tomcat/tomcat-7/v7.0.32/bin/apache-tomcat-7.0.32.tar.gz

Please note that the version I downloaded may not be available or there may be a newer version so check http://www.mirrorservice.org/sites/ftp.apache.org/tomcat/tomcat-7/ first before running the wget.

“wget” will download the file, which then needs to be extracted:

tar xvzf apache-tomcat-7.0.32.tar.gz

Now you will have a folder "apache-tomcat-7.0.32" in your home directory. This needs placing somewhere sensible, so move it to "/usr/share/tomcat7" using:

sudo mv apache-tomcat-7.0.32/ /usr/share/tomcat7

Now you can test your Tomcat install works with its default settings by starting it up. Note: before you do this you need to set the “JAVA_HOME” variable otherwise you will get errors (see my previous post).
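
If "JAVA_HOME" is not set for your current shell session you can set it temporarily before starting Tomcat (using the JDK path from my previous post):

export JAVA_HOME=/usr/lib/jvm/jdk1.7.0_09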

To start up Tomcat navigate to "/usr/share/tomcat7" and run "startup.sh":

cd /usr/share/tomcat7
./bin/startup.sh

With the default settings you should now be able to reach your Tomcat server home page by navigating to "http://your.ip.add.ress:8080", where you should hopefully see the homepage with a message saying:

“If you’re seeing this, you’ve successfully installed Tomcat. Congratulations!”

Now we need to set up management users for the manager app so we can easily deploy our WAR files containing our Java web applications. You need to edit “/usr/share/tomcat7/conf/tomcat-users.xml”:

sudo nano /usr/share/tomcat7/conf/tomcat-users.xml

Now add the following lines within the “<tomcat-users>” block to give access to the manager GUI:

<role rolename="manager-gui"/>
<user username="MANAGERUSER" password="YOURPASSWORD" roles="manager-gui"/>

Now you will be able to log in to the manager GUI at “http://your.ip.add.ress:8080/manager/html” using the login details MANAGERUSER and password YOURPASSWORD. You can deploy applications and generally manage your Tomcat install from here.
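
If you prefer the command line, Tomcat will also auto-deploy any WAR file dropped into its webapps directory (the file name below is just an example):

sudo cp ~/yourapplication.war /usr/share/tomcat7/webapps/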

The final thing to do is to set up Tomcat so that it starts every time your server starts. This is pretty easy as all you need to do in Ubuntu is create a "/etc/init.d/tomcat7" script:

sudo nano /etc/init.d/tomcat7

Now enter the following lines:

#!/bin/bash
# Tomcat auto-start
# description: Auto-starts tomcat
# processname: tomcat
# pidfile: /var/run/tomcat.pid

case $1 in
start)
    sh /usr/share/tomcat7/bin/startup.sh
    ;;
stop)
    sh /usr/share/tomcat7/bin/shutdown.sh
    ;;
restart)
    sh /usr/share/tomcat7/bin/shutdown.sh
    sh /usr/share/tomcat7/bin/startup.sh
    ;;
esac
exit 0

Set the permissions for the file:

sudo chmod 755 /etc/init.d/tomcat7

Add Tomcat to system startup as a service using the command:

sudo update-rc.d tomcat7 defaults

Now you can test that Tomcat is set up as a service using:

sudo service tomcat7 restart

Now, to check everything is working on system startup, reboot your machine using:

sudo reboot

Navigate to "http://your.ip.add.ress:8080" where the Tomcat home page should appear with no problems. Note: if you are having problems reaching your Tomcat home page, make sure you have opened port 8080 on your server's firewall.
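
If you are using ufw, for example, port 8080 can be opened with:

sudo ufw allow 8080/tcp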

It’s definitely worth reading some of the documentation on Tomcat, plenty of which is linked off your newly installed Tomcat home page. You should now have all you need to deploy your Java web applications as WAR files which is really easy using the manager GUI provided by Tomcat.

Install the latest Java 7 JDK on Ubuntu Linux Server 10.04 without apt-get

I was trying to set up Apache Tomcat on an older server running Ubuntu 10.04 and noticed that Java wasn't actually installed by default. Also, the licensing agreement with Oracle seems to have changed and it is no longer possible to just use apt-get to install it. You have to manually accept the licensing agreement, so even "wget" won't work any more (you just get a "download-fail-XXXXXXX.html" file instead).

First up, you have to go to the Java JDK download page and manually accept the licensing agreement. This must be done from a PC with a UI and a browser, so no "wget". You need to get the correct version, which in my case was the x86 tar.gz version. When this is downloaded you should have a "jdk-7u9-linux-i586.tar.gz" file, which then needs to be copied to your Ubuntu server (I copied it to my user's home directory).
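
One way to copy the archive over is scp (the username and hostname below are placeholders):

scp jdk-7u9-linux-i586.tar.gz youruser@your.server.address:~/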

Now that you have the file on your Ubuntu server, you can extract it using:

tar -xvf jdk-7u9-linux-i586.tar.gz

This should give you a directory "jdk1.7.0_09" which we need to move to somewhere sensible such as "/usr/lib/jvm/jdk1.7.0_09":

sudo mv jdk1.7.0_09 /usr/lib/jvm/jdk1.7.0_09

Now we need to set up a symbolic link so that we can run Java from everywhere:

sudo ln -fs /usr/lib/jvm/jdk1.7.0_09/bin/java /usr/bin/java

Now check that Java is all installed correctly by checking the version using:

java -version

Which in this case should give:

java version "1.7.0_09"
Java(TM) SE Runtime Environment (build 1.7.0_09-b05)
Java HotSpot(TM) Client VM (build 23.5-b02, mixed mode)

Now you can set up your JAVA_HOME variable at a system level so other applications can use Java by editing “/etc/environment”:

sudo nano /etc/environment

Now add the following line to point to your newly installed Java:

JAVA_HOME="/usr/lib/jvm/jdk1.7.0_09"

Now if you open up a new session (not your current session) and type "echo $JAVA_HOME" you should see the path "/usr/lib/jvm/jdk1.7.0_09", which means the variable has been set correctly.
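
For example, in the new session:

echo $JAVA_HOME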

Test your Polycom and Tandberg video conference hardware is working from your desktop using Polycom PVX

I needed to test that our (fairly ancient) Tandberg 6000 video conferencing system was working but our portable Polycom unit typically used for this purpose had failed. Trying to find a software alternative to do this is like pulling teeth as Polycom have discontinued their PVX software and replaced it with RealPresence Desktop and a suite of other software with no trial or free edition. The Polycom PVX software is the easiest way of testing our video conferencing hardware using software on the desktop but has not been updated since 2007.

Polycom PVX does still exist out there and a trial version can be downloaded from Polycom themselves (from their support site). You can also find the software to download from a few other sites such as Software Informer. With this trial edition you only get 5 minutes of video but it is enough to connect to your hardware and check everything is working. The interface is awful but it does work and the software connected to our Tandberg system fine using my computer’s webcam and mic etc.

Moving from wordpress.com to wordpress.org

I finally made the decision to move my blog to my own hosting after a few years of hosting on wordpress.com. There are a lot of good reasons to stay on wordpress.com, such as its fantastic "googleability": search engines seem excellent at indexing blogs hosted there. The disadvantage is that you are heavily restricted in what you can do and you have to pay extra for a lot of features that I think should come as standard.

First I downloaded WordPress, set up my database and ran the install to set up a wordpress.org installation on my own server. Then I went into the admin panel of my wordpress.com site and went to "Store" then "Site Redirect". It cost $13 for a year of redirection and I pointed the existing jamesrossiter.wordpress.com site at my new site blog.jamesrossiter.co.uk. This is a permanent redirect (301) so Google should pick up on this and update the listings for my site.

I then exported my site from wordpress.com by going to "Tools" -> "Export" and imported it into my new site at blog.jamesrossiter.co.uk using "Tools" -> "Import", which included all the comments on the blog.

Finally, in order to make sure all my posts were correctly forwarded to the right URL I set the permalinks at blog.jamesrossiter.co.uk to be the same format as those used by blogs at wordpress.com which is “Day and name” and is set in “Settings” -> “Permalinks”.

Use PHP_Compat to use more recent PHP functions in old versions of PHP

As stated in my previous post, we needed to stick to PHP 5.2 rather than upgrade to a more recent version. One of the many downsides of this, aside from the security implications, is that you miss out on a large number of useful newer functions.

In our case we missed the “str_getcsv()” function when reading in a .CSV file line by line. A really quick way to get around this, with or without using PEAR, was to download PHP_Compat, which includes a huge number of compatible functions from later versions of PHP.

If you have PEAR installed you can simply run the following to enable the functions:

pear install PHP_Compat

But if you are like us, running a version of Linux without PEAR installed by default, you can just download the library directly. Once downloaded, place it in your web application directory (or elsewhere depending on your configuration) and use a require_once() command in each PHP script for each function that you need, e.g. (adjust the path to wherever you placed the library):

require_once 'PHP/Compat/Function/str_getcsv.php';

If you are on shared hosting or cannot get a more recent version of PHP, this library is a lifesaver.

Upgrade PHP from 5.1.6 to 5.2.17 on CentOS

The default install of PHP on our CentOS 5.5 box was 5.1.6, which is very out of date (we are currently using PHP 5.3 elsewhere while we figure out how to get around some very serious problems with 5.4). Unfortunately, we needed to upgrade to PHP 5.2 and no further as 5.3 meant upgrading MySQL and potentially breaking compatibility with our web application.

It used to be that you could add the CentOS testing repositories and just update PHP, but as PHP 5.2 is deprecated this option is no longer available. The solution is to use the Atomic repositories, which can be added to your CentOS install by typing:

wget -q -O - http://www.atomicorp.com/installers/atomic | sh

This will add a new repository file "/etc/yum.repos.d/atomic.repo", which means we can use their packages as well as those from CentOS. Now we need to make sure that we don't upgrade our PHP beyond 5.2, so we add a single line to "/etc/yum.conf" under the [main] section:

exclude=php-*5.3*

The exclusion means we will include packages from all repositories other than anything that matches “php-*5.3*” so PHP 5.3 won’t be installed as part of an upgrade.
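
Before upgrading, you can double-check which version yum will now offer with:

yum list php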

Now just upgrade PHP and restart Apache:

yum update php

service httpd restart

You can check which PHP version you have using:

php -v

Now obviously you want to use a more recent version of PHP than 5.2 but in the rare case where you have to, the previous commands make things very easy.

Easily add multiple lightbox style popup videos to your page using Flowplayer and jQuery Tools Overlay

Adding a video to your page is easy enough by embedding directly or by using something like flashembed to embed Flash. What we needed to do was have multiple links on a page that would each open specific videos, whilst still keeping the same generic JavaScript somewhere in a separate page template file. The accompanying CSS had to stay in a separate CSS file and also needed to be generic. Our videos were all in .FLV format and needed to pop over the content in a nice overlay so as not to disturb the flow of the page.

Blacktrash have a nice demo of the kind of functionality we needed but it wasn't quite generic enough to manage our very different sized videos. The code below, based on this example using Flowplayer and jQuery Tools Overlay, is simple enough that you can easily add multiple videos to a page. The size of the video player is determined by the HTML5-style "data-width" and "data-height" attributes on each "a" link, which are picked up by jQuery and used to resize the player. If you do not specify these then the default is used, as given by the CSS.

<!-- include the flowplayer, jQuery and jQuery Tools libraries -->
<script src="http://releases.flowplayer.org/js/flowplayer-3.2.11.min.js"></script>
<script src="http://ajax.googleapis.com/ajax/libs/jquery/1.7/jquery.js" type="text/javascript" charset="utf-8"></script>
<script src="http://cdn.jquerytools.org/1.2.6/all/jquery.tools.min.js" type="text/javascript" charset="utf-8"></script>

<!-- set up the flowplayer to play videos in an overlay using jQuery Tools Overlay -->
<script type="text/javascript">
  $(function () {
    var player;
    // attach a jQuery Tools overlay to every link with rel="#overlay"
    $("a[rel='#overlay']").overlay({
      mask: { color: '#000', opacity: 0.2 },
      onLoad: function () {
        var trigger = this.getTrigger();
        // set the player size from data-width and data-height if given (otherwise the CSS defaults apply)
        if (trigger.attr("data-width")) { $("#player").css("width", trigger.attr("data-width") + "px"); }
        if (trigger.attr("data-height")) { $("#player").css("height", trigger.attr("data-height") + "px"); }
        // create player object and load in the href of the link clicked as the source of the video
        player = $f("player", "http://releases.flowplayer.org/swf/flowplayer-3.2.15.swf", trigger.attr("href"));
        player.load();
      },
      onClose: function () {
        // unload the player when the overlay closes so the video stops
        if (player) { player.unload(); }
      }
    });
  });
</script>
<!-- set up the CSS to hide the overlay by default and set up default height and width for the player -->
<style type="text/css">
    #overlay {
      display: none;
      padding: 40px;
    }
    /* position and size for the close button (adjust to suit your page) */
    .close {
      background: url(http://flowplayer.org/media/img/overlay/close.png) no-repeat;
      position: absolute;
      top: 2px;
      right: 5px;
      width: 35px;
      height: 35px;
    }
    #player {
      display: block;
      width: 600px;
      height: 300px;
      margin: 0;
    }
    #player *:focus {
      outline-style: none;
    }
</style>

<!-- each link contains a reference to the video to be played in flowplayer, the JavaScript above ensures they are played in an overlay -->
<a rel="#overlay" href="http://www.sitename.com/video/video1.flv">Video 1</a>
<a rel="#overlay" href="http://www.sitename.com/video/video800x600.flv" data-height="600" data-width="800">800x600 Video</a>

<!-- the overlay <div> itself is empty apart from a &nbsp; and hidden using CSS to make sure it isn't shown on screen -->
<div id="overlay"><a class="close"></a><div id="player">&nbsp;</div></div>


You can do a lot more with Flowplayer and overlays, so this is just a starting point, but the approach is generic enough that any of the web team can add nice popup videos to a page just by adding a simple "a" link.

Quickly add large numbers of users to a computer or server using addusers.exe

addusers.exe is a great tool that is included in the Windows 2000 Resource Kit and is described by Microsoft as a "32-bit administrative utility that uses a comma-delimited text file to create, modify, and delete user accounts". The main benefit is being able to use something like Excel or Notepad to create a list of users, passwords and other details and then instantly generate their user accounts. We had to do this recently for almost 100 user accounts and it took seconds once we had written the text file. Although this isn't officially supported on Windows 2003 or later, it still worked for us.

You can download addusers.exe direct from Microsoft's FTP server or from Petri.co.il.

The command we used to import a list of users contained in “listofusers.txt” to our server “servername” using addusers.exe was:

addusers /C C:\listofusers.txt \\servername /P:LE

Where the text file listofusers.txt looked something like:

username1,User Fullname,p4ssword,email@somesite.com
username2,User2 Fullname,p4ssword2,email2@somesite.com

You can do a lot more with addusers.exe but this is a very handy and quick way of managing the creation of large numbers of users.
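
The same tool can also dump the existing accounts on a machine to a file, which is a handy way to get a template to edit (the server name and file name below are placeholders):

addusers /D C:\existingusers.txt \\servername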

Easily migrate content between SharePoint servers using stsadm.exe and fix the HTTP 401.1 site collection administrator local login problem

I needed to migrate content from one install of Windows SharePoint Services 3.0 on Windows 2003 to another physical server and ran into an issue when checking my site locally before changing all the DNS records over. The actual export and import process is quite easy but can take a while if there are a lot of subsites or files within your SharePoint portal.

Migrating between SharePoint installs using stsadm.exe

The easiest way to export an entire SharePoint site is to use stsadm.exe, which is typically located in "C:\Program Files\Common Files\Microsoft Shared\web server extensions\12\BIN" and is available when you have installed SharePoint. I added this location to the PATH variable so I could run it from anywhere to make things easier. Thankfully the documentation is good (thanks Microsoft) and you can find more explicit details on exporting and importing from TechNet. Note that this is similar to other versions of SharePoint and more information on various methods of migration can be found on Microsoft Support, including moving databases directly. Chris O'Brien also has a good post about the various approaches.

The simple stsadm command I used to export my portal (including all files and subsites etc) at http://portal.sitename.com was:

stsadm -o export -url http://portal.sitename.com -filename sitename.bak

This produced a lot of ~30 MB files and a good log of everything that took place. All the Active Directory user permissions were also included in the export, which was one of my big worries moving to a new server. For local users you have more of a problem as these don't exist on the new server and need to be recreated.

To import your site back into another SharePoint install, you have to first make sure there is an existing web application and associated site collection (on the root “/”) before copying over all your exported files to the new server. The first time I tried I assumed it would regenerate the site collection based on my export from scratch, but apparently not. The command I used to import was:

stsadm -o import -url http://portal.sitename.com -filename sitename.bak

Now even though you will probably have a lot of files from the export process, you do not need to specify them all, just the main one, in this case “sitename.bak”. After a while your new site will be populated with all the content from your export and is ready for testing before you go live and change your DNS records to point to the new server.
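
If the root site collection does not exist yet on the new server, it can be created beforehand with stsadm as well (the owner details below are placeholders):

stsadm -o createsite -url http://portal.sitename.com -ownerlogin DOMAIN\administrator -owneremail admin@sitename.com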

Testing your newly migrated site locally, avoiding HTTP 401.1

As I was using remote desktop to access my new server to run the stsadm.exe import command, I wanted to test the site locally by logging in with my site collection administrator details before changing the DNS over. To do this I set up a reference in my hosts file "C:\Windows\system32\drivers\etc\hosts" on the new server to point http://portal.sitename.com to localhost (127.0.0.1), then tried to visit http://portal.sitename.com within my remote desktop session. This is where I hit an HTTP 401.1 login error due to a security setting built into Windows 2003, even though I tried logging in with the correct site collection administrator details.
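
For reference, the hosts entry is just a single line mapping the portal address to the loopback address:

127.0.0.1    portal.sitename.com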

This is apparently a security fix in Windows Server 2003 SP1 that stops reflection attacks and, according to Microsoft, "authentication fails if the FQDN or the custom host header that you use does not match the local computer name". The details on how to fix this are located at Microsoft Support and I've noted the easiest fix below, which removes the loopback check entirely.

First you need to disable strict name checking by editing the registry on your server. Open the registry editor (run regedit.exe) and go to "HKEY_LOCAL_MACHINE\System\CurrentControlSet\Services\LanmanServer\Parameters". Now click "Edit -> New -> DWORD Value", name it "DisableStrictNameChecking", then right-click it and set it to decimal "1".

Now go to "HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\Lsa" and click "Edit -> New -> DWORD Value", name it "DisableLoopbackCheck", then right-click it and set it to decimal "1" as well.

You need to restart your server for the changes to have any effect, and once you have, you should be able to log in to your local site at http://portal.sitename.com with your site collection administrator details without hitting an HTTP 401.1 error. Now you can test out the site before changing the DNS records to point to the new server and removing your hosts file entry.

Easily connect and use PHP with SharePoint lists using cURL, RSS and NTLM authentication

Connecting to SharePoint from PHP is actually not that difficult if you have the cURL extension installed on your web server. In the case of my XAMPP Windows development server I just made sure the following line in php.ini ("c:\xampp\php\php.ini" in my case) was uncommented before restarting Apache:

extension=php_curl.dll

In Ubuntu/Linux you can usually just install the packages for cURL and after restarting Apache it will become available. Just type the following on the command line:

sudo apt-get install curl libcurl3 libcurl3-dev php5-curl

Then restart Apache:

sudo /etc/init.d/apache2 restart

Now the following code comes from both Norbert Krupa’s comment on David’s IT Blog and a question about parsing HTML on StackOverflow. The important thing to note is that I needed to use cURL to authenticate my domain user when connecting to my secure SharePoint Services 3.0 test site. Apparently you can get away without using cURL on sites that don’t need authentication but the same cURL code listed below can be used with a blank username and password for the same effect.

The goal of this listing is to connect to SharePoint using a domain user (can also be a local user if SharePoint is set up that way) and retrieve the contents of a SharePoint list. The trick is to supply the RSS feed url, which allows PHP to parse the RSS feed and neatly list the contents of a SharePoint list. An advantage of using RSS feeds of SharePoint lists is that they are secured using the same method as the list itself and require no extra configuration on the SharePoint side of things. You can also set the RSS feed to only show a set number of items or days, which is useful for regularly updated lists.

// generic function to get the inner HTML of a DOM node
function get_inner_html( $node ) {
    $innerHTML = '';
    $children = $node->childNodes;
    foreach ($children as $child) {
        $innerHTML .= $child->ownerDocument->saveXML( $child );
    }
    return $innerHTML;
}

// username and password to use (for NTLM authentication this is typically DOMAIN\username)
$usr = 'DOMAIN\\USERNAME';
$pwd = 'PASSWORD';
// URL to fetch, this is the address of the RSS feed (go into a list and click "Actions" -> "View RSS Feed" to get the url)
$url = "http://www.sharepointsite.com/_layouts/listfeed.aspx?List=%7BCED7CDDC-49C0-4C46-BDE6-CFC2BA993C84%7D";
//Initialize a cURL session
$curl = curl_init();
//Return the transfer as a string from curl_exec() instead of outputting it directly
curl_setopt($curl, CURLOPT_RETURNTRANSFER, true);
//Set URL to fetch
curl_setopt($curl, CURLOPT_URL, $url);
//Force HTTP version 1.1
curl_setopt($curl, CURLOPT_HTTP_VERSION, CURL_HTTP_VERSION_1_1);
//Use NTLM for HTTP authentication
curl_setopt($curl, CURLOPT_HTTPAUTH, CURLAUTH_NTLM);
//Username:password to use for the connection
curl_setopt($curl, CURLOPT_USERPWD, $usr . ':' . $pwd);
//Stop cURL from verifying the peer's certificate
curl_setopt($curl, CURLOPT_SSL_VERIFYPEER, false);
//Execute cURL session
$result = curl_exec($curl);
//Close cURL session
curl_close($curl);

$xml = simplexml_load_string($result);

// display results on screen
foreach ($xml->channel->item as $Item) {
    echo "<br/>($Item->title)";
    // the other list columns come back as HTML <div> elements inside the item description
    $doc = new DOMDocument();
    @$doc->loadHTML((string) $Item->description);
    $ellies = $doc->getElementsByTagName('div');
    foreach ($ellies as $one_el) {
        if ($ih = get_inner_html($one_el)) {
            echo ", $ih";
        }
    }
}

The SharePoint RSS feed is a little interesting as the "$Item->title" object is the main column in the list, but the rest of the list is encapsulated in <div> elements within "$Item->description", hence the requirement to parse the HTML.

For a SharePoint list with 3 columns the output will look something like:

(Item 1 Title) , Other Column A: xxxx, Other Column B: yyyy
(Item 2 Title) , Other Column A: zzzz, Other Column B: kkkk

Now the potential for this is great as it allows us to securely synchronise SharePoint lists with external databases, use SharePoint for authenticating PHP applications, etc. We are going to be using this for automatically pulling users from a SharePoint list to populate a separate PHP application, whilst keeping user-identifiable data locked away on SharePoint.