Things move on

As regular readers of this magazine will have noticed by now, there have been a few changes to the Real World Computing section, not least to the pictures of us contributors – it’s rumoured that some of them are being cut out and used to frighten horses and small children. The demise of the Web Business column I wrote with Paul Ockenden is sad, but things must move on, and nowhere is this more true than in the world of the internet. When we first started that column, e-commerce was difficult to do and it was impossible to get a secure certificate unless you had an American bank account, while DHTML was still a twinkle in the W3C’s eye.

How times have changed: now e-commerce is available to anyone with a PayPal account, and a simple cut-and-paste gives you a shopping basket. Forums, content management, diaries, blogs and image store add-ons are all available free, and the list just goes on. It looks as though things have got a whole lot easier, and in some ways that’s true – you can now build an acceptable store without writing any code, using one of the many free hosting systems out there. The problem is that user and client expectations have increased enormously to match, and it’s becoming increasingly difficult to come up with something different. The competition is huge, and the wealth of options facing the developer and IT manager is more confusing than ever.

Browser incompatibilities have – thanks to style sheets – reared their ugly head again after a quiet spell while Internet Explorer ruled the roost and Apple users remained in posh seclusion. Now Apple-based browsers represent a significant proportion of visitors to most stores, and the differences in the way the various browsers render style sheets will give me a pile to write about in this column. The move away from straightforward HTML to XML, the use of “proper” programming languages for web applications and the increasing use of web services all mean we need to reassess how we build a particular website. There’s still a place for the traditional static site maintained using products like Dreamweaver, but parts of such sites may need to be replaced by web applications that perform some particular function in a more pleasing way than a jerky sequence of static pages can.

It’s this new environment that led PC Pro to rename this column, shifting the emphasis from pure e-commerce business to web applications of all kinds. This doesn’t mean the new column will be all code and programming, though: Kevin Partner and I will cover all aspects of running a site, along with anything else you tell us you might be interested in. Meanwhile, Paul, a confessed laptop addict, has moved over to the Mobile & Wireless column, and I’ll be picking his brains frequently on such topics, as I plan to write from time to time from remote Scottish islands (another love of mine). In short, Web Applications isn’t just going to be about writing web-based applications, but about how to apply the web to make it work for you and your business. I hope you’ll find it useful and interesting, and please email me with any feedback you have.

Makeover in ASP.NET

First off this month, I thought I’d show how taking an existing website and applying web application ideas to it can lead to a better customer experience and a slicker look. I’ll be keeping the code on these pages as simple as possible, highlighting the bits needed to make things work rather than obscuring them with lots of error-checking code, so apologies to any hard-core programmers. As it happens, this example doesn’t involve writing any code at all, just a little customising of a query at the end – such is the power of the development tool I’ll be using.
The website I’m giving this makeover to is a live site containing a directory of engineering companies. The original site can be viewed at www.engineeringonline.co.uk, while the new version is at www.engineeringonline.co.uk. Its design was fairly standard stuff: three drop-downs populated from database tables, from which the user chooses a company type; as they select an option in one drop-down, the others are repopulated appropriately from the database. Each selection requires a new query and a round trip to the server, and that in turn means blanking and redrawing the browser window. Such a site was just crying out for a makeover using something like Flash, AJAX or, in this case, ASP.NET 2.

Like many other developers, I saw little point in recoding all our ASP 3 websites in ASP.NET 1, as in many cases all it delivered was a lot of grief for little reward. However, with version 2 and the new development tools ASP.NET now offers, there are very good reasons to reconsider that refusal. The feature I’ll be looking at here is called postback, which enables a web page to requery a database and use the returned record set to redraw areas of the page without causing that annoying flicker as the browser re-renders the whole page (in much the same way that AJAX works). To code this, I’m going to be using Microsoft’s new development tool, Visual Web Developer 2005 Express Edition.

In the past, a valid reason for not using Microsoft database and development tools was their sheer cost, but this has all changed with SQL Server 2005 Express and Visual Web Developer 2005 Express Edition, both of which are free. I shall be using them where possible in place of their Enterprise cousins, to show what can be done with these basic tools. I’m going to describe a simplified version of the website I revamped, which is easier to explain and gives the same effect. You should end up with a web page whose drop-down list is fed from a database, so that when a user makes a selection a detail area opens up on the page showing more information from the database – and those details get redrawn without any blanking.

First, open a new web application project in Visual Web Developer 2005 Express Edition, which will create all the support files needed, as well as a blank default.aspx page. Drag a DropDownList from the Toolbox onto this page; as you do, you’ll be asked to create a new data source. Point this at your database and follow the wizard through to build a query that returns the data to display in the drop-down list. Make sure you check the Enable AutoPostBack box on the list control, as this enables the AJAX-style functionality that stops the web page from blanking every time the data changes. It really is as easy as that.
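If you flip into Source view at this point, the markup the wizard generates looks roughly like the sketch below. The IDs are the designer’s defaults, while the connection string name (CompaniesDB) and the field name are illustrative assumptions based on the table used later in this article:

<%-- the AutoPostBack attribute is what the Enable AutoPostBack checkbox sets --%>
<asp:DropDownList ID="DropDownList1" runat="server" AutoPostBack="True"
    DataSourceID="SqlDataSource1"
    DataTextField="company_name" DataValueField="company_name" />

<%-- the data source the wizard built to feed the list --%>
<asp:SqlDataSource ID="SqlDataSource1" runat="server"
    ConnectionString="<%$ ConnectionStrings:CompaniesDB %>"
    SelectCommand="SELECT [company_name] FROM [Companies]" />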

Now you need some way of displaying the details of the record the user has selected via the drop-down list, so the next stage is to drag and drop a DetailsView control onto the page. This control will receive its data from a second data control, one that contains only a single record, filtered using the selection the user made in the drop-down list. Hence, you need to select New Datasource from the Tasks panel for this control and create a query that returns just those database fields you want to show in your DetailsView. Once you’ve done this, you’ll need to add a WHERE clause to the new query that takes its parameter from the current selection in the drop-down list.
This is the only place where you’ll need to do a little manual work. In the properties of this second data source, go to the SelectQuery property; clicking this opens the Command and Parameter Editor window. There, click the Add Parameter button, select Control as the Parameter Source and pick your drop-down list from the ControlID list, and you’ll see a parameter added to the Parameters panel. Now comes the only bit of actual typing you have to do: change the name of this parameter from “newparameter” to something more meaningful – I’ve called it SelName in this example. The final step is to add the all-important WHERE clause to the Select command panel, so you change:

SELECT * FROM [Companies]

to:

SELECT * FROM [Companies] WHERE company_name = @SelName

This syntax should be familiar to anyone who’s programmed in SQL Server: the @SelName term passes the variable value to the query. Once you’ve selected OK, all that’s left to do is press F5 to run your application in test mode.
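For reference, once the wizard has finished, the DetailsView and its filtered data source end up as markup along these lines – again a sketch, reusing the illustrative CompaniesDB connection string name and the SelName parameter from above:

<asp:DetailsView ID="DetailsView1" runat="server"
    DataSourceID="SqlDataSource2" AutoGenerateRows="True" />

<asp:SqlDataSource ID="SqlDataSource2" runat="server"
    ConnectionString="<%$ ConnectionStrings:CompaniesDB %>"
    SelectCommand="SELECT * FROM [Companies] WHERE company_name = @SelName">
    <SelectParameters>
        <asp:ControlParameter Name="SelName" ControlID="DropDownList1"
            PropertyName="SelectedValue" />
    </SelectParameters>
</asp:SqlDataSource>

The ControlParameter element is what the Command and Parameter Editor created for you: it binds @SelName to the drop-down’s current selection on every postback.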

It’s worth mentioning at this point that, so long as the web.config file has the necessary entry, you’ll be running with full debugging facilities on this web application, including breakpoints and watches, just as you’ve become used to when using Visual Basic and similar development tools. The required entry is the compilation element’s debug attribute, which should look like this:
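<configuration>
   <system.web>
      <!-- debug="true" enables breakpoints and watches; set it back to false before going live -->
      <compilation debug="true" />
   </system.web>
</configuration>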

However, it’s important to make sure on a live website that you set this debugging flag back to false in web.config, or the performance of your application will suffer considerably. If, as sysadmin, you want to prevent any of your developers from accidentally turning on debugging for the live server, which could give hackers detailed information about your site, add the following key to the machine.config file:
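<configuration>
   <system.web>
      <!-- retail mode forces debug off and suppresses remote error details for every site on the server -->
      <deployment retail="true" />
   </system.web>
</configuration>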

This will disable all debugging capabilities on the server, as well as remote viewing of error messages. The machine.config file normally lives in the C:\Windows\Microsoft.NET\Framework\v2.0.50727\CONFIG folder (where v2.0.50727 is the version number of the .NET Framework, so it may differ on your server).

What a relief it is to have this level of debugging available in a web tool, as it makes development so much easier, particularly as some of the errors thrown by .NET 2 can be obscure to say the least. Whenever I get stuck on a problem, first I search the net and if that doesn’t show anything useful I turn to Experts Exchange (www.experts-exchange.com), and often a message posted on there will reveal an answer. There’s no shame in using such systems if you get stuck: after all, as Magnus Magnusson said when asked about the difficulty of the questions on Mastermind: “An easy question is just one you know the answer to.”

Am I popular?

After building a website and taking it live, the next crucial stage is to monitor the number of visitors to your site. It’s important to set up this facility before you start promoting the site, as otherwise how are you going to tell whether that expensive advertising campaign (or cheap email shot) has done you any good? I’m not talking here about adding one of those dreadful hit counters you see on some sites; what’s needed is an analysis tool that provides a far more detailed view of what on the site is being looked at, for how long and by whom.
Such tools fall into two main groups. The first type imports your web server’s log files into a database and then lets you analyse the information in various ways; this type is popular with marketing departments, which love to drill down into the data and examine it along different axes. The second type processes the log files directly, without any intervening database, and generates its report, usually as a web page; this processing can be scheduled to run at specific times, and it’s this second type I tend to use for our own clients. In the past, I’ve always used WebTrends (www.webtrends.com), which, despite having its critics, has worked well for us over the years, producing pages like those at www.ecatsbridge.com.

However, the pricing structure for WebTrends has changed somewhat recently, and it no longer favours sites with large traffic and low income, since the licensing cost is now calculated on the number of page impressions rather than the number of sites. So I’ve been casting around for an alternative and, while it isn’t quite what I’m looking for myself, ClickTracks’ free Appetizer product (www.clicktracks.com) is definitely worth a look if you need to analyse a set of web server log files in a way non-technical marketing people can understand. Once it has imported your log files into its database, Appetizer lets you browse the data and run various analyses on the information. It also has a particularly cool way of showing the popularity of each link on a preview of the relevant web page. Obviously, the firm hopes you’ll be so taken with Appetizer that you’ll upgrade to one of its more expensive products, but as it stands some useful information can be gleaned from its analyses.

Greedy software

While on the subject of web server logs and activity, there’s a class of product that’s currently causing problems with excess bandwidth usage, as well as making websites appear much busier than they actually are. We’ve all seen the adverts for these “speed up your browsing” products, which often work simply by following the links on the page you’re currently viewing and loading those further pages in the background, acting as a form of local cache, so that when you do click on a link the page is already downloaded in your browser. This procedure is called pre-fetching.

The problem with this behaviour from a website owner’s point of view is that many hosting companies nowadays charge by the amount of data downloaded to the browser, and this can suddenly increase enormously with the use of such products thanks to hundreds of pages pre-fetched but never viewed. A site can also be rendered less responsive because of the increased number of requests, the majority of which are never needed. Exactly this happened with the release of Fasterfox, but at least this accelerator product offers a way to prevent it from doing this on your site. You’re probably already aware of the trick of placing a text file in the root of your website called robots.txt that instructs most search engines to ignore certain folders. Well, placing an entry into this file will also stop Fasterfox from trawling your website, and the required entry is as follows:

User-agent: Fasterfox
Disallow: /

It’s as simple as that. However, if you want to stop a product like Google’s Web Accelerator from pre-fetching, this method won’t work, because it totally ignores the robots.txt file (flying in the face of all convention). If you want to see whether your website is suffering from such extra traffic, look for “X-moz: prefetch” in your log files. As for stopping it, well, that might be a topic for a future column.
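One caveat about that check: the X-moz header will only show up in your logs if the server is configured to record it. As a sketch, on an Apache server with the standard mod_log_config module you could extend the usual combined format to capture it:

# combined log format plus the X-moz request header in the final field
LogFormat "%h %l %u %t \"%r\" %>s %b \"%{Referer}i\" \"%{User-Agent}i\" \"%{X-moz}i\"" prefetchlog
CustomLog logs/access_log prefetchlog

Any entry showing “prefetch” in that last field was requested by an accelerator rather than a human visitor.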
