Box 3.6 How long before you become impatient?
Usability specialist Jakob Nielsen noted, early in the life of the web (Nielsen, 1994), that the basic advice on response times for human–computer interaction has been about the same for thirty years. He describes these requirements for response from any computer system:
- 0.1 second is about the limit for having the user feel that the system is reacting instantaneously, meaning that no special feedback is necessary except to display the result.
- 1.0 second is about the limit for the user's flow of thought to stay uninterrupted, even though the user will notice the delay. Normally, no special feedback is necessary during delays of more than 0.1 but less than 1.0 second, but the user does lose the feeling of operating directly on the data.
- 10 seconds is about the limit for keeping the user's attention focused on the dialogue. For longer delays, users will want to perform other tasks while waiting for the computer to finish, so they should be given feedback indicating when the computer expects to be done. Feedback during the delay is especially important if the response time is likely to be highly variable, since users will then not know what to expect.
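As a rough sketch of how these thresholds might translate into interface behaviour, the Python fragment below classifies a completed operation against Nielsen's three limits and reports what feedback would have been appropriate. The function name and threshold constants are our own illustration, not from Nielsen; a real interface would show the indicator while the task runs rather than afterwards.

    import time

    # Nielsen's response-time thresholds (seconds)
    INSTANT = 0.1      # feels instantaneous; no feedback needed
    FLOW = 1.0         # flow of thought interrupted beyond this
    ATTENTION = 10.0   # attention lost beyond this; show progress

    def run_with_feedback(task, *args):
        """Run a task, then classify the delay against the thresholds."""
        start = time.monotonic()
        result = task(*args)
        elapsed = time.monotonic() - start

        if elapsed <= INSTANT:
            pass  # no feedback needed; just display the result
        elif elapsed <= FLOW:
            print("(brief delay - a cursor change would suffice)")
        elif elapsed <= ATTENTION:
            print("(busy indicator shown for %.1f s)" % elapsed)
        else:
            print("(progress bar with time-remaining estimate needed)")
        return result

    # Example: a simulated slow server request (about 2 seconds)
    print(run_with_feedback(lambda: time.sleep(2) or "page content"))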
Speed of access of a customer, employee or partner to services on an e-business server is determined both by the speed of the server and by the speed of the network connection to it. The speed of the site governs how fast the response is to a request for information from the end-user. This will depend on the speed of the server machine on which the web site is hosted and on how quickly the server processes the information. If only a small number of users are accessing information on the server, there will be no noticeable delay on requests for pages. If, however, thousands of users are requesting information at the same time, there may be a delay, and it is important that the combination of web server software and hardware can cope. Web server software will not greatly affect the speed at which requests are answered. The speed of the server is mainly controlled by the amount of primary storage (for example, 1024 MB RAM is faster than 512 MB RAM) and the speed of the magnetic storage (hard disk). Many search-engine web sites now store all their index data in RAM, since this is faster than reading data from the hard disk. Companies will pay ISPs according to the capabilities of the server.
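The point about RAM versus disk can be demonstrated with a small, deliberately crude timing experiment: the sketch below builds the same index in memory and in a file, then compares a single lookup in each. The file name and data sizes are arbitrary illustrations.

    import os
    import time

    # Build an index of 100,000 records held in RAM (a dict)
    records = {i: "record %d" % i for i in range(100_000)}

    # Write the same records to a file to simulate a disk-based index
    with open("index.txt", "w") as f:
        for i, value in records.items():
            f.write("%d\t%s\n" % (i, value))

    def lookup_in_ram(key):
        return records[key]

    def lookup_on_disk(key):
        # Scan the file line by line until the key is found
        with open("index.txt") as f:
            for line in f:
                k, value = line.rstrip("\n").split("\t")
                if int(k) == key:
                    return value

    start = time.perf_counter()
    lookup_in_ram(99_999)
    ram_time = time.perf_counter() - start

    start = time.perf_counter()
    lookup_on_disk(99_999)
    disk_time = time.perf_counter() - start

    print("RAM lookup:  %.6f s" % ram_time)
    print("Disk lookup: %.6f s" % disk_time)
    os.remove("index.txt")  # clean up the temporary file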
As an indication of the factors that affect performance, the DaveChaffey.com website has a shared plan from the hosting provider which offers:

- 2400 GB bandwidth
- 200 MB application memory
- 60 GB disk space (this is the hosting capacity, which doesn't affect performance).
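A back-of-envelope calculation shows what such a bandwidth allowance means in practice. The average page weight of 100 KB assumed below is purely illustrative; it is not a figure from the plan.

    # Back-of-envelope capacity check for the shared plan above
    bandwidth_gb_per_month = 2400
    avg_page_kb = 100  # assumed average page weight, illustrative only

    page_views = bandwidth_gb_per_month * 1024 * 1024 / avg_page_kb
    print("~%.0f million page views/month" % (page_views / 1e6))
    # ~25 million page views/month

    # Sustained transfer rate the monthly allowance implies
    seconds_per_month = 30 * 24 * 3600
    mbit_per_s = bandwidth_gb_per_month * 8 * 1024 / seconds_per_month
    print("~%.1f Mbit/s average" % mbit_per_s)  # ~7.6 Mbit/s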
Dedicated server: a server that contains content and applications for a single company only.

An important aspect of hosting selection is whether the server is dedicated or shared (co-located). Clearly, if content on a server is shared with other sites hosted on the same server, then performance and downtime will be affected by the demand loads on those other sites. But a dedicated server package can cost 5 to 10 times the amount of a shared plan, so many small and medium businesses are better advised to adopt a shared plan, but to take steps to minimize the risk of being affected when other sites on the server go down.
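One such step is simple availability monitoring, so that problems on the shared server are spotted quickly. The sketch below, with an illustrative URL and a threshold of our own choosing, performs a single check of this kind; in practice it would be run on a schedule from a separate machine.

    import time
    import urllib.request

    def check_site(url, timeout=10):
        """Return (is_up, response_seconds) for a single request."""
        start = time.monotonic()
        try:
            with urllib.request.urlopen(url, timeout=timeout) as response:
                ok = response.status == 200
        except OSError:
            ok = False  # network error, timeout or HTTP failure
        return ok, time.monotonic() - start

    up, seconds = check_site("https://www.example.com/")
    if not up or seconds > 10:
        print("ALERT: site down or unacceptably slow (%.1f s)" % seconds)
    else:
        print("OK in %.2f s" % seconds)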
For high-traffic sites, the load may be spread across several server computers, each with multiple processors. New distributed methods of hosting content, summarized by Spinrad (1999), have been introduced to improve the speed of serving web pages for very large corporate sites. These methods involve distributing copies of content on servers around the globe, and the most widely used service is Akamai (www.akamai.com). Such services are used by companies such as Yahoo!, Apple and other 'hot-spot' sites likely to receive many hits.
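A hint of this global distribution is visible from an ordinary DNS lookup: a hostname fronted by such a service resolves to whichever edge servers are near the requester, so the same query answered in different countries returns different addresses. The hostname below is an illustrative example, and the addresses returned will vary by location and over time.

    import socket

    # Resolve a hostname served through a content-distribution network
    # and list the edge-server addresses DNS hands back from here
    hostname = "www.apple.com"
    infos = socket.getaddrinfo(hostname, 80, proto=socket.IPPROTO_TCP)
    addresses = sorted({info[4][0] for info in infos})
    print("%s resolves to edge server(s): %s" % (hostname, addresses))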