Page 296 - DCAP101_BASIC_COMPUTER_SKILLS

Unit 14: Web Server Applications


     - XSS viruses can cause high traffic because of millions of infected browsers and/or
       Web servers;
     - Internet bots, when traffic is not filtered/limited on large web sites with very few
       resources (bandwidth, etc.);
     - Internet (network) slowdowns, so that client requests are served more slowly and
       the number of connections increases so much that server limits are reached;
     - partial unavailability of Web servers (computers). This can happen because of
       required or urgent maintenance or upgrades, hardware or software failures,
       back-end (e.g., database) failures, etc.; in these cases the remaining Web servers
       receive too much traffic and become overloaded.

                 14.1.6 Overload Symptoms
The symptoms of an overloaded Web server are:
     - requests are served with (possibly long) delays, from one second to a few hundred
       seconds;
     - 500, 502, 503, 504 HTTP errors are returned to clients (sometimes unrelated 404
       errors, or even 408 errors, may also be returned);
     - TCP connections are refused or reset (interrupted) before any content is sent to
       clients;
     - in very rare cases, only partial content is sent (this behavior may well be
       considered a bug, even if it usually results from unavailable system resources).
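The status codes listed above can be checked programmatically. As an illustrative sketch only (the helper names are assumptions, not from the text), a small Python probe that classifies a server's response into the symptom categories above:

```python
import socket
import urllib.error
import urllib.request

# Status codes the symptom list above associates with overload.
OVERLOAD_STATUSES = {500, 502, 503, 504, 408}

def classify_status(code):
    """Map an HTTP status code to one of the symptom categories above."""
    if code in OVERLOAD_STATUSES:
        return "possible overload"
    if 200 <= code < 300:
        return "ok"
    return "not overload-related"

def probe(url, timeout=5.0):
    """Hypothetical helper: request `url` and report how the server responded."""
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            return classify_status(resp.status)
    except urllib.error.HTTPError as e:
        # Server answered, but with an error status.
        return classify_status(e.code)
    except (socket.timeout, TimeoutError):
        return "timeout (long delay)"
    except (ConnectionResetError, ConnectionRefusedError):
        return "connection refused/reset"
    except urllib.error.URLError as e:
        return f"unreachable ({e.reason})"
```

A monitoring script could call `probe()` periodically and raise an alert when "possible overload" or "timeout" responses start to dominate.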
                 14.1.7 Anti-overload Techniques

To partially overcome the load limits above and to prevent overload, most popular Web
sites use common techniques such as:
     - managing network traffic, by using:
          - firewalls to block unwanted traffic coming from bad IP sources or having
            bad patterns;
          - HTTP traffic managers to drop, redirect, or rewrite requests having bad
            HTTP patterns;
          - bandwidth management and traffic shaping, in order to smooth down peaks
            in network usage;
     - deploying Web cache techniques;
     - using different domain names to serve different (static and dynamic) content
       from separate Web servers, e.g.:
          - http://images.example.com
          - http://www.example.com
     - using different domain names and/or computers to separate big files from small
       and medium-sized files; the idea is to fully cache small and medium-sized files
       and to efficiently serve big or huge (over 10-1000 MB) files by using different
       settings;
     - using many Web servers (programs) per computer, each one bound to its own
       network card and IP address;
     - using many Web servers (computers) grouped together so that they act, or are
       seen, as one big Web server (see also Load balancer);
     - adding more hardware resources (i.e., RAM, disks) to each computer;
     - tuning OS parameters for hardware capabilities and usage;
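Several of the techniques above (HTTP traffic managers, bandwidth management, traffic shaping) reduce to some form of rate limiting. As an illustration only, not an implementation from the text, a minimal token-bucket admission check in Python might look like:

```python
import time

class TokenBucket:
    """Minimal token-bucket sketch: a request is admitted only if a token is
    available; tokens refill at `rate` per second up to `capacity`.
    Class and parameter names are illustrative assumptions."""

    def __init__(self, rate, capacity):
        self.rate = rate            # tokens added per second
        self.capacity = capacity    # maximum burst size
        self.tokens = capacity      # start with a full bucket
        self.last = time.monotonic()

    def allow(self):
        """Return True if the request may proceed, False if it should be
        dropped or delayed (traffic shaping)."""
        now = time.monotonic()
        # Refill tokens for the elapsed time, capped at capacity.
        self.tokens = min(self.capacity,
                          self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= 1:
            self.tokens -= 1
            return True
        return False
```

A front-end traffic manager could keep one such bucket per client IP and reject (e.g., with a 503) any request for which `allow()` returns False, smoothing down the peaks in network usage mentioned above.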





                                                   LOVELY PROFESSIONAL UNIVERSITY                                  289