Web analytics professionals use various tools to collect data and monitor customer visits. For websites, the following methods are primarily used to capture data:
A web log file is created on the server that hosts the website. This file records every request made to the website, including the IP address, the date and time of the request, the time taken to process the request, the number of bytes transferred, and the referral URL. Each request typically produces one line of text in the file. Some of the primary reasons to use web logs for analytics include the following:
- Traffic history of website visitors is available for analysis, including the time users spend on the site, each user's navigation path through the website, bounce rates, and other valuable information.
- Traffic from both humans and bots is recorded in the web logs. Understanding the behavior of bots and search engine spiders helps when optimizing websites for search results.
- Site errors are also tracked, which helps in identifying broken links, missing pages, and other problems.
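Because each request is one line of text with a fixed layout, log entries can be parsed mechanically. The following Python sketch parses a line in the widely used Combined Log Format; the layout and the sample entry are assumptions for illustration, not taken from the text above.

```python
import re

# Combined Log Format fields: IP, identity, user, timestamp, request line,
# status code, bytes transferred, referral URL, and user agent.
LOG_PATTERN = re.compile(
    r'(?P<ip>\S+) \S+ \S+ \[(?P<timestamp>[^\]]+)\] '
    r'"(?P<request>[^"]*)" (?P<status>\d{3}) (?P<bytes>\d+|-) '
    r'"(?P<referrer>[^"]*)" "(?P<user_agent>[^"]*)"'
)

def parse_log_line(line):
    """Return a dict of fields from one Combined Log Format entry, or None."""
    match = LOG_PATTERN.match(line)
    if match is None:
        return None
    entry = match.groupdict()
    # "-" means no body was transferred for this request
    entry["bytes"] = 0 if entry["bytes"] == "-" else int(entry["bytes"])
    return entry

# Hypothetical log line, invented for illustration
sample = ('203.0.113.7 - - [05/Oct/2024:13:55:36 +0000] '
          '"GET /index.html HTTP/1.1" 200 2326 '
          '"http://example.com/start.html" "Mozilla/5.0"')
entry = parse_log_line(sample)
print(entry["ip"], entry["status"], entry["bytes"])
# → 203.0.113.7 200 2326
```

Fields such as the referral URL and timestamp parsed this way are the raw material for the traffic-history and error analyses listed above.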
This method of data collection, commonly called page tagging, involves adding a small code snippet (usually JavaScript) to every page of the website. Most analytics tools use this method. On every page visit, the snippet runs and sends data to the tool's server. These tools usually tag each visitor with a cookie. Advantages of this method include the following:
- Data is collected from every page view, unlike web log data, which can be affected by browser caching. If a page is cached on a user's machine, repeat visits to that page are never registered in the server log file; the page tag still fires on cached pages, so these visits are recorded and the collected data is more accurate than log files.
- Events on the web page that do not require a request to the server, such as mouse clicks, mouse-overs, and interactions with multimedia elements, can be tracked.
- Additional information about the visitor's browser and system, such as screen size, resolution, and operating system, is captured.
- Traffic from bots and search engine spiders, which generate high volumes of requests but are not representative of genuine visitor behavior, is not captured.
- Operating costs are low because the traffic data is captured and maintained by an external service provider. This also reduces the time spent maintaining data storage and handling related issues.
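The snippet on each page typically gathers browser details and a visitor identifier from a cookie, then sends them to the collection server as query parameters on a beacon request. The following Python sketch assembles such a payload; the endpoint URL and all field names are illustrative assumptions, not a real vendor API.

```python
import uuid
from urllib.parse import urlencode

def build_beacon_url(collect_endpoint, page, visitor_id=None,
                     screen=None, os_name=None):
    """Assemble the URL a page tag would request on each page view.

    If no visitor_id cookie exists yet, a new one is generated; the tag
    would store it in a cookie so the visitor is recognized on later visits.
    """
    if visitor_id is None:
        visitor_id = uuid.uuid4().hex  # new cookie value for a first-time visitor
    params = {
        "vid": visitor_id,             # cookie-based visitor identifier
        "page": page,                  # page being viewed
        "screen": screen or "unknown", # e.g. "1920x1080"
        "os": os_name or "unknown",    # operating system name
    }
    return f"{collect_endpoint}?{urlencode(params)}"

# Hypothetical collection endpoint and field values, for illustration
url = build_beacon_url(
    "https://analytics.example.com/collect",
    page="/products", visitor_id="abc123",
    screen="1920x1080", os_name="Windows",
)
print(url)
```

Because the beacon fires from the browser itself, it carries the client-side details (screen size, resolution, operating system) and cached-page visits that never appear in server log files.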
Here is an example of web analytics: