ANNEX B.5 Impact assessment & Web-related general analytics


There are no globally agreed definitions within web analytics, as the industry bodies have been trying for some time to agree on definitions that are useful and definitive. The main bodies that have had input in this area are JICWEBS (the Joint Industry Committee for Web Standards in the UK and Ireland), ABCe (Audit Bureau of Circulations electronic, UK and Europe), the DAA (Digital Analytics Association), formerly known as the WAA (Web Analytics Association, US), and, to a lesser extent, the IAB (Interactive Advertising Bureau). However, many terms are used consistently from one major analytics tool to another, so the following list, based on those conventions, can be a useful starting point. Both the WAA and the ABCe provide more definitive lists for those who declare that their statistics use the metrics defined by either body.

Hit - A request for a file from the web server. Available only in log analysis. The number of hits received by a website is frequently cited to assert its popularity, but this number is extremely misleading and dramatically overestimates popularity. A single web page typically consists of multiple discrete files (often dozens), each of which is counted as a hit as the page is downloaded, so the number of hits is really an arbitrary number more reflective of the complexity of individual pages on the website than of the website's actual popularity. The total number of visits or page views provides a more realistic and accurate assessment of popularity.

Page view - A request for a file, or sometimes an event such as a mouse click, that is defined as a page in the setup of the web analytics tool. An occurrence of the script being run in page tagging. In log analysis, a single page view may generate multiple hits as all the resources required to view the page (images, .js and .css files) are also requested from the web server.

Event - A discrete action or class of actions that occurs on a website. A page view is a type of event. Events also include clicks, form submissions, keypresses, and other client-side user actions.

Visit / Session - A visit or session is defined as a series of page requests or, in the case of tags, image requests from the same uniquely identified client. A visit is considered ended when no requests have been recorded for some number of minutes. A 30-minute limit ("time out") is used by many analytics tools but can, in some tools, be changed to another number of minutes. Analytics data collectors and analysis tools have no reliable way of knowing whether a visitor has looked at other sites between page views; a series of events (page views, clicks, whatever is being recorded) counts as a single visit as long as successive events are no more than 30 minutes apart. Note that a visit can consist of one page view, or thousands.
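The 30-minute timeout rule can be sketched in Python (a minimal illustration; the function name and sample timestamps are assumptions, not part of any standard):

```python
from datetime import datetime, timedelta

SESSION_TIMEOUT = timedelta(minutes=30)

def sessionize(timestamps):
    """Group one visitor's event timestamps into visits (sessions).

    A new session starts whenever the gap since the previous event
    exceeds the 30-minute timeout."""
    sessions = []
    for ts in sorted(timestamps):
        if sessions and ts - sessions[-1][-1] <= SESSION_TIMEOUT:
            sessions[-1].append(ts)   # gap <= 30 minutes: same visit
        else:
            sessions.append([ts])     # gap too long: start a new visit
    return sessions

events = [datetime(2024, 1, 1, 9, 0),
          datetime(2024, 1, 1, 9, 10),
          datetime(2024, 1, 1, 10, 0),   # 50-minute gap -> new visit
          datetime(2024, 1, 1, 10, 5)]
print(len(sessionize(events)))  # 2
```

Note that, as the definition says, a visit of one event and a visit of thousands are both single visits under this rule.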

First Visit / First Session - (also called 'Absolute Unique Visitor' in some tools) A visit from a uniquely identified client that has theoretically not made any previous visits. Since the only way of knowing whether the uniquely identified client has been to the site before is the presence of a persistent cookie received on a previous visit, the First Visit label is not reliable if the site's cookies have been deleted since the visitor's previous visit.

Visitor / Unique Visitor / Unique User - The uniquely identified client that is generating page views or hits within a defined time period (e.g. day, week or month). A uniquely identified client is usually a combination of a machine (one's desktop computer at work, for example) and a browser (Firefox on that machine). The identification is usually via a persistent cookie that has been placed on the computer by the site page code. An older method, used in log file analysis, is the unique combination of the computer's IP address and the User-Agent (browser) information provided to the web server by the browser. It is important to understand that the "Visitor" is not the same as the human being sitting at the computer at the time of the visit, since an individual human can use different computers or, on the same computer, can use different browsers, and will be seen as a different visitor in each circumstance. Increasingly, but still somewhat rarely, visitors are uniquely identified by Flash LSOs (Local Shared Objects), which are less susceptible to deletion through privacy controls.
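The older log-file method can be sketched as a hash over the IP address and User-Agent string (a minimal illustration; the hash choice, truncation, and sample values are arbitrary assumptions):

```python
import hashlib

def visitor_id(ip, user_agent):
    """Older log-analysis identifier: one 'visitor' per unique
    combination of IP address and User-Agent string."""
    key = f"{ip}|{user_agent}".encode()
    return hashlib.sha256(key).hexdigest()[:16]

# Same machine, different browsers -> counted as two different visitors,
# even though one human may be behind both.
a = visitor_id("203.0.113.7", "Mozilla/5.0 Firefox/120.0")
b = visitor_id("203.0.113.7", "Mozilla/5.0 Chrome/120.0")
print(a != b)  # True
```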

Repeat Visitor - A visitor that has made at least one previous visit. The period between the last and current visit is called visitor recency and is measured in days.

New Visitor - A visitor that has not made any previous visits. This definition creates a certain amount of confusion (see common confusions below), and is sometimes substituted with analysis of first visits.

Impression - The most common definition of "Impression" is an instance of an advertisement appearing on a viewed page. Note that an advertisement can be displayed on a viewed page below the area actually displayed on the screen, so most measures of impressions do not necessarily mean the advertisement was actually viewable.

Single Page Visit / Singleton - A visit in which only a single page is viewed (a 'bounce').

Bounce Rate - The percentage of visits that are single page visits.

Exit Rate / % Exit - A statistic applied to an individual page, not a web site. Of the visits that include a given page, the percentage in which that page is the final page viewed in the visit.
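Both rates can be computed directly from per-visit page sequences; a minimal sketch with made-up visits (the page names are purely illustrative):

```python
# Each visit is the ordered list of pages viewed in that session.
visits = [["/home"],                        # single-page visit (a bounce)
          ["/home", "/products", "/buy"],
          ["/products", "/home"]]

# Bounce rate: share of visits with exactly one page view.
bounce_rate = sum(1 for v in visits if len(v) == 1) / len(visits)

def exit_rate(page):
    """Of the visits that include `page`, the share in which it is
    the final page viewed."""
    containing = [v for v in visits if page in v]
    exits = sum(1 for v in containing if v[-1] == page)
    return exits / len(containing)

print(round(bounce_rate, 2))   # 1 of 3 visits is a bounce
print(exit_rate("/home"))      # /home is the last page in 2 of its 3 visits
```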

Page Time Viewed / Page Visibility Time / Page View Duration - The time a single page (or a blog, Ad Banner...) is on the screen, measured as the calculated difference between the time of the request for that page and the time of the next recorded request. If there is no next recorded request, then the viewing time of that instance of that page is not included in reports.
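This "difference between consecutive requests" rule, including the exclusion of the final page, can be sketched as (timestamps and page names are illustrative assumptions):

```python
from datetime import datetime

# (timestamp, page) requests within one visit, in order.
requests = [(datetime(2024, 1, 1, 9, 0, 0), "/home"),
            (datetime(2024, 1, 1, 9, 1, 30), "/products"),
            (datetime(2024, 1, 1, 9, 4, 0), "/buy")]

# Duration of each page = next request time - this request time.
# The final page has no next request, so its viewing time is unknown
# and is excluded, exactly as described above.
durations = [(page, (nxt - ts).total_seconds())
             for (ts, page), (nxt, _) in zip(requests, requests[1:])]
print(durations)  # [('/home', 90.0), ('/products', 150.0)]
```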

Session Duration / Visit Duration - Average amount of time that visitors spend on the site each time they visit. This metric can be complicated by the fact that analytics programs cannot measure the length of the final page view.

Average Page View Duration - Average amount of time that visitors spend on an average page of the site.

Active Time / Engagement Time - Average amount of time that visitors spend actually interacting with content on a web page, based on mouse moves, clicks, hovers and scrolls. Unlike Session Duration and Page View Duration / Time on Page, this metric can accurately measure the length of engagement in the final page view, but it is not available in many analytics tools or data collection methods.

Average Page Depth / Page Views per Average Session - Page Depth is the approximate "size" of an average visit, calculated by dividing total number of page views by total number of visits.

Frequency / Sessions per Unique - Frequency measures how often visitors come to a website in a given time period. It is calculated by dividing the total number of sessions (or visits) by the total number of unique visitors during a specified time period, such as a month or year. It is sometimes used interchangeably with the term "loyalty."
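Both Average Page Depth and Frequency are simple divisions of the aggregate counts; a minimal sketch with illustrative monthly totals:

```python
# Illustrative monthly totals from an analytics report.
page_views = 12000
visits = 4000
unique_visitors = 1600

avg_page_depth = page_views / visits   # page views per average session
frequency = visits / unique_visitors   # sessions per unique visitor

print(avg_page_depth)  # 3.0
print(frequency)       # 2.5
```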

Click path - The chronological sequence of page views within a visit or session.

Click - A single instance of a user following a hyperlink from one page in a site to another.

Site Overlay - A report technique in which statistics (clicks) or hot spots are superimposed, by physical location, on a visual snapshot of the web page.

 

Direct sources for the evaluation materials of this section are: usability.net, allaboutux.org, alexa.com.

 


Table 18: Impact assessment measures and techniques

Columns: High Level Evaluation Objective | Key Indicators | Evaluation techniques/Measuring ways | Measuring tools | Success targets/thresholds

Objective: Popularity/Usage (utility & usability metrics)
- Key indicator: Number of frequent participants (per different category of actor)
  Technique: Access and use of the actual ecosystem infrastructure; anticipated as "field testing" in the context of the HF assessment, corresponding to real-life use of the ecosystem by all kinds of actors
  Measuring tool: Automatic logging mechanisms
  Target: High and representative number of users across most categories of actors identified (*representativeness as a quality and quantifiable index will probably be provided in the impact assessment plan (D404.1) and the business models developed within SP1)
- Key indicator: Rate of newcomers (per different category of actor)
  Measuring tool: Automatic logging mechanisms
  Target: Representative number of newcomers from most categories of actors; number of subscribers / number of unique visits > 3
- Key indicator: Number of downloads
  Measuring tool: Automatic logging mechanisms
  Target: At least one download per unique visitor

Objective: Successfulness
- Key indicator: Percentage of successful interactions/transactions
  Measuring tool: Automatic logging mechanisms
  Target: Ratio of successful/complete interactions to unsuccessful/incomplete interactions > 3

Objective: User acceptance
- Key indicator: User acceptance attributes
  Measuring tool: Online feedback forms
  Target: Self-reported acceptance above 5 on a 7-point Likert scale

Objective: Quality of life (of all different participating parties)
- Key indicator: Level of satisfaction (of all different participating parties)
  Measuring tool: Built-in online feedback forms
  Target: Self-reported increase in perceived satisfaction (above 3 on a 5-point Likert scale)

Objective: Sustainability
- Key indicator: Maintenance effort/cost
  Measuring tool: Internal feedback forms targeting platform operators and SP2ers
  Target: Self-reported minimal maintenance effort and cost (two or below on a 5-point Likert scale)
- Key indicator: Content transferability effort/cost
  Measuring tool: Internal feedback forms targeting platform operators and SP2ers
  Target: Self-reported minimal effort and cost for transferring/duplicating the content to another host (two or below on a 5-point Likert scale)

Objective: Growth potential
- Key indicator: Rate of newcomers (per different category of actor)
  Measuring tool: Automatic logging mechanisms
  Target: Estimated 10% monthly increase of newcomers relative to current visitors (this remains to be validated by the models to be developed in SP1)

Objective: Competitiveness
- Key indicator: What is better/worse than other systems (strengths/weaknesses)?
  Measuring tool: Built-in online feedback forms
  Target: Increased perceived competitiveness of the platform compared to others (3 out of 5 on a 5-point Likert scale)

Objective: Cost-effectiveness
- Key indicator: Number of downloads vs. number of complaints
  Measuring tools: Built-in online feedback forms & automatic logging mechanisms
  Targets: Download rate increases with the visitor/subscriber rate (good correlation expected); number of complaints is less than 5% of the number of visits

Objective: Organizational quality
- Key indicator: Percentage of successful interactions & number of complaints
  Measuring tools: Built-in online feedback forms & automatic logging mechanisms
  Targets: Successful interactions are reported in more than 75% of all recorded interactions; number of complaints is less than 5% of the number of visits

Objective: Globalisation
- Key indicators: Percentage of successful interactions, number of downloads, and rates of participants/newcomers per continent/country
  Measuring tool: Automatic logging mechanisms
  Targets: Successful interactions are reported in more than 75% of all recorded interactions per continent and per country; number of complaints is less than 5% of the number of visits per continent and per country; at least one download per visitor/subscriber per continent and country

Objective: Privacy/Security
- Key indicator: Number/severity of relevant complaints
  Measuring tool: Built-in online feedback forms
  Targets: Number of reported complaints is and remains minimal (e.g. fewer than 5% of visits); severity remains low in 85% of all reported complaints

Objective: Financial performance
- Key indicator: Overall conversion rate
  Measuring tool: Platform analytics
  Target: At least 60% of visitors complete a transaction/order
- Key indicator: Transaction abandonment rate
  Measuring tool: Platform analytics
  Target: The number of visitors who start but do not complete a transaction is less than 25% of those successfully completing one
- Key indicator: Tools/products per transaction
  Measuring tool: Platform analytics
  Target: Cannot be defined in advance; depends on the number of available project outcomes and the number of anticipated external products to be added/shared
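The conversion and abandonment thresholds can be checked mechanically; a minimal sketch with hypothetical counts (the 60% and 25% limits come from the targets above, the counts are invented):

```python
# Hypothetical monthly counts from platform analytics.
visitors = 1000
started = 700     # visitors who started a transaction
completed = 620   # visitors who completed a transaction/order

conversion_rate = completed / visitors
# Abandoned transactions measured relative to completed ones,
# matching the threshold's phrasing.
abandonment_ratio = (started - completed) / completed

print(conversion_rate >= 0.60)    # True: meets the conversion target
print(abandonment_ratio < 0.25)   # True: meets the abandonment target
```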

Objective: Strength and stability of methodology and final predictions (overview of ecosystem)
- Key indicator: Predictive validity
  Measuring tools: Online feedback forms for end-users and stakeholders, and internal feedback forms for project partners and platform operators
  Target: High correlation among different types of impact metrics for each high-level objective (if and wherever applicable)
- Key indicator: Predictive reliability
  Measuring tools: Online feedback forms for end-users and stakeholders, and internal feedback forms for project partners and platform operators
  Target: High correlation of impact metrics with platform analytics across continents/countries and groups of actors
- Key indicator: Predictive sensitivity
  Technique: For data collected by the platform analytics, could be partially based on the ratio True Positives / (True Positives + False Negatives)
- Key indicator: Predictive specificity
  Technique: For data collected by the platform analytics, could be partially based on the ratio True Negatives / (True Negatives + False Positives)
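The two ratios above are the standard confusion-matrix definitions of sensitivity and specificity; a minimal Python sketch with purely illustrative counts:

```python
# Hypothetical confusion-matrix counts (illustrative only).
tp, fn = 80, 20   # true positives, false negatives
tn, fp = 90, 10   # true negatives, false positives

sensitivity = tp / (tp + fn)   # share of actual positives correctly predicted
specificity = tn / (tn + fp)   # share of actual negatives correctly predicted

print(sensitivity)  # 0.8
print(specificity)  # 0.9
```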