---
title: Using GoAccess with Nginx to replace Google Analytics
url: using-goaccess-with-nginx-to-replace-google-analytics.html
date: 2021-01-25
draft: false
---

**Table of contents**

1. [Opting for log parsing](#opting-for-log-parsing)
2. [Getting Nginx ready](#getting-nginx-ready)
3. [Getting GoAccess ready](#getting-goaccess-ready)
4. [Securing with Basic authentication](#securing-with-basic-authentication)

I know! You cannot simply replace Google Analytics by parsing access logs and displaying a couple of charts. But to be honest, I never used Google Analytics to its fullest extent anyway; I was usually just interested in page hits and which pages were visited most often.

I recently moved my blog from Firebase to a VPS and decided to remove the Google Analytics tracking code from the site. It is quite invasive: it tracks users across other pages, building a profile of each visitor, and I've had enough of it. But I still need some insight into what is happening on the server, which content is being read the most, and so on.

I have looked at many existing solutions, like:
- [Umami](https://umami.is/)
- [Freshlytics](https://github.com/sheshbabu/freshlytics)
- [Matomo](https://matomo.org/)

But the more I looked at them, the more I noticed that I was replacing one evil with another. Don't get me wrong: some of these solutions are absolutely fantastic, but they would require installing a database and something like PHP or Node, and I was not ready to put those things on my fresh server. Having Docker installed was also out of the question.

## Opting for log parsing

So, I defaulted to parsing the already existing logs and generating HTML reports from that data.

I found this amazing piece of software, [GoAccess](https://goaccess.io/), which provides all the functionality I need, and it's a single binary written in C.

GoAccess can be used in two different modes.

![GoAccess Terminal](/media/goaccess-terminal.png)
<center><i>Running in a terminal</i></center>

![GoAccess Browser](/media/goaccess-browser.png)
<center><i>Running in a browser</i></center>

I, however, need this to run in a browser, so the second option is the way to go. The idea is to run a cronjob periodically and export the report into a folder that is then served by Nginx behind Basic authentication.

## Getting Nginx ready

I chose Ubuntu on [DigitalOcean](https://www.digitalocean.com/). First I installed [Nginx](https://nginx.org/en/), the [Let's Encrypt](https://letsencrypt.org/getting-started/) certbot, and all the necessary dependencies.

```sh
# log in as the root user
sudo su -

# first, let's update the system
apt update && apt upgrade -y

# then install everything we need
apt install nginx certbot python3-certbot-nginx apache2-utils
```

After all this is installed, we can create a new configuration for the statistics site. The stats will be available at `stats.domain.com`.
| 58 | |||
```sh
# create the directory where the HTML will be hosted
mkdir -p /var/www/html/stats.domain.com

cp /etc/nginx/sites-available/default /etc/nginx/sites-available/stats.domain.com
nano /etc/nginx/sites-available/stats.domain.com
```

```nginx
server {
    root /var/www/html/stats.domain.com;
    server_name stats.domain.com;

    index index.html;
    location / {
        try_files $uri $uri/ =404;
    }
}
```

Now we check if the configuration is OK with `nginx -t`. If it is, we can restart Nginx with `service nginx restart`. Note that on Debian-style layouts the configuration also has to be linked into `sites-enabled` (e.g. `ln -s /etc/nginx/sites-available/stats.domain.com /etc/nginx/sites-enabled/`) before Nginx will load it.

After all that, you should add an A record for this subdomain that points to the IP of your droplet.

Before enabling SSL you should test whether the DNS record has propagated, for example with `curl stats.domain.com`.
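
If you would rather check the record without going through HTTP, you can also ask the system resolver directly. A small sketch (`stats.domain.com` is a placeholder; substitute your own hostname):

```shell
#!/bin/sh
# Query the system resolver for the name; a failed lookup means the
# A record has not propagated to your resolver yet.
if getent hosts stats.domain.com; then
    echo "record resolves"
else
    echo "not propagated yet"
fi
```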

Now it's time to provision a TLS certificate. To do this, execute `certbot --nginx`, follow the wizard, and when asked about redirection choose option 2 (always redirect to HTTPS).

When this is done you can visit https://stats.domain.com, and you should get a 404 Not Found error, which is correct since we haven't generated any content yet.

## Getting GoAccess ready

If you are using a Debian-like system, GoAccess should be available in the repository; otherwise refer to the official website.

```sh
apt install goaccess
```

To enable geolocation we also need one additional thing, a GeoIP database:

```sh
cd /var/www/html/stats.domain.com
wget https://github.com/P3TERX/GeoLite.mmdb/raw/download/GeoLite2-City.mmdb
```

Now we create a shell script that will be executed every 10 minutes.

```sh
nano /var/www/html/stats.domain.com/generate-stats.sh
```

The contents of this file should look like this:

```sh
#!/bin/sh

# merge plain and rotated (gzipped) access logs into a single file
zcat -f /var/log/nginx/access.log* > /var/log/nginx/access-all.log

goaccess \
  --log-file=/var/log/nginx/access-all.log \
  --log-format=COMBINED \
  --exclude-ip=0.0.0.0 \
  --geoip-database=/var/www/html/stats.domain.com/GeoLite2-City.mmdb \
  --ignore-crawlers \
  --real-os \
  --output=/var/www/html/stats.domain.com/index.html

# clean up the merged file once the report is generated
rm /var/log/nginx/access-all.log
```

Because Nginx rotates its access logs over time and compresses the older ones, we use [`zcat`](https://linux.die.net/man/1/zcat) with `-f` to decompress the gzipped files, pass the plain ones through unchanged, and concatenate everything into one file containing all the access logs. After the report is generated, we delete this merged file.
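
To see why the `-f` flag matters, here is a tiny self-contained demo (the file names and contents are made up; nginx's real logs live in `/var/log/nginx/`):

```shell
#!/bin/sh
# Simulate nginx log rotation: one plain log plus one gzipped rotation.
tmp=$(mktemp -d)
printf 'hit 1\nhit 2\n' > "$tmp/access.log"
printf 'hit 3\n' > "$tmp/access.log.2"
gzip "$tmp/access.log.2"                # becomes access.log.2.gz

# With -f, zcat decompresses .gz files and passes plain files through,
# so both kinds can be merged into a single stream.
zcat -f "$tmp"/access.log* > "$tmp/access-all.log"
wc -l < "$tmp/access-all.log"           # all three lines end up in one file

rm -r "$tmp"
```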

If you want to exclude results from your home IP, look at the `--exclude-ip` option in the script and replace `0.0.0.0` with your own home IP address. You can find it by executing `curl ifconfig.me` from your local machine, NOT from the droplet.

Test the script by executing `sh /var/www/html/stats.domain.com/generate-stats.sh` and then checking `https://stats.domain.com`. If you see stats instead of a 404, you are set.

It's time to add the script to cron with `crontab -e`.

```
*/10 * * * * sh /var/www/html/stats.domain.com/generate-stats.sh
```

## Securing with Basic authentication

You probably don't want the stats to be publicly available, so we should create a user and a password for Basic authentication.

First, create a password for a user named `stats` with `htpasswd -c /etc/nginx/.htpasswd stats`.
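
`htpasswd` comes from the `apache2-utils` package we installed earlier. If you prefer not to depend on it, an equivalent entry can be generated with `openssl` instead (the username `stats` and password `secret` here are placeholders):

```shell
#!/bin/sh
# Build an htpasswd-style line: "user:hash". The -apr1 scheme is the
# Apache MD5 variant, which nginx's auth_basic_user_file accepts.
hash=$(openssl passwd -apr1 'secret')
printf 'stats:%s\n' "$hash"             # redirect this into /etc/nginx/.htpasswd
```

Appending the line with `>>` instead of rerunning `htpasswd -c` avoids truncating an existing password file.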

Now we update the config file with `nano /etc/nginx/sites-available/stats.domain.com`. You will probably notice that the file looks a bit different than before; this is because `certbot` added additional rules for SSL.

The `location` portion of the config file should now look like the following; you only need to add the `auth_basic` and `auth_basic_user_file` lines.

```nginx
location / {
    try_files $uri $uri/ =404;
    auth_basic "Private Property";
    auth_basic_user_file /etc/nginx/.htpasswd;
}
```

Test whether the config is still valid with `nginx -t`, and if it is, restart Nginx with `service nginx restart`.

If you now visit `https://stats.domain.com` you should be prompted for a username and password. If not, try reopening your browser, since it may have cached the unauthenticated response.

That is all. You now have analytics for your server, refreshed every 10 minutes.
