
I noticed that the Firefox browser plugin tells you the number of users currently connected through your Snowflake instance and the number of users that connected within the last 24 hours.

I was wondering if the same stats can be extracted from the log trail of Snowflake's Docker image. I couldn't find any documentation on how to interpret those logs. Do I have any option other than reading the entire source code?

PS: There's no tag for snowflake. Can somebody create it?

7_R3X

4 Answers


I wrote a script for exactly that purpose, see https://gist.github.com/Atrate/be4a7d308549c7a9fe281d2cdf578d21


Disclaimer: I am new to Snowflake, so I'm guessing a bit.

Looking at the schema here:

[Schema diagram of the Snowflake architecture, with the connection steps numbered]

And looking at my log output, a successful connection seems to look like this:

sdp offer successfully received.
Generating answer...
OnDataChannel
Connection successful.
OnOpen channel
connected to relay
[...]
OnClose channel
Traffic throughput (up|down): 110 KB|149 KB -- (512 OnMessages, 508 Sends, over 819 seconds)
copy loop ended
datachannelHandler ends

Skimming through the source code, I think that the interesting ones are:

  • The "sdp offer" message seems to be the offer relayed by the broker (number (2) in the picture).
  • "Generating answer" would be the answer going back to the broker (number (2) in the picture).
  • "Connection successful" and "OnOpen channel" seem to mean that the WebRTC connection was established (number (4) in the picture).
  • "Connected to relay" sounds like (5) in the picture.
  • I guess that you are mostly interested in the "Traffic throughput" summary, which seems to sum up what went through your Snowflake instance for a particular client.

I don't know what the Firefox plugin uses exactly, but I could imagine that it counts the number of connections that had a non-zero throughput, i.e. the number of "Traffic throughput" lines in the logs.
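Under that assumption, a rough count can be extracted with a few lines of Python. This is only a sketch of the idea, not one of the linked scripts; the regex is built from the `Traffic throughput` line shown in the log excerpt above:

```python
import re

# Matches lines like:
# Traffic throughput (up|down): 110 KB|149 KB -- (512 OnMessages, 508 Sends, over 819 seconds)
THROUGHPUT_RE = re.compile(r"Traffic throughput \(up\|down\): (\d+) KB\|(\d+) KB")

def summarize(log_lines):
    """Count closed connections and total up/down traffic in KB."""
    connections, up_kb, down_kb = 0, 0, 0
    for line in log_lines:
        m = THROUGHPUT_RE.search(line)
        if m:
            connections += 1
            up_kb += int(m.group(1))
            down_kb += int(m.group(2))
    return connections, up_kb, down_kb
```

Note that this only counts connections that have already closed (the summary line is written by the connection-teardown code), so it undercounts clients that are still connected.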


I made a Python script which can read a Snowflake log file and give you a summary: https://gist.github.com/Allstreamer/0bc711d4d35ddb523df560425da76513

This is probably most useful for people who use the compiled version, not the Docker version.


I took the script from @allstreamer, enhanced it a bit, and added a docker-compose template so it is easier to set everything up: Gist.

It has the following features:

  • Docker-compose is pre-configured to use the current directory to save the log file.
  • Reads in the log file and gives you statistics about the number of connections and the download/upload bandwidth used.
  • Configurable time windows (default: All time, Last 24h and Last Week).

Example output:

[All time ] Served 2978 People with ↑ 14.941 GB, ↓ 3.0149 GB

[Last 24h ] Served 406 People with ↑ 1.7402 GB, ↓ 0.4143 GB

[Last Week] Served 2473 People with ↑ 12.6182 GB, ↓ 2.5372 GB
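The time-window feature can be sketched roughly as follows. This is an illustrative sketch, not the gist itself, and it assumes each line carries the default Go-style log timestamp prefix (`YYYY/MM/DD HH:MM:SS`) that the proxy writes:

```python
import re
from datetime import datetime, timedelta

# Assumed line shape: Go log timestamp prefix followed by the throughput summary, e.g.
# 2023/03/14 10:00:00 Traffic throughput (up|down): 110 KB|149 KB -- (...)
LINE_RE = re.compile(
    r"^(\d{4}/\d{2}/\d{2} \d{2}:\d{2}:\d{2}) "
    r"Traffic throughput \(up\|down\): (\d+) KB\|(\d+) KB"
)

def window_stats(log_lines, now, window=None):
    """Connections served and up/down KB within `window` before `now`.

    Pass window=None for all-time statistics.
    """
    served, up_kb, down_kb = 0, 0, 0
    for line in log_lines:
        m = LINE_RE.match(line)
        if not m:
            continue
        ts = datetime.strptime(m.group(1), "%Y/%m/%d %H:%M:%S")
        if window is not None and now - ts > window:
            continue  # connection closed outside the requested window
        served += 1
        up_kb += int(m.group(2))
        down_kb += int(m.group(3))
    return served, up_kb, down_kb
```

Calling `window_stats(lines, datetime.now())`, `window_stats(lines, datetime.now(), timedelta(hours=24))`, and `window_stats(lines, datetime.now(), timedelta(weeks=1))` yields the three windows shown in the example output.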

Herget