Note: In the literature, the Seebeck coefficient is most often given relative to platinum as a reference, because measuring the absolute Seebeck coefficient is challenging. It is, however, possible; this paper, for example, does it (for those without access, the gist is that the absolute Seebeck coefficient of a superconductor is zero, and from there one integrates the Thomson coefficient).
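As I understand it, the integration the paper alludes to can be sketched using the second Kelvin relation, $\mu_T = T\,\mathrm{d}S/\mathrm{d}T$, where $\mu_T$ is the Thomson coefficient (the notation here is mine, not the paper's): below its critical temperature a superconductor has $S = 0$, so integrating upward from there gives the absolute Seebeck coefficient,

$$S(T) = \int_0^{T} \frac{\mu_T(T')}{T'}\,\mathrm{d}T'.$$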
In the Wikipedia article on the Seebeck coefficient, it is stated that the sign of the Seebeck coefficient is determined by the sign of the predominant charge carriers (in materials with an abundance of free charge carriers). This would make sense, since the Seebeck effect in such materials can be explained by a diffusion of free charge carriers from hot to cold.
Now here lies my problem: while in metals the predominant free charge carriers are electrons (holes normally don't matter), the Seebeck coefficient is negative only for some metals. For platinum the absolute Seebeck coefficient is positive at low temperatures and negative at higher ones (about $5\ \mu\mathrm{V/K}$ at around 60 K, $-5\ \mu\mathrm{V/K}$ at around 300 K, $-20\ \mu\mathrm{V/K}$ at 1200 K; see this paper, the same as in the first paragraph). Those are relatively small variations around zero, but still. We see the same even at room temperature: the list of Seebeck coefficients in the Wikipedia article gives values relative to platinum from $900\ \mu\mathrm{V/K}$ (selenium) to $-72\ \mu\mathrm{V/K}$ (bismuth). That makes for very positive (and very negative) absolute Seebeck coefficients (remember: platinum has around $-5\ \mu\mathrm{V/K}$ at room temperature).
To be honest, I had a bit of a hard time finding sources other than Wikipedia for this sign convention, but it is what follows directly from the theoretical explanation, and it seems to be confirmed by the Mott formula for metals.
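For reference, the Mott formula I mean is the usual one for degenerate electron systems (sign conventions vary between sources depending on whether $e$ denotes the elementary charge or the electron charge, so take the overall sign with a grain of salt):

$$S = \frac{\pi^2 k_B^2 T}{3e} \left[\frac{\partial \ln \sigma(E)}{\partial E}\right]_{E = E_F},$$

where $\sigma(E)$ is the energy-dependent conductivity and $E_F$ the Fermi energy.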
So, what is the solution here? Is Wikipedia wrong? Have I missed something in the papers?