
In his article *word2vec Parameter Learning Explained*, Xin Rong uses dot notation as a matrix index (on the second page, in the first formula).

$$ h = W^{T}x = W^{T}_{(k,\cdot)} := v^{T}_{w_I} \tag 1$$

If I understand the notation correctly, it means summing over all the elements.

Am I correct in thinking that this is just a formalism in the notation, since, as stated in the paper,

$$ x_k=1 \ \text{and} \ x_{k'}=0 \ \text{for} \ k' \neq k $$

all the other elements are zero and no summation is necessary?

Is this just notation to keep operations on matrices and vectors consistent, or am I missing something important?

I found a similar question, "What does a dot mean in matrix element index?", but the answer seems specific to that case, and the link provided in the comments doesn't open. So if somebody could point to a book/textbook that introduces this notation, please help.

Oliver Mason

1 Answer


The notation $W_{k, \cdot}$ likely refers to the $k$-th row of $W$. The dot just suggests taking all the elements in the row. It is just a convenient way to index an entire row of a matrix.
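A minimal NumPy sketch (with a hypothetical small matrix `W`) illustrating both points: multiplying $W^T$ by a one-hot vector $x$ involves no real summation and simply picks out the $k$-th row of $W$, which is what $W_{(k,\cdot)}$ denotes:

```python
import numpy as np

# Hypothetical example matrix W (4 rows, 3 columns), just for illustration.
W = np.arange(12.0).reshape(4, 3)

# One-hot input: x_k = 1 and x_{k'} = 0 for k' != k, as in the paper.
k = 2
x = np.zeros(4)
x[k] = 1.0

# The full matrix-vector product h = W^T x ...
h = W.T @ x

# ... equals the k-th row of W, i.e. W_{(k, .)}:
print(np.allclose(h, W[k, :]))  # True
```

Every term of the sum $\sum_{k'} W_{k',j}\, x_{k'}$ vanishes except the one with $k' = k$, so the "summation" reduces to reading off row $k$.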

Similar usage of this indexing notation appears in *Linear Algebra Done Right* (3rd edition) by Sheldon Axler, Section 3.44.