After fixing a basis, a tensor with two indices is associated with a matrix, so let us discuss matrices here. Let $A$ be some $n\times n$ matrix. The first thing I want to point out is that the kind of decomposition you are studying is always possible, because it is a trivial identity:
$$A = \frac{A+A^T}{2} + \frac{A-A^T}{2} = \left(\frac{A+A^T}{2}- \frac{1}{n}\operatorname{Tr}(A)\mathbf{1}_n\right)+\frac{A-A^T}{2}+\frac{1}{n}\operatorname{Tr}(A)\mathbf{1}_n.$$
In the first equality we simply rewrite $A = \frac{A}{2}+\frac{A}{2}$ and then add and subtract $\frac{A^T}{2}$. In the second equality we add and subtract $\frac{1}{n}\operatorname{Tr}(A)\mathbf{1}_n$. So this is always a true identity, and you can check it by simplifying the RHS, which gives you $A$ back.
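If you want to see this concretely, here is a minimal numerical sketch (in Python/NumPy; the variable names and the random test matrix are just my own choices for illustration):

```python
import numpy as np

n = 4
rng = np.random.default_rng(0)
A = rng.standard_normal((n, n))      # an arbitrary test matrix
I = np.eye(n)

sym_traceless = (A + A.T) / 2 - np.trace(A) / n * I   # symmetric with zero trace
antisym       = (A - A.T) / 2                         # antisymmetric
trace_part    = np.trace(A) / n * I                   # proportional to the identity

# The three pieces sum back to A, and each has the claimed property.
assert np.allclose(sym_traceless + antisym + trace_part, A)
assert np.allclose(sym_traceless, sym_traceless.T) and np.isclose(np.trace(sym_traceless), 0)
assert np.allclose(antisym, -antisym.T)
```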
The first term is a symmetric matrix with zero trace, the second is an antisymmetric matrix, and the third is proportional to the identity matrix. Let $M_n(\mathbb{R})$ denote the space of $n\times n$ matrices with real entries. Then we have decomposed this vector space as
$$M_n(\mathbb{R}) = S_n(\mathbb{R})+A_n(\mathbb{R})+T_n(\mathbb{R}),$$
where $S_n(\mathbb{R})\subset M_n(\mathbb{R})$ is the subspace of symmetric matrices with zero trace, $A_n(\mathbb{R})\subset M_n(\mathbb{R})$ is the subspace of antisymmetric matrices, and $T_n(\mathbb{R})\subset M_n(\mathbb{R})$ is the subspace of matrices proportional to the identity. It is also easy to see that each of these subspaces meets the sum of the other two only in $\{0\}$ (use that the only matrix which is both symmetric and antisymmetric is $0$, and that the only traceless multiple of the identity is $0$): $$S_n(\mathbb{R})\cap \big(A_n(\mathbb{R})+T_n(\mathbb{R})\big)=\{0\},\quad A_n(\mathbb{R})\cap \big(S_n(\mathbb{R})+T_n(\mathbb{R})\big)=\{0\},\quad T_n(\mathbb{R})\cap \big(S_n(\mathbb{R})+A_n(\mathbb{R})\big)=\{0\}.$$
As a result, the decomposition is really a direct sum decomposition:
$$M_n(\mathbb{R}) = S_n(\mathbb{R})\oplus A_n(\mathbb{R})\oplus T_n(\mathbb{R}).$$
This already shows a nice aspect of this decomposition. Recall that whenever a vector space $V$ decomposes as $V = V_1\oplus V_2$ we have that if $v = v_1+v_2 = v_1'+v_2'$ where $v_1,v_1'\in V_1$ and $v_2,v_2'\in V_2$, then $v_1=v_1'$ and $v_2=v_2'$.
In particular, if you have two matrices and you break them up like that, they are equal if and only if the corresponding terms in the decomposition are equal. So the terms are smaller, simpler building blocks of the matrix.
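Another way to convince yourself that the sum really is direct is a dimension count: $\dim S_n(\mathbb{R}) = \frac{n(n+1)}{2}-1$, $\dim A_n(\mathbb{R}) = \frac{n(n-1)}{2}$ and $\dim T_n(\mathbb{R}) = 1$, which add up to $n^2 = \dim M_n(\mathbb{R})$. Here is a quick sketch of that check (the helper `E` and the particular bases below are just one convenient choice, not canonical):

```python
import numpy as np

def E(i, j, n):
    """Matrix unit: 1 in entry (i, j), zero elsewhere."""
    M = np.zeros((n, n))
    M[i, j] = 1.0
    return M

n = 3
S_basis = [E(i, j, n) + E(j, i, n) for i in range(n) for j in range(i + 1, n)] \
        + [E(i, i, n) - E(i + 1, i + 1, n) for i in range(n - 1)]               # symmetric, zero trace
A_basis = [E(i, j, n) - E(j, i, n) for i in range(n) for j in range(i + 1, n)]  # antisymmetric
T_basis = [np.eye(n)]                                                           # multiples of the identity

# Flatten every basis matrix into a vector of length n^2 and stack them.
vectors = np.array([M.ravel() for M in S_basis + A_basis + T_basis])
assert vectors.shape == (n * n, n * n)          # the dimensions add up to n^2
assert np.linalg.matrix_rank(vectors) == n * n  # and the union of the bases is linearly independent
```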
More than that, we have the representation theory aspect. Let $R\in {\rm SO}(n)$; a 2-covariant tensor transforms like
$$A'_{k\ell} = R^i_{\phantom i k}R^j_{\phantom{j}\ell} A_{ij} = (R^T)_k^{\phantom k i}A_{ij} R^j_{\phantom j \ell} = (R^T A R)_{k\ell}.$$
As such, the transformation by $R$ can be summarized in index-free notation as $R^T A R$. Rotations therefore act linearly on such matrices, which means the space carries a representation of the rotation group:
$$D(R)\cdot A = R^T A R.$$
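If you want a sanity check on the index gymnastics above, here is a small sketch (again NumPy; the random rotation is built by orthogonalizing a random matrix and fixing the determinant, which is just one convenient way to get an element of ${\rm SO}(n)$):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 3
A = rng.standard_normal((n, n))

# A random rotation: orthogonalize a random matrix and fix the determinant to +1.
Q, _ = np.linalg.qr(rng.standard_normal((n, n)))
R = Q if np.linalg.det(Q) > 0 else Q @ np.diag([1] * (n - 1) + [-1])

# A'_{kl} = R^i_k R^j_l A_{ij}, written out with einsum, against the matrix form R^T A R.
A_prime_indices = np.einsum('ik,jl,ij->kl', R, R, A)
assert np.allclose(A_prime_indices, R.T @ A @ R)
```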
A group representation is an assignment $g\mapsto D(g)$ of a linear operator $D(g)\in {\rm GL}(V)$ acting on some vector space $V$ to each group element $g\in G$, such that $D(g_1g_2)=D(g_1)D(g_2)$ and $D(1)=1$. In other words, $D$ is a group homomorphism, and therefore it realizes the group composition law as composition of linear maps on $V$.
Now, we have just seen that rotations act on these tensors. It is straightforward to show that if $A\in S_n(\mathbb{R})$ then $D(R)\cdot A\in S_n(\mathbb{R})$, if $A\in A_n(\mathbb{R})$ then $D(R)\cdot A\in A_n(\mathbb{R})$, and if $A\in T_n(\mathbb{R})$ then $D(R)\cdot A\in T_n(\mathbb{R})$. These are easy to check and you should verify them yourself; a quick numerical sanity check is sketched below.
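Here is that sanity check as a minimal sketch (the random rotation construction and the sample matrices are, again, just illustrative choices):

```python
import numpy as np

rng = np.random.default_rng(2)
n = 3
Q, _ = np.linalg.qr(rng.standard_normal((n, n)))
R = Q if np.linalg.det(Q) > 0 else Q @ np.diag([1] * (n - 1) + [-1])

def D(R, A):
    """Action of a rotation on a 2-index tensor: D(R).A = R^T A R."""
    return R.T @ A @ R

X = rng.standard_normal((n, n))
S = (X + X.T) / 2 - np.trace(X) / n * np.eye(n)   # a traceless symmetric matrix
W = (X - X.T) / 2                                 # an antisymmetric matrix
T = 2.7 * np.eye(n)                               # a multiple of the identity

# Each subspace is mapped into itself by D(R).
DS, DW, DT = D(R, S), D(R, W), D(R, T)
assert np.allclose(DS, DS.T) and np.isclose(np.trace(DS), 0)   # still in S_n
assert np.allclose(DW, -DW.T)                                  # still in A_n
assert np.allclose(DT, T)                                      # R^T (c 1) R = c 1 since R^T R = 1
```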
What this means is that the direct-sum components are invariant subspaces. The representation is therefore reducible, and when you break up the tensor like that you are actually picking out its irreducible parts. The irreducible parts are the simplest ones, since they cannot be broken down any further. In other words, you are just focusing on the elementary building blocks.