If the underlying function is smooth, the data points are not (!) statistically independent: neighbouring points are similar and therefore correlated, which complicates the error propagation mathematically. One way to tackle this problem is the following:
- First, estimate the correlation length of the dataset; suppose you find it to be $r$.
- Now consider only every $r$-th data point. These thinned points are (approximately) uncorrelated, so you can use the simple formula
\begin{align}
y &= \frac{\sum_{i=1}^N w_i x_{i}}{\sum_{i=1}^N w_i}
= \frac{1}{N \bar{w}}\sum_{i=1}^N w_i x_{i}
~~~\textrm{where $w_i$ are the weights} \\
\Rightarrow \sigma_y^2 &\approx
\sum_{i=1}^N
\left(
\frac{\partial y}{\partial x_{i}}
\right)^2 \sigma_{x_{i}}^2
=
\frac{1}{(N \bar{w})^2}
\sum_{i=1}^N
w_i^2 \sigma_{x_{i}}^2
\end{align}
Mathematical details: I implicitly replaced the dataset $\{x_i\}$ by $\{x_{1+i\cdot r}\}$, i.e. I kept only every $r$-th data point; $N$ in the formula above then denotes the size of this thinned dataset.
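As a concrete illustration, here is a minimal Python sketch of this thinning approach. The $1/e$ threshold on the autocorrelation is one common convention for defining the correlation length (not prescribed above), the function names are my own, and the per-point uncertainties $\sigma_{x_i}$ and weights $w_i$ are assumed to be known:

```python
import numpy as np

def correlation_length(x):
    """Estimate the correlation length r as the first lag at which the
    normalized autocorrelation drops below 1/e (one common convention)."""
    d = np.asarray(x, dtype=float)
    d = d - d.mean()
    acf = np.correlate(d, d, mode="full")[d.size - 1:]
    acf = acf / acf[0]                      # normalize so acf[0] == 1
    below = np.nonzero(acf < 1.0 / np.e)[0]
    return int(below[0]) if below.size else d.size

def weighted_mean_thinned(x, sigma, w, r):
    """Weighted mean y and sigma_y from every r-th point, mirroring the
    formula above: sigma_y^2 = sum(w_i^2 sigma_i^2) / (sum w_i)^2."""
    xs, ss, ws = (np.asarray(a, dtype=float)[::r] for a in (x, sigma, w))
    y = np.sum(ws * xs) / np.sum(ws)
    var_y = np.sum(ws**2 * ss**2) / np.sum(ws)**2
    return y, np.sqrt(var_y)

# Usage: thin by the estimated correlation length, then propagate errors.
# r = correlation_length(x)
# y, sigma_y = weighted_mean_thinned(x, sigma, w, max(r, 1))
```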
This reduction of the dataset is unsatisfying, because it throws away most of the information the dataset contains. Therefore, you might prefer the mathematically more rigorous formula
$$
\sigma_y^2 \approx \sum_{i=1}^N
\left(
\frac{\partial y}{\partial x_{i}}
\right)^2 \sigma_{x_{i}}^2
+
2 \sum_{i = 1}^{N-1}
\sum_{j = i+1}^{N}
\frac{\partial y}{\partial x_{i}}
\frac{\partial y}{\partial x_{j}}
\operatorname{Cov}[x_i, x_j]
$$
Here the double sum collects the cross terms of the first-order error propagation, where $\operatorname{Cov}[x_i, x_j]$ denotes the covariance between $x_i$ and $x_j$. For uncorrelated data all covariances vanish and we recover the simple formula above.
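If the covariance matrix is available (measured directly, or modeled from the estimated autocorrelation), this formula can be evaluated in one matrix expression, because $y$ is linear in the $x_i$ so the gradient $\partial y/\partial x_i = w_i / \sum_j w_j$ is exact. A sketch, where the exponential correlation model at the end is purely an illustrative assumption:

```python
import numpy as np

def weighted_mean_full_cov(x, w, cov):
    """Weighted mean y and sigma_y using the full covariance matrix.
    With gradient g_i = w_i / sum(w), the formula above is exactly
    sigma_y^2 = g^T Cov g: the diagonal of Cov gives the variance term,
    the off-diagonal entries give the double covariance sum."""
    x = np.asarray(x, dtype=float)
    w = np.asarray(w, dtype=float)
    g = w / w.sum()                  # dy/dx_i
    y = g @ x
    var_y = g @ cov @ g
    return y, float(np.sqrt(var_y))

# Illustrative covariance model (an assumption, not part of the formula):
# stationary errors sigma with exponential correlation rho(k) = exp(-k/r).
n, r, sig = 200, 5.0, 0.3
lags = np.abs(np.subtract.outer(np.arange(n), np.arange(n)))
cov = sig**2 * np.exp(-lags / r)
```

Note that for positively correlated data the off-diagonal terms are positive, so ignoring them (the naive formula) underestimates $\sigma_y$, which is exactly why the thinning trick above gives a more honest, larger uncertainty.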