\section{Effect of Noise on the TCS Curve}
Taking the derivative of discrete data is problematic. Consider a function $f(x)$. Using a centred-difference approximation to the derivative:
\begin{align}
\der{f}{x} &= \frac{f(x + h) - f(x - h)}{2h} + O(h^2)
\end{align}
The accuracy of this approximation increases as $h \to 0$\footnote{Ignoring any effects due to rounding of floating-point numbers.}.
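The $O(h^2)$ truncation error follows from Taylor-expanding $f$ about $x$ (assuming $f$ is sufficiently smooth; the expansion is shown here only to justify the error order):
\begin{align}
f(x \pm h) &= f(x) \pm h \der{f}{x} + \frac{h^2}{2} \frac{d^2 f}{dx^2} \pm \frac{h^3}{6} \frac{d^3 f}{dx^3} + O(h^4) \\
\frac{f(x + h) - f(x - h)}{2h} &= \der{f}{x} + \frac{h^2}{6} \frac{d^3 f}{dx^3} + O(h^4)
\end{align}
The even-order terms cancel in the difference, leaving a leading error of order $h^2$.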
If $f_s(x)$ is the result of sampling $f(x)$, with $\Delta f$ the uncertainty in a measurement, then we can calculate the final uncertainty when finite differences are used to approximate $\der{f}{x}$:
\begin{align}
f_s(x) &= f(x) \pm \Delta f \\
\der{f_s}{x} &\approx \frac{f_s(x + h) - f_s(x - h)}{2h} \\
&= \frac{f(x + h) - f(x - h)}{2h} \pm \frac{\Delta f}{h} \\
&= \der{f}{x} + O(h^2) \pm \frac{\Delta f}{h}
\end{align}
The uncertainty in the sampled derivative has a pole at $h = 0$: as the step size shrinks, the noise term $\Delta f / h$ grows without bound.
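To make the divergence concrete, the following sketch (Python with NumPy; the synthetic $\sin(x)$ signal, the noise level \texttt{delta\_f} and the helper name \texttt{noisy\_derivative\_error} are illustrative choices, not values from the measurements) estimates the mean error of the centred difference as $h$ shrinks:
\begin{verbatim}
import numpy as np

rng = np.random.default_rng(0)

def noisy_derivative_error(h, delta_f=0.01, n=1000):
    """Mean |error| of a centred-difference derivative of sin(x)
    when each sample carries additive noise of scale delta_f."""
    x = np.linspace(0.0, 2.0 * np.pi, n)
    f_plus = np.sin(x + h) + rng.normal(0.0, delta_f, n)
    f_minus = np.sin(x - h) + rng.normal(0.0, delta_f, n)
    estimate = (f_plus - f_minus) / (2.0 * h)
    return np.mean(np.abs(estimate - np.cos(x)))

for h in (1.0, 0.1, 0.01, 0.001):
    print(f"h = {h:7.3f}   mean error = {noisy_derivative_error(h):.3f}")
\end{verbatim}
With these values the reported error first falls as $h$ decreases and then rises again once the $\Delta f / h$ term dominates.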
\emph{Note: I now suspect that this is a major reason why Komolov has used lock-in amplifiers.}
Smoothing the sampled points $f_s(x)$ (by applying a moving average) will reduce the deviation of the points from the smooth curve which best fits the data; we can think of the points $f_s(x)$ as sampling a \emph{different} function to $f(x)$, but with smaller uncertainties. Smoothing the original sampled points removes fine structure.
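As a sketch of the smoothing step (the boxcar window length, the signal and the noise level below are arbitrary illustrative choices, not those used for the figures):
\begin{verbatim}
import numpy as np

def moving_average(y, window=5):
    """Boxcar smooth: each point becomes the mean of `window`
    neighbouring samples ('same' mode keeps the array length)."""
    kernel = np.ones(window) / window
    return np.convolve(y, kernel, mode="same")

def centred_difference(y, h):
    """Centred-difference derivative of equally spaced samples."""
    return (y[2:] - y[:-2]) / (2.0 * h)

# Example: smooth a noisy step-like signal before differentiating.
x = np.linspace(0.0, 10.0, 501)
h = x[1] - x[0]
y_noisy = np.tanh(x - 5.0) + np.random.default_rng(1).normal(0.0, 0.02, x.size)
dy_raw = centred_difference(y_noisy, h)
dy_smooth = centred_difference(moving_average(y_noisy), h)
\end{verbatim}
A wider window suppresses more of the noise in the derivative but also flattens more of the fine structure mentioned above.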
The alternative is to increase $h$: the noise term $\Delta f / h$ shrinks, at the cost of a larger $O(h^2)$ truncation error.
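Schematically, writing the truncation error as $C h^2$ for some constant $C$ (a bound assumed here purely to illustrate the trade-off), the total error in the sampled derivative is
\begin{align}
E(h) &\approx C h^2 + \frac{\Delta f}{h},
\end{align}
which is minimised where $\der{E}{h} = 0$, i.e.\ at $h_{\mathrm{opt}} = \left( \Delta f / 2C \right)^{1/3}$; the best achievable error then scales as $(\Delta f)^{2/3}$ rather than diverging.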
As shown in Figures \ref{siI.eps} and \ref{siI_tcs.eps}, smoothing of $f_s(x)$ has a far greater effect on the derivative of $f_s$ than on $f_s$ itself.
\emph{TODO: Calculate MSE for both curves}

\emph{TODO: Show curves created with large $h$}
\begin{figure}
    \centering
    \includegraphics[width=0.5\textwidth, angle=270]{figures/tcs/plots/siI.eps}
    \caption{An unprocessed and smoothed I(E) curve for a Si sample.}
    \label{siI.eps}
\end{figure}
\begin{figure}
    \centering
    \includegraphics[width=0.5\textwidth, angle=270]{figures/tcs/plots/siI_tcs.eps}
    \caption{The effect of smoothing the original I(E) curve on its derivative.}
    \label{siI_tcs.eps}
\end{figure}