Abstract
In this paper, an algorithm for computing the Hilbert transform based on the Haar multiresolution approximation is proposed, and its L^2-error is estimated. Experimental results show that it outperforms the library function 'hilbert' in Matlab (The MathWorks, Inc., 1994-2007). Finally, the algorithm is applied to the approximate computation of the instantaneous phase of signals and is compared with three existing methods.
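For context, the Matlab baseline mentioned above is the standard FFT-based analytic-signal construction, from which the instantaneous phase is obtained as the angle of the analytic signal. A minimal sketch of that baseline (not the paper's Haar-based algorithm) in Python/NumPy:

```python
import numpy as np

def analytic_signal(x):
    # FFT-based analytic signal, analogous in spirit to Matlab's 'hilbert':
    # zero out negative frequencies, double positive ones, keep DC/Nyquist.
    N = len(x)
    X = np.fft.fft(x)
    h = np.zeros(N)
    h[0] = 1.0
    if N % 2 == 0:
        h[N // 2] = 1.0
        h[1:N // 2] = 2.0
    else:
        h[1:(N + 1) // 2] = 2.0
    return np.fft.ifft(X * h)

# Instantaneous phase of a tone placed exactly on an FFT bin
# (50 cycles in 1024 samples), so the analytic signal is exact.
n = np.arange(1024)
x = np.cos(2 * np.pi * 50 * n / 1024)
z = analytic_signal(x)
phase = np.unwrap(np.angle(z))
```

For this on-bin tone the unwrapped phase increases by exactly 2*pi*50/1024 per sample; the paper's Haar-based method targets the same quantity with a smaller L^2-error.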