Friday, June 11, 2010

Python, Econometrics, Time-Series: Burg algorithm for autoregressive modelling

Finally we got around to implementing the Burg algorithm in Python. It was not too hard: we simply translated the code by Cedric Collomb from his tutorial "Burg's Method, Algorithm and Recursion", which you can also access in a previous blog post. In honor of his clear exposition with matching C++ code, we named the Python module CollombBurg.py (the function itself is burg). Basically, the algorithm returns m coefficients $$a_1, a_2, \ldots, a_{m}$$ of the prediction-error filter, so that \[ X_i \approx -\sum_{k=1}^m a_{k} X_{i-k}.\] Note the minus sign: the returned coefficients are the taps of the filter $$1 + a_1 z^{-1} + \cdots + a_m z^{-m}$$, which is why the demo code below negates them when predicting. It is assumed that the model order m (the number of lags) is already known; otherwise a criterion such as the AIC may be applied to find a good estimate, as in the sketch that follows.
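On that last point, here is a minimal, hypothetical sketch of order selection by AIC. The helper aic_order and the particular form AIC(m) = N ln(sigma_m^2) + 2m are our own illustrative choices (other variants exist), and we assume the burg() function listed below has been saved as CollombBurg.py:

# Hypothetical helper (not part of Collomb's code): pick the AR order by AIC.
# Assumes the burg() function below has been saved as CollombBurg.py.
from math import log
from CollombBurg import burg

def aic_order(x, max_m):
    """Return the order m in 1..max_m minimizing AIC(m) = N*ln(sigma2_m) + 2*m."""
    N = len(x)
    best_m, best_aic = 1, float("inf")
    for m in range(1, max_m + 1):
        a = burg(m, x)
        # one-step-ahead prediction X_i ~ -sum_k a_k X_{i-k}, as in the demo below
        sse = 0.0
        for i in range(m, N):
            pred = -sum(a[j] * x[i - 1 - j] for j in range(m))
            sse += (x[i] - pred) ** 2
        sigma2 = max(sse / (N - m), 1e-300)  # guard against log(0) on a perfect fit
        aic = N * log(sigma2) + 2.0 * m
        if aic < best_aic:
            best_m, best_aic = m, aic
    return best_m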


"""
file              CollombBurg.py

author/translator Ernesto P. Adorio,Ph.D.
                  UPDEPP (UP Clark, Pampanga)
                  ernesto.adorio@gmail.com

Version           0.0.1 jun 11, 2010 # first release.

References        Burg's Method, Algorithm and Recursion, pp. 9-11
"""
from math import cos


def burg(m, x):
    """
    Based on Collomb's C++ code, pp. 10-11 of
    "Burg's Method, Algorithm and Recursion".
      m - number of lags in the autoregressive model.
      x - data vector to approximate.
    Returns the m coefficients of the prediction-error filter.
    """
    N = len(x) - 1
    coeffs = [0.0] * m

    # initialize Ak, the prediction-error filter; Ak[0] stays fixed at 1.0
    Ak = [0.0] * (m + 1)
    Ak[0] = 1.0
    # initialize the forward (f) and backward (b) prediction errors with the data
    f = x[:]
    b = x[:]
    # initialize Dk, the denominator of the reflection coefficient
    Dk = 0.0
    for j in range(0, N + 1):
        Dk += 2.0 * f[j] * f[j]
    Dk -= f[0] * f[0] + b[N] * b[N]

    # Burg recursion
    for k in range(m):
        # compute mu, the reflection coefficient for this order
        mu = 0.0
        for n in range(0, N - k):
            mu += f[n + k + 1] * b[n]
        mu *= -2.0 / Dk

        # update Ak (// keeps the division integral under Python 3 as well)
        maxn = (k + 1) // 2 + 1
        for n in range(maxn):
            t1 = Ak[n] + mu * Ak[k + 1 - n]
            t2 = Ak[k + 1 - n] + mu * Ak[n]
            Ak[n] = t1
            Ak[k + 1 - n] = t2

        # update f and b
        for n in range(N - k):
            t1 = f[n + k + 1] + mu * b[n]
            t2 = b[n] + mu * f[n + k + 1]
            f[n + k + 1] = t1
            b[n] = t2

        # update Dk
        Dk = (1.0 - mu * mu) * Dk - f[k + 1] * f[k + 1] - b[N - k - 1] * b[N - k - 1]

    # the coefficients are Ak[1:], the filter taps after the leading 1.0
    coeffs[:] = Ak[1:]
    return coeffs
    
    

if __name__ == "__main__":
    # data to approximate: a sum of four cosines
    original = [0.0] * 128
    for i in range(128):
        t = cos(i * 0.01) + 0.75 * cos(i * 0.03) \
            + 0.5 * cos(i * 0.05) + 0.25 * cos(i * 0.11)
        original[i] = t

    # get the linear prediction coefficients for a 4-lag model
    # (BurgAlgorithm(m, original) in Collomb's C++)
    coeffs = burg(4, original)

    # linearly predict the data: X_i = -sum_j coeffs[j] * X_{i-1-j}
    predicted = original[:]
    m = len(coeffs)
    for i in range(m, len(predicted)):
        predicted[i] = 0.0
        for j in range(m):
            predicted[i] -= coeffs[j] * original[i - 1 - j]

    # calculate and display the approximation error
    error = 0.0
    for i in range(len(predicted)):
        print "Index: %2d / Original: %.6f / Predicted: %.6f" % (i, original[i], predicted[i])
        delta = predicted[i] - original[i]
        error += delta * delta
    print "Burg Approximation Error: %f\n" % error

When the above code is run, it prints the following:

Index:  0 / Original: 2.500000 / Predicted: 2.500000
Index:  1 / Original: 2.497477 / Predicted: 2.497477
Index:  2 / Original: 2.489927 / Predicted: 2.489927
Index:  3 / Original: 2.477411 / Predicted: 2.477411
Index:  4 / Original: 2.460028 / Predicted: 2.460028
Index:  5 / Original: 2.437916 / Predicted: 2.437917
Index:  6 / Original: 2.411250 / Predicted: 2.411250
Index:  7 / Original: 2.380238 / Predicted: 2.380239
Index:  8 / Original: 2.345123 / Predicted: 2.345125
Index:  9 / Original: 2.306177 / Predicted: 2.306178
Index: 10 / Original: 2.263697 / Predicted: 2.263698
Index: 11 / Original: 2.218005 / Predicted: 2.218007
Index: 12 / Original: 2.169443 / Predicted: 2.169445
Index: 13 / Original: 2.118368 / Predicted: 2.118371
Index: 14 / Original: 2.065152 / Predicted: 2.065155
Index: 15 / Original: 2.010171 / Predicted: 2.010174
Index: 16 / Original: 1.953808 / Predicted: 1.953811
Index: 17 / Original: 1.896445 / Predicted: 1.896449
Index: 18 / Original: 1.838460 / Predicted: 1.838465
Index: 19 / Original: 1.780224 / Predicted: 1.780229
Index: 20 / Original: 1.722094 / Predicted: 1.722099
Index: 21 / Original: 1.664412 / Predicted: 1.664417
Index: 22 / Original: 1.607501 / Predicted: 1.607506
Index: 23 / Original: 1.551661 / Predicted: 1.551666
Index: 24 / Original: 1.497167 / Predicted: 1.497172
Index: 25 / Original: 1.444265 / Predicted: 1.444270
Index: 26 / Original: 1.393171 / Predicted: 1.393176
Index: 27 / Original: 1.344070 / Predicted: 1.344075
Index: 28 / Original: 1.297110 / Predicted: 1.297115
Index: 29 / Original: 1.252408 / Predicted: 1.252413
Index: 30 / Original: 1.210043 / Predicted: 1.210047
Index: 31 / Original: 1.170058 / Predicted: 1.170062
Index: 32 / Original: 1.132462 / Predicted: 1.132465
Index: 33 / Original: 1.097229 / Predicted: 1.097232
Index: 34 / Original: 1.064298 / Predicted: 1.064301
Index: 35 / Original: 1.033578 / Predicted: 1.033580
Index: 36 / Original: 1.004946 / Predicted: 1.004947
Index: 37 / Original: 0.978251 / Predicted: 0.978251
Index: 38 / Original: 0.953317 / Predicted: 0.953317
Index: 39 / Original: 0.929947 / Predicted: 0.929946
Index: 40 / Original: 0.907923 / Predicted: 0.907920
Index: 41 / Original: 0.887010 / Predicted: 0.887007
Index: 42 / Original: 0.866964 / Predicted: 0.866960
Index: 43 / Original: 0.847530 / Predicted: 0.847525
Index: 44 / Original: 0.828449 / Predicted: 0.828443
Index: 45 / Original: 0.809461 / Predicted: 0.809453
Index: 46 / Original: 0.790308 / Predicted: 0.790300
Index: 47 / Original: 0.770742 / Predicted: 0.770732
Index: 48 / Original: 0.750521 / Predicted: 0.750511
Index: 49 / Original: 0.729420 / Predicted: 0.729409
Index: 50 / Original: 0.707231 / Predicted: 0.707219
Index: 51 / Original: 0.683766 / Predicted: 0.683754
Index: 52 / Original: 0.658862 / Predicted: 0.658848
Index: 53 / Original: 0.632378 / Predicted: 0.632364
Index: 54 / Original: 0.604207 / Predicted: 0.604192
Index: 55 / Original: 0.574266 / Predicted: 0.574251
Index: 56 / Original: 0.542509 / Predicted: 0.542494
Index: 57 / Original: 0.508920 / Predicted: 0.508904
Index: 58 / Original: 0.473515 / Predicted: 0.473499
Index: 59 / Original: 0.436345 / Predicted: 0.436329
Index: 60 / Original: 0.397496 / Predicted: 0.397479
Index: 61 / Original: 0.357083 / Predicted: 0.357066
Index: 62 / Original: 0.315255 / Predicted: 0.315238
Index: 63 / Original: 0.272189 / Predicted: 0.272173
Index: 64 / Original: 0.228093 / Predicted: 0.228076
Index: 65 / Original: 0.183198 / Predicted: 0.183182
Index: 66 / Original: 0.137759 / Predicted: 0.137744
Index: 67 / Original: 0.092053 / Predicted: 0.092038
Index: 68 / Original: 0.046373 / Predicted: 0.046358
Index: 69 / Original: 0.001024 / Predicted: 0.001009
Index: 70 / Original: -0.043677 / Predicted: -0.043691
Index: 71 / Original: -0.087407 / Predicted: -0.087420
Index: 72 / Original: -0.129840 / Predicted: -0.129853
Index: 73 / Original: -0.170654 / Predicted: -0.170666
Index: 74 / Original: -0.209529 / Predicted: -0.209541
Index: 75 / Original: -0.246158 / Predicted: -0.246169
Index: 76 / Original: -0.280245 / Predicted: -0.280255
Index: 77 / Original: -0.311511 / Predicted: -0.311520
Index: 78 / Original: -0.339699 / Predicted: -0.339708
Index: 79 / Original: -0.364576 / Predicted: -0.364584
Index: 80 / Original: -0.385934 / Predicted: -0.385941
Index: 81 / Original: -0.403595 / Predicted: -0.403602
Index: 82 / Original: -0.417416 / Predicted: -0.417422
Index: 83 / Original: -0.427284 / Predicted: -0.427290
Index: 84 / Original: -0.433126 / Predicted: -0.433130
Index: 85 / Original: -0.434902 / Predicted: -0.434906
Index: 86 / Original: -0.432613 / Predicted: -0.432617
Index: 87 / Original: -0.426297 / Predicted: -0.426300
Index: 88 / Original: -0.416031 / Predicted: -0.416033
Index: 89 / Original: -0.401928 / Predicted: -0.401930
Index: 90 / Original: -0.384140 / Predicted: -0.384142
Index: 91 / Original: -0.362853 / Predicted: -0.362855
Index: 92 / Original: -0.338289 / Predicted: -0.338290
Index: 93 / Original: -0.310697 / Predicted: -0.310699
Index: 94 / Original: -0.280360 / Predicted: -0.280362
Index: 95 / Original: -0.247584 / Predicted: -0.247585
Index: 96 / Original: -0.212698 / Predicted: -0.212700
Index: 97 / Original: -0.176052 / Predicted: -0.176054
Index: 98 / Original: -0.138010 / Predicted: -0.138012
Index: 99 / Original: -0.098950 / Predicted: -0.098952
Index: 100 / Original: -0.059255 / Predicted: -0.059257
Index: 101 / Original: -0.019313 / Predicted: -0.019316
Index: 102 / Original: 0.020487 / Predicted: 0.020484
Index: 103 / Original: 0.059762 / Predicted: 0.059759
Index: 104 / Original: 0.098138 / Predicted: 0.098135
Index: 105 / Original: 0.135254 / Predicted: 0.135251
Index: 106 / Original: 0.170764 / Predicted: 0.170761
Index: 107 / Original: 0.204344 / Predicted: 0.204340
Index: 108 / Original: 0.235691 / Predicted: 0.235688
Index: 109 / Original: 0.264532 / Predicted: 0.264529
Index: 110 / Original: 0.290623 / Predicted: 0.290619
Index: 111 / Original: 0.313749 / Predicted: 0.313745
Index: 112 / Original: 0.333734 / Predicted: 0.333730
Index: 113 / Original: 0.350433 / Predicted: 0.350430
Index: 114 / Original: 0.363743 / Predicted: 0.363740
Index: 115 / Original: 0.373596 / Predicted: 0.373593
Index: 116 / Original: 0.379964 / Predicted: 0.379961
Index: 117 / Original: 0.382856 / Predicted: 0.382853
Index: 118 / Original: 0.382321 / Predicted: 0.382318
Index: 119 / Original: 0.378444 / Predicted: 0.378441
Index: 120 / Original: 0.371345 / Predicted: 0.371343
Index: 121 / Original: 0.361180 / Predicted: 0.361179
Index: 122 / Original: 0.348136 / Predicted: 0.348135
Index: 123 / Original: 0.332429 / Predicted: 0.332428
Index: 124 / Original: 0.314301 / Predicted: 0.314301
Index: 125 / Original: 0.294019 / Predicted: 0.294020
Index: 126 / Original: 0.271870 / Predicted: 0.271871
Index: 127 / Original: 0.248155 / Predicted: 0.248157
Burg Approximation Error: 0.000000
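
As a further sanity check (our addition, not part of Collomb's write-up), one can synthesize a series from a known AR(2) model and confirm that burg() recovers its coefficients under the sign convention above; the model parameters 0.75 and -0.5 are arbitrary illustrative values, and we again assume the code above is saved as CollombBurg.py:

# Sanity check (hypothetical, not from Collomb): burg() on a known AR(2) model.
import random
from CollombBurg import burg

random.seed(42)
n = 4096
x = [0.0, 0.0]
for i in range(2, n):
    # true model: X_i = 0.75*X_{i-1} - 0.5*X_{i-2} + Gaussian noise
    x.append(0.75 * x[-1] - 0.5 * x[-2] + random.gauss(0.0, 1.0))

a = burg(2, x)
# the prediction-error filter is 1 - 0.75 z^-1 + 0.5 z^-2,
# so the estimates should come out near [-0.75, 0.5]
print "estimated AR(2) coefficients:", a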

Thanks to Cedric Collomb.

2 comments:

  1. This Burg algorithm variant is due to Andersen (http://dx.doi.org/10.1109/PROC.1978.11160), was detailed by Kay and Marple (http://dx.doi.org/10.1109/PROC.1981.12184), and was implemented as sample code by Faber (http://www.citeulike.org/user/RhysU/article/11851228). Beware that it can be numerically unstable, as Andersen suggests (https://github.com/RhysU/ar/issues/3).

    Collomb cites Hayes 2002, which presumably would better reference the source material, but all I can find is Hayes 1996 (ISBN 978-0471594314), including on Hayes' website (http://users.ece.gatech.edu/~mhh3/publications.html).

  2. Sorry, I'd meant to link http://dx.doi.org/10.1109/PROC.1986.13584 for the Faber article.
