Title: Markov processes in adaptive control.
Authors: Longley, D.
Award date: 1967
Presented at: University of Leicester
Abstract: The self-adaptive control system continually seeks to improve the performance of the process by making certain parameter adjustments. This improvement is achieved by "hill-climbing" techniques, which test the effect of parameter adjustments on the performance and then make computed adjustments towards optimum conditions. The application of these techniques is complicated by the effects of the process dynamics and "noise" in performance index measurements. This thesis is concerned with the study of "hill-climbing" strategies that will minimise the effect of performance index noise. Markov process techniques were applied to a sampled-data, unidimensional search over a parabolic curve, and the effects of process dynamics were ignored. The result of incorporating a dead-zone filter in an optimising loop was analysed for a stepped-rectangular noise probability density function with both a stationary and a drifting optimum. A strategy which estimated the gradient by measuring the performance index difference over two successive values of the parameter (instead of by perturbation techniques) was analysed, but no conclusive results were obtained. Finally, a supervisory loop to adjust the gain of the optimising loop was postulated and analysed. Digital and analogue computer studies were performed to compare the above strategies and to obtain an insight into the practical problems associated with the implementation of the optimising loop.
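
The following is a minimal illustrative sketch, not the thesis's actual analysis, of the kind of unidimensional "hill-climbing" search the abstract describes: the gradient direction is inferred from the performance index difference over two successive parameter values, and a dead-zone filter suppresses reactions to small, noise-dominated differences. The parabolic index, the uniform noise model, and the names noisy_performance and hill_climb are assumptions introduced here for illustration (the thesis uses a stepped-rectangular noise density and a Markov-process analysis).

import random


def noisy_performance(x, noise_amplitude=0.2):
    # Illustrative parabolic performance index with additive uniform
    # measurement noise; optimum at x = 2.0. Both choices are assumptions.
    return -(x - 2.0) ** 2 + random.uniform(-noise_amplitude, noise_amplitude)


def hill_climb(x0, step=0.1, dead_zone=0.05, iterations=200):
    # Successive-difference hill climb: compare the performance index at
    # two successive parameter values (no perturbation signal) and only
    # act on differences that exceed the dead-zone threshold.
    x = x0
    previous = noisy_performance(x)
    direction = 1.0
    for _ in range(iterations):
        x += direction * step
        current = noisy_performance(x)
        delta = current - previous
        if abs(delta) > dead_zone:
            if delta < 0:
                direction = -direction  # performance fell: reverse the search
        # inside the dead zone: treat the change as noise and keep going
        previous = current
    return x


print(hill_climb(x0=0.0))  # settles near the optimum at x = 2.0

A wider dead zone rejects more performance-index noise but slows the response to a drifting optimum; a supervisory loop of the kind postulated in the thesis would adjust the loop gain (here, the step size) to manage that trade-off.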
Level: Doctoral
Qualification: Ph.D.
Rights: Copyright © the author. All rights reserved.
Appears in Collections: Theses, Dept. of Computer Science
Leicester Theses

Files in This Item:
U296327.pdf (279.21 MB, Adobe PDF)
