Double star induction motor: Variational calculation using the Hamilton-Jacobi-Bellman formalism
Published 2020-05-22
Keywords
- dynamic programming,
- optimal control theory,
- cost function,
- subderivatives
Copyright (c) 2020 Revista UIS Ingenierías
This work is licensed under a Creative Commons Attribution-NoDerivatives 4.0 International License.
Abstract
This contribution presents the optimal control of a double star induction motor (DSIM) through a variational formalism. The optimization criterion is subject to the non-stationary, reduced-order dynamic equations of the DSIM. As is well known, in this model the state variables are the rotor flux and the motor speed of the electromechanical process. For both non-stationary and stationary states, and based on optimal control theory, a cost function is formulated as a weighted contribution of the DSIM model. To obtain a minimum-energy rotor flux trajectory, the idea is to minimize this cost function subject to the two dynamic equations governing the motor speed and the rotor flux. The problem is solved using the Hamilton-Jacobi-Bellman equation, and a time-dependent solution for the rotor flux is determined analytically.
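As an illustrative sketch only (the abstract does not give the exact cost weights or machine parameters, so the symbols below are assumptions: α and β are weighting coefficients, T_em the electromagnetic torque, T_L the load torque, J_m the inertia, τ_r the rotor time constant, L_m the magnetizing inductance, and i_d the flux-producing current component), the minimization described above can be stated in standard optimal-control form, with the Hamilton-Jacobi-Bellman equation characterizing the optimal value function V:

\[
\min_{i_d(\cdot)} \; J = \int_{t_0}^{t_f} \left( \alpha\,\varphi_r^2(t) + \beta\,\frac{T_{em}^2(t)}{\varphi_r^2(t)} \right) dt,
\qquad \text{subject to} \quad
J_m \frac{d\omega}{dt} = T_{em} - T_L, \qquad
\tau_r \frac{d\varphi_r}{dt} = -\varphi_r + L_m\, i_d,
\]

\[
-\frac{\partial V}{\partial t}(x,t) = \min_{u} \left\{ \mathcal{L}(x,u,t) + \frac{\partial V}{\partial x}(x,t)\, f(x,u,t) \right\},
\]

where x = (φ_r, ω) collects the state variables, u the control, 𝓛 the running cost, and f the reduced-order dynamics; under these assumptions, the time-dependent rotor flux trajectory discussed in the abstract would follow from this minimization.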