Abstract
Karush–Kuhn–Tucker (KKT) optimality conditions are often checked to investigate whether a solution obtained by an optimization algorithm is a likely candidate for the optimum. In this study, we report that although the KKT conditions must all be satisfied at the optimal point, the extent of violation of the KKT conditions at points arbitrarily close to the KKT point does not vary smoothly, thereby making the KKT conditions difficult to use directly for evaluating the performance of an optimization algorithm. This difficulty arises from the complementary slackness condition associated with the KKT optimality conditions. To overcome it, we define modified \({\epsilon}\)-KKT points by relaxing the complementary slackness and equilibrium equations of the KKT conditions, and we suggest a KKT-proximity measure that is shown to reduce sequentially to zero as the iterates approach the KKT point. Besides the theoretical development defining the modified \({\epsilon}\)-KKT point, we present extensive computer simulations of the proposed methodology on a set of iterates obtained through an evolutionary optimization algorithm to illustrate the working of the proposed procedure on smooth and non-smooth problems. The results indicate that the proposed KKT-proximity measure can be used as a termination condition for optimization algorithms. As a by-product, the method helps find Lagrange multipliers corresponding to near-optimal solutions, which can be of importance to practitioners. We also provide a comparison of our KKT-proximity measure with the stopping criteria used in popular commercial software.
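For orientation, consider the problem \(\min_x f(x)\) subject to \(g_j(x) \le 0\), \(j = 1, \dots, J\). The classical KKT conditions at a candidate point \(x^{\ast}\) with multipliers \(u_j\) read

\[
\begin{aligned}
&\text{Equilibrium equation:} && \nabla f(x^{\ast}) + \sum_{j=1}^{J} u_j \nabla g_j(x^{\ast}) = 0,\\
&\text{Feasibility:} && g_j(x^{\ast}) \le 0, \quad u_j \ge 0, \quad j = 1,\dots,J,\\
&\text{Complementary slackness:} && u_j\, g_j(x^{\ast}) = 0, \quad j = 1,\dots,J.
\end{aligned}
\]

A relaxed form in the spirit of the modified \({\epsilon}\)-KKT point described above (this is a generic sketch; the paper's precise definition and the resulting proximity measure are given in the full text) requires, at a feasible point \(x\),

\[
\Bigl\| \nabla f(x) + \sum_{j=1}^{J} u_j \nabla g_j(x) \Bigr\| \le \epsilon,
\qquad u_j \ge 0, \qquad \sum_{j=1}^{J} u_j \bigl| g_j(x) \bigr| \le \epsilon .
\]

Roughly speaking, the KKT-proximity measure is then obtained by seeking the smallest such \({\epsilon}\) over admissible multipliers \(u_j\) at each iterate; since this value shrinks to zero as the iterates approach a KKT point, it can serve as a termination condition.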
Cite this article
Dutta, J., Deb, K., Tulshyan, R. et al. Approximate KKT points and a proximity measure for termination. J Glob Optim 56, 1463–1499 (2013). https://doi.org/10.1007/s10898-012-9920-5