Efficient rank reduction of correlation matrices

Igor Grubišić a,∗, Raoul Pietersz b

a Mathematical Institute, University of Utrecht, P.O. Box 80010, 3508 TA Utrecht, The Netherlands
b Product Development Group (HQ7011), ABN AMRO Bank, P.O. Box 283, 1000 EA Amsterdam, The Netherlands

Linear Algebra and its Applications 422 (2007) 629–653, www.elsevier.com/locate/laa. Received 20 January 2005; accepted 14 November 2006; available online 1 February 2007. Submitted by N.J. Higham.
∗ Corresponding author. Tel.: +31 30 2531529. E-mail address: grubisic@math.uu.nl (I. Grubišić).
0024-3795/$ – see front matter © 2006 Elsevier Inc. All rights reserved. doi:10.1016/j.laa.2006.11.024

Abstract

Geometric optimisation algorithms are developed that efficiently find the nearest low-rank correlation matrix. We show, in numerical tests, that our methods compare favourably with the existing methods in the literature. The connection with the Lagrange multiplier method is established, along with an identification of whether a local minimum is a global minimum. An additional benefit of the geometric approach is that any weighted norm can be applied. The problem of finding the nearest low-rank correlation matrix occurs as part of the calibration of multi-factor interest rate market models to correlation. © 2006 Elsevier Inc. All rights reserved.

Keywords: Geometric optimisation; Correlation matrix; Rank; LIBOR market model

1. Introduction

The problem of finding the nearest low-rank correlation matrix occurs in areas such as finance, chemistry, physics and image processing. The mathematical formulation of this problem is as follows. Let Sn denote the set of real symmetric n × n matrices and let C be a symmetric n × n matrix with unit diagonal. For X ∈ Sn we denote by X ⪰ 0 that X is positive semidefinite. Let the desired rank d ∈ {1, …, n} be given. The problem is then given by

  Find X ∈ Sn, to minimize (1/2)‖C − X‖²,
  subject to rank(X) ≤ d; X_ii = 1, i = 1, …, n; X ⪰ 0.   (1.1)

Here ‖ · ‖ denotes a semi-norm on Sn. The most important instance is (1/2)‖C − X‖² = (1/2) ∑_i …

[…]

… d > 1, since for d = 1 the constraint set consists of a finite number (2^{n−1}) of points, each of the form X = yy^T where y ∈ {−1, 1}^n. For d = 1, Problem (1.2) with respect to the standard Frobenius norm ‖ · ‖_F can be formulated as

  max_{y ∈ {−1,1}^n} tr(Cyy^T) = max_{y ∈ {−1,1}^n} y^T C y,   (3.2)

which is known as the max-cut problem, an NP-hard problem. Laurent and Poljak [29] consider this problem in more detail. They propose a relaxation that ignores the rank-1 constraint, which leads to a semidefinite programming (SDP) problem over the convex set of all correlation matrices. The celebrated work of Goemans and Williamson [19] shows that optimisation over this convex set provides a good approximation of the max-cut problem. For related results see [30]. For other applications to SDP, e.g., of the decomposition X = YY^T to SDP relaxations, see [6], and of Riemannian geometry to SDP, see [38].

3.1. Basic idea

The idea for solving Problem (3.1) is to parameterize the constraint set by a manifold, and subsequently utilize the recently developed algorithms for optimisation over manifolds, such as Newton's algorithm and conjugate gradient algorithms. Such geometric optimisation has been developed by Smith [50]. In Section 3.2, the constraint set is equipped with a topology, and we make an identification with a certain quotient space.
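To make the basic idea concrete, the following is a minimal NumPy sketch, not the paper's algorithm: it parameterises candidate matrices as X = YY^T with row-normalised Y and minimises (1/2)‖C − YY^T‖²_F by a naive projected-gradient loop in place of the Riemannian Newton and conjugate gradient methods developed below. The function names, step size and iteration count are illustrative assumptions only.

```python
# Sketch only: projected gradient over the product of unit spheres, not the
# geometric (Riemannian) algorithms of the paper.
import numpy as np

def objective(Y, C):
    """f(Y) = 0.5 * ||C - Y Y^T||_F^2."""
    R = C - Y @ Y.T
    return 0.5 * np.sum(R * R)

def euclidean_gradient(Y, C):
    """Gradient of f with respect to Y: -2 (C - Y Y^T) Y."""
    return -2.0 * (C - Y @ Y.T) @ Y

def normalize_rows(Y):
    """Project each row of Y back onto the unit sphere S^{d-1}."""
    return Y / np.linalg.norm(Y, axis=1, keepdims=True)

def nearest_low_rank_correlation(C, d, steps=5000, lr=1e-2, seed=0):
    """Naive sketch for Problem (1.1) with the standard Frobenius norm."""
    n = C.shape[0]
    rng = np.random.default_rng(seed)
    Y = normalize_rows(rng.standard_normal((n, d)))
    for _ in range(steps):
        Y = normalize_rows(Y - lr * euclidean_gradient(Y, C))
    # By construction: rank(Y Y^T) <= d, Y Y^T is psd, and its diagonal is 1.
    return Y @ Y.T

# Example: reduce a 4x4 correlation matrix to rank 2.
C = np.array([[1.0, 0.9, 0.7, 0.6],
              [0.9, 1.0, 0.8, 0.7],
              [0.7, 0.8, 1.0, 0.9],
              [0.6, 0.7, 0.9, 1.0]])
X = nearest_low_rank_correlation(C, d=2)
print(np.linalg.matrix_rank(X), np.round(np.diag(X), 6))
```

Such a first-order loop can stall in local minima and converges slowly; this is precisely the motivation for the manifold Newton and conjugate gradient machinery discussed in the text.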
We argue that the constraint set as such is not a manifold. Subsequently, in Section 3.3, we define a larger manifold (named the Cholesky manifold), of the same dimension as the rank-d manifold, that maps surjectively onto the constraint set. We may apply geometric optimisation on the Cholesky manifold. The connection between minima on the Cholesky manifold and on the constraint set will be established.

3.2. Topological structure

In this section, the set of n × n correlation matrices of rank d or less is equipped with the subspace topology from Sn. We subsequently establish a homeomorphism (i.e., a topological isomorphism) between the latter topological space and the quotient of the product of n copies of the (d − 1)-sphere over the group of orthogonal transformations of R^d. Intuitively the correspondence is as follows: we can associate with an n × n correlation matrix of rank d a configuration of n points of unit length in R^d such that the inner product of points i and j is entry (i, j) of the correlation matrix. Any orthogonal rotation of the configuration does not alter the associated correlation matrix. This idea is developed more rigorously below.

Definition 3.1. The set of symmetric n × n correlation matrices of rank at most d is defined by

  Cn,d = {X ∈ Sn; diag(X) = In, rank(X) ≤ d, X ⪰ 0}.

Here In denotes the n × n identity matrix and diag denotes the map R^{n×n} → R^{n×n}, diag(X)_{ij} = δ_{ij} X_{ij}, where δ_{ij} denotes the Kronecker delta.

The set Cn,d is a subset of Sn. The latter space is equipped with the Frobenius norm ‖ · ‖_F, which in turn defines a topology. We equip Cn,d with the subspace topology. In the following, the product of n unit spheres S^{d−1} is denoted by Tn,d. Elements of Tn,d are denoted as a matrix Y ∈ R^{n×d}, with each row vector Y_i of unit length. An obvious choice for the parametrization is the map

  s : Tn,d → Cn,d,  Y ↦ YY^T.   (3.3)

This parametrization is however not injective. Denote by Od the group of orthogonal transformations of R^d. Elements of Od are denoted by a d × d orthogonal matrix Q. Since the map s is invariant under an orthogonal transformation, i.e., s(YQ) = s(Y) for all Q ∈ Od, it is bett…
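The parametrization (3.3) and its lack of injectivity can be checked numerically. Below is a small NumPy sketch (illustrative, not part of the paper): it builds a random element of Tn,d, applies the map s, and verifies that multiplying by a random orthogonal Q ∈ Od leaves s(Y) unchanged. Helper names such as random_T and random_orthogonal are assumptions introduced for the example.

```python
# Sketch only: s(Y) = Y Y^T maps T_{n,d} (row-normalised n x d matrices) into
# C_{n,d}, and s(YQ) = s(Y) for any orthogonal Q, so s is not injective.
import numpy as np

def s(Y):
    """Parametrization map s : T_{n,d} -> C_{n,d}, Y |-> Y Y^T."""
    return Y @ Y.T

def random_T(n, d, rng):
    """A random element of T_{n,d}: each row lies on the unit sphere S^{d-1}."""
    Y = rng.standard_normal((n, d))
    return Y / np.linalg.norm(Y, axis=1, keepdims=True)

def random_orthogonal(d, rng):
    """A random d x d orthogonal matrix via the QR decomposition."""
    Q, _ = np.linalg.qr(rng.standard_normal((d, d)))
    return Q

rng = np.random.default_rng(1)
n, d = 5, 2
Y = random_T(n, d, rng)
Q = random_orthogonal(d, rng)

X = s(Y)
# X is a correlation matrix of rank at most d ...
print(np.allclose(np.diag(X), 1.0), np.linalg.matrix_rank(X) <= d)
# ... and the orthogonal group O_d acts without changing the image of s.
print(np.allclose(s(Y @ Q), X))
```

This invariance is the reason the text passes from Tn,d to the quotient space by Od when identifying the constraint set with a topological space.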