JCM
Table of Contents
15 May 2018, Volume 36 Issue 3
AN AUGMENTED LAGRANGIAN TRUST REGION METHOD WITH A BI-OBJECT STRATEGY
Caixia Kou, Zhongwen Chen, Yuhong Dai, Haifei Han
2018, 36(3): 331-350. DOI: 10.4208/jcm.1705-m2016-0820
An augmented Lagrangian trust region method with a bi-object strategy is proposed for solving nonlinear equality constrained optimization; it falls in between penalty-type methods and penalty-free ones. At each iteration, a trial step is computed by minimizing a quadratic approximation model of the augmented Lagrangian function within a trust region. The model is a standard trust region subproblem for unconstrained optimization and hence can be solved efficiently by many existing methods. To choose the penalty parameter, an auxiliary trust region subproblem related to the constraint violation is introduced. It turns out that the penalty parameter need not be monotonically increasing and will not tend to infinity. A bi-object strategy, based on both the objective function and a measure of constraint violation, is used to decide whether the trial step is accepted. Global convergence of the method is established under mild assumptions. Numerical experiments illustrate the efficiency of the algorithm in various difficult situations.
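For concreteness, the two ingredients described above can be written in standard form; the notation below (model Hessian B_k, radius Δ_k) is generic and not necessarily the paper's:

```latex
% Equality constrained problem and its augmented Lagrangian
% (multiplier \lambda, penalty parameter \sigma)
\min_{x \in \mathbb{R}^n} f(x) \quad \text{s.t.} \quad c(x) = 0,
\qquad
L_{\sigma}(x, \lambda) = f(x) - \lambda^{\top} c(x)
  + \tfrac{\sigma}{2} \, \| c(x) \|_2^2 .

% Trial step: a standard trust region subproblem on a quadratic model
\min_{\| d \| \le \Delta_k} \;
  \nabla_x L_{\sigma}(x_k, \lambda_k)^{\top} d
  + \tfrac{1}{2} \, d^{\top} B_k d .
```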
A TRUST-REGION-BASED ALTERNATING LEAST-SQUARES ALGORITHM FOR TENSOR DECOMPOSITIONS
Fan Jiang, Deren Han, Xiaofei Zhang
2018, 36(3): 351-373. DOI: 10.4208/jcm.1605-m2016-0828
Tensor canonical decomposition (abbreviated as CANDECOMP/PARAFAC or CP) decomposes a tensor as a sum of rank-one tensors and finds numerous applications in signal processing, hypergraph analysis, data analysis, etc. Alternating least-squares (ALS) is one of the most popular numerical algorithms for solving it. While there have been many efforts to enhance its efficiency, in general its convergence cannot be guaranteed.
In this paper, we combine ALS with the trust-region technique from the optimization field to obtain a trust-region-based alternating least-squares (TRALS) method for CP. Under mild assumptions, we prove that the whole iterative sequence generated by TRALS converges to a stationary point of CP. This provides a reasonable way to alleviate swamps, the notorious phenomenon of ALS that slows down the speed of the algorithm. Moreover, the trust region itself, in contrast to the regularized alternating least-squares (RALS) method, provides a self-adaptive way of choosing the parameter, which is essential for the efficiency of the algorithm. Our theoretical result is thus stronger than that of RALS in [26], which only proved that a cluster point of the iterative sequence generated by RALS is a stationary point. To accelerate the new algorithm, we adopt an extrapolation scheme. We apply our algorithm to the amino acid fluorescence data decomposition from chemometrics, BCM decomposition, and rank-(L_r, L_r, 1) decomposition arising from signal processing, and compare it with ALS and RALS. The numerical results show that TRALS is superior to ALS and RALS, both in the number of iterations and in CPU time.
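As background, plain ALS for a third-order CP decomposition cycles through linear least-squares updates of the three factor matrices. The numpy sketch below shows that baseline only (function names and conventions are ours; it is not the paper's TRALS, which adds the trust-region safeguard):

```python
import numpy as np

def khatri_rao(U, V):
    """Column-wise Khatri-Rao product of U (J x R) and V (K x R) -> (J*K x R)."""
    J, R = U.shape
    K, _ = V.shape
    return (U[:, None, :] * V[None, :, :]).reshape(J * K, R)

def unfold(T, mode):
    """Mode-n unfolding of a 3-way tensor into a matrix."""
    return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

def cp_als(T, rank, iters=200, seed=0):
    """Plain ALS for a rank-`rank` CP decomposition of a 3-way tensor T."""
    rng = np.random.default_rng(seed)
    A, B, C = (rng.standard_normal((n, rank)) for n in T.shape)
    for _ in range(iters):
        # Each update solves a linear least-squares problem in one factor,
        # via the normal equations with the Khatri-Rao structure.
        A = unfold(T, 0) @ khatri_rao(B, C) @ np.linalg.pinv((B.T @ B) * (C.T @ C))
        B = unfold(T, 1) @ khatri_rao(A, C) @ np.linalg.pinv((A.T @ A) * (C.T @ C))
        C = unfold(T, 2) @ khatri_rao(A, B) @ np.linalg.pinv((A.T @ A) * (B.T @ B))
    return A, B, C
```

On easy low-rank inputs this baseline already fits well; the swamps mentioned above are runs where these same iterates stagnate, which is what TRALS is designed to escape.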
SNIG PROPERTY OF MATRIX LOW-RANK FACTORIZATION MODEL
Hong Wang, Xin Liu, Xiaojun Chen, Yaxiang Yuan
2018, 36(3): 374-390. DOI: 10.4208/jcm.1707-m2016-0796
Recently, the matrix factorization model has attracted increasing attention for handling large-scale rank minimization problems; it is essentially a nonconvex minimization problem. Specifically, it is a quadratic least squares problem and consequently a quartic polynomial optimization problem. In this paper, we introduce the SNIG ("Second-order Necessary optimality Implies Global optimality") condition, which stands for the property that any second-order stationary point of the matrix factorization model must be a global minimizer. Some scenarios under which the SNIG condition holds are presented. Furthermore, we illustrate by an example that the SNIG condition may fail.
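As a point of reference, a typical factorization model of this kind (generic notation: a linear operator 𝒜, data b, thin factors X and Y; the paper may treat a different instance) replaces the rank constraint by a product of factors:

```latex
\min_{\operatorname{rank}(M) \le r} \; \tfrac{1}{2} \| \mathcal{A}(M) - b \|_2^2
\quad \longrightarrow \quad
\min_{X \in \mathbb{R}^{m \times r}, \; Y \in \mathbb{R}^{r \times n}}
  \; \tfrac{1}{2} \| \mathcal{A}(XY) - b \|_2^2 .
```

Since XY is bilinear in (X, Y), the factored objective is a quartic polynomial in the entries of X and Y, matching the description above.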
ON DOUBLY POSITIVE SEMIDEFINITE PROGRAMMING RELAXATIONS
Taoran Fu, Dongdong Ge, Yinyu Ye
2018, 36(3): 391-403. DOI: 10.4208/jcm.1708-m2017-0130
Recently, researchers have been interested in studying the semidefinite programming (SDP) relaxation model in which the matrix is both positive semidefinite and entrywise nonnegative, for quadratically constrained quadratic programming (QCQP). Compared to the basic SDP relaxation, this doubly positive SDP model possesses additional O(n^2) constraints, which makes the SDP solution complexity substantially higher than that of the basic model with O(n) constraints. In this paper, we prove that the doubly positive SDP model is equivalent to the basic one with a set of valid quadratic cuts. When QCQP is symmetric and homogeneous (which represents many classical combinatorial and nonconvex optimization problems), the doubly positive SDP model is equivalent to the basic SDP even without any valid cut. On the other hand, the doubly positive SDP model can help tighten the bound by up to 36%, but no more. Finally, we manage to extend some of the previous results to quartic models.
PARALLEL STOCHASTIC NEWTON METHOD
Mojmír Mutný, Peter Richtárik
2018, 36(3): 404-425. DOI: 10.4208/jcm.1708-m2017-0113
We propose a parallel stochastic Newton method (PSN) for minimizing unconstrained smooth convex functions. We analyze the method in the strongly convex case and give conditions under which acceleration can be expected compared to its serial counterpart. We show how PSN can be applied to large-scale quadratic function minimization in general and to empirical risk minimization problems. We demonstrate the practical efficiency of the method through numerical experiments and models of simple matrix classes.
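To illustrate the core stochastic-Newton idea in its simplest serial form (not the paper's PSN method; the setup, names, and sampling scheme here are ours), one can take exact gradients but estimate the Hessian on a random subsample, as in this sketch for L2-regularized logistic regression:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def stochastic_newton_step(w, X, y, lam, batch, rng):
    """One subsampled Newton step for L2-regularized logistic regression.

    The gradient uses the full data; the Hessian is estimated on a random
    subsample of `batch` rows, which is the basic stochastic-Newton idea.
    """
    n, d = X.shape
    p = sigmoid(X @ w)
    grad = X.T @ (p - y) / n + lam * w
    idx = rng.choice(n, size=batch, replace=False)
    Xs = X[idx]
    ps = sigmoid(Xs @ w)
    D = ps * (1.0 - ps)                      # per-sample Hessian weights
    H = (Xs * D[:, None]).T @ Xs / batch + lam * np.eye(d)
    return w - np.linalg.solve(H, grad)      # Newton step with sampled Hessian
```

With lam > 0 the sampled Hessian stays positive definite, so each step is well defined even for small batches.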
REDUCED-RANK MODELING FOR HIGH-DIMENSIONAL MODEL-BASED CLUSTERING
Lei Yang, Junhui Wang, Shiqian Ma
2018, 36(3): 426-440. DOI: 10.4208/jcm.1708-m2016-0830
Model-based clustering is widely used in the statistical literature and often models the data with a Gaussian mixture model. As a consequence, it requires the estimation of a large number of parameters, especially when the data dimension is relatively large. In this paper, a reduced-rank model and group-sparsity regularization are proposed to equip model-based clustering, which substantially reduces the number of parameters and thus facilitates high-dimensional clustering and variable selection simultaneously. We propose an EM algorithm for this task, in which the M-step is solved using alternating minimization. One of the alternating steps involves both a nonsmooth function and a nonconvex constraint, so we propose a linearized alternating direction method of multipliers (ADMM) for solving it. This leads to an efficient algorithm whose subproblems are all easy to solve. In addition, a model selection criterion based on the concept of clustering stability is developed for tuning the clustering model. The effectiveness of the proposed method, as well as its asymptotic estimation and selection consistency, is supported by a variety of simulated and real examples.
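The EM scaffold that the paper builds on can be seen in a minimal form below: plain EM for a Gaussian mixture with a shared spherical covariance (our simplification; the paper's M-step instead imposes reduced-rank and group-sparse structure and is solved by linearized ADMM, none of which appears here):

```python
import numpy as np

def gmm_em(X, k, iters=100):
    """Plain EM for a Gaussian mixture with a shared spherical covariance."""
    n, d = X.shape
    # Deterministic farthest-point initialization of the means.
    mu = X[:1].copy()
    for _ in range(1, k):
        d2 = ((X[:, None, :] - mu[None, :, :]) ** 2).sum(-1).min(axis=1)
        mu = np.vstack([mu, X[d2.argmax()]])
    pi = np.full(k, 1.0 / k)
    var = X.var()
    for _ in range(iters):
        # E-step: responsibilities from log-densities (max-shifted for stability).
        d2 = ((X[:, None, :] - mu[None, :, :]) ** 2).sum(-1)       # (n, k)
        logp = np.log(pi) - 0.5 * d2 / var - 0.5 * d * np.log(2 * np.pi * var)
        logp -= logp.max(axis=1, keepdims=True)
        r = np.exp(logp)
        r /= r.sum(axis=1, keepdims=True)
        # M-step: closed-form weights, means, and shared variance.
        nk = r.sum(axis=0)
        pi = nk / n
        mu = (r.T @ X) / nk[:, None]
        d2 = ((X[:, None, :] - mu[None, :, :]) ** 2).sum(-1)
        var = (r * d2).sum() / (n * d)
    return pi, mu, var
```

Each mixture component here carries a full mean vector; the point of the reduced-rank and group-sparsity devices above is precisely to cut this per-component parameter count in high dimensions.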
A COMPLETE CHARACTERIZATION OF THE ROBUST ISOLATED CALMNESS OF NUCLEAR NORM REGULARIZED CONVEX OPTIMIZATION PROBLEMS
Ying Cui, Defeng Sun
2018, 36(3): 441-458. DOI: 10.4208/jcm.1709-m2017-0034
In this paper, we provide a complete characterization of the robust isolated calmness of the Karush-Kuhn-Tucker (KKT) solution mapping for convex constrained optimization problems regularized by the nuclear norm function. This study is motivated by recent work in [8], where the authors show that, under the Robinson constraint qualification at a local optimal solution, the KKT solution mapping for a wide class of conic programming problems is robustly isolated calm if and only if both the second-order sufficient condition (SOSC) and the strict Robinson constraint qualification (SRCQ) are satisfied. Based on the variational properties of the nuclear norm function and its conjugate, we establish the equivalence between the primal/dual SOSC and the dual/primal SRCQ. The derived results lead to several equivalent characterizations of the robust isolated calmness of the KKT solution mapping and add insights to the existing literature on the stability of nuclear norm regularized convex optimization problems.
TRANSFORMATIONS FOR THE PRIZE-COLLECTING STEINER TREE PROBLEM AND THE MAXIMUM-WEIGHT CONNECTED SUBGRAPH PROBLEM TO SAP
Daniel Rehfeldt, Thorsten Koch
2018, 36(3): 459-468. DOI: 10.4208/jcm.1709-m2017-0002
Transformations of Steiner tree problem variants have been discussed frequently in the literature. Besides allowing complexity results to be transferred easily, they constitute a central pillar of exact state-of-the-art solvers for well-known variants such as the Steiner tree problem in graphs. In this article, transformations of both the prize-collecting Steiner tree problem and the maximum-weight connected subgraph problem to the Steiner arborescence problem are introduced for the first time. Furthermore, the considerable implications for practical solving approaches are demonstrated, including the computation of strong upper and lower bounds.