KDD 2008 Workshop on Data Mining using Matrices and Tensors

Held in conjunction with
The 14th ACM SIGKDD International Conference on
Knowledge Discovery and Data Mining
(KDD 2008)

August 24-27, 2008, Las Vegas, USA

Relevant Links

KDD 2008
Stanford Workshop on Algorithms for Modern Massive Data Sets



Latest News    


[August 2nd, 2008] The workshop proceedings are posted online.

[August 2nd, 2008] The preliminary workshop program is posted online. (PDF version and Word version of the program)

[June 23rd, 2008] Confirmed invited speaker: Prof. Haesun Park, Georgia Institute of Technology

[June 22nd, 2008] Confirmed invited speaker: Prof. Christos Faloutsos, Carnegie Mellon University

[June 20th, 2008] Paper notifications have been sent to the contact authors.

[June 18th, 2008] Confirmed keynote speaker: Prof. Michael I. Jordan, University of California at Berkeley

[June 1st, 2008] Confirmed invited speaker: Prof. Lenore R. Mullin, Program Director, NSF CISE CCF Algorithmic Foundations Cluster

[April 29th, 2008] Workshop Call for Papers (PDF) or (DOC).

Workshop Description    


The fields of pattern recognition, data mining, and machine learning increasingly adapt methods and algorithms from advanced matrix computations, graph theory, and optimization. Prominent examples include spectral clustering, non-negative matrix factorization, clustering and dimension reduction based on Principal Component Analysis (PCA) and the Singular Value Decomposition (SVD), tensor analysis, and L-1 regularization. Compared to probabilistic and information-theoretic approaches, matrix-based methods are fast and easy to understand and implement; they are especially suitable for parallel and distributed-memory computers, and hence for large-scale problems such as searching and extracting patterns from the entire Web. As a result, data mining using matrices and tensors is a popular and growing area of research.
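As a minimal illustration of the matrix-based methods described above (a sketch for orientation, not part of the workshop materials), the following snippet uses a truncated SVD of a small, hypothetical data matrix for dimension reduction, in the spirit of PCA:

```python
import numpy as np

# Toy data matrix: 6 samples x 4 features (two loose clusters), chosen
# purely for illustration.
X = np.array([
    [2.0, 0.0, 1.0, 0.1],
    [1.9, 0.1, 1.1, 0.0],
    [2.1, 0.0, 0.9, 0.2],
    [0.0, 2.0, 0.1, 1.0],
    [0.1, 2.1, 0.0, 1.1],
    [0.0, 1.9, 0.2, 0.9],
])

# Center the columns, then take a rank-2 truncated SVD: X_c ~ U_k S_k V_k^T.
Xc = X - X.mean(axis=0)
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
k = 2
Z = U[:, :k] * s[:k]      # low-dimensional coordinates (PCA scores)
X_approx = Z @ Vt[:k]     # best rank-2 approximation of Xc (Frobenius norm)

err = np.linalg.norm(Xc - X_approx, 'fro')
```

By the Eckart–Young theorem, the truncated SVD gives the best rank-k approximation in the Frobenius norm, which is why it underlies so many of the clustering and dimension-reduction methods listed below.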

This workshop will present recent advances in algorithms and methods that use matrix methods and scientific computing/applied mathematics to model and analyze massive, high-dimensional, and nonlinear-structured data. One main goal of the workshop is to bring together leading researchers across many topic areas (e.g., computer scientists, computational and applied mathematicians) to assess the state of the art, share ideas, and form collaborations. We also wish to attract practitioners who seek novel ideas for applications. In summary, this workshop will strive to emphasize the following aspects:

  • Presenting recent advances in algorithms and methods using matrix methods and scientific computing/applied mathematics
  • Addressing the fundamental challenges in data mining using matrices and tensors
  • Identifying killer applications and key industry drivers (where theories and applications meet)
  • Fostering interactions among researchers (from different backgrounds) who share the same interests, to promote cross-fertilization of ideas
  • Exploring benchmark data for better evaluation of the techniques

Topic Areas


Topic areas for the workshop include (but are not limited to) the following:

Methods and algorithms:

  • Principal Component Analysis and Singular Value Decomposition for clustering and dimension reduction
  • Nonnegative matrix factorization for unsupervised and semi-supervised learning
  • Spectral graph clustering
  • L-1 Regularization and Sparsification
  • Sparse PCA and SVD
  • Randomized algorithms for matrix computation
  • Web search and ranking algorithms
  • Tensor analysis: rank-1 decompositions, canonical decompositions (CANDECOMP/PARAFAC), GLRAM/2DSVD,
    and Tucker decompositions (e.g., the Higher-Order SVD)
  • GSVD for classification
  • Latent Semantic Indexing and other developments for Information Retrieval
  • Linear, quadratic and semi-definite Programming
  • Non-linear manifold learning and dimension reduction
  • Computational statistics involving matrix computations
  • Feature selection and extraction
  • Graph-based learning (classification, semi-supervised learning and unsupervised learning)
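Several of the methods above admit very compact matrix formulations. As one hedged sketch (illustrative only, with a randomly generated matrix standing in for real data), the classic Lee–Seung multiplicative updates for nonnegative matrix factorization minimize ||V - WH||_F^2 while keeping both factors nonnegative:

```python
import numpy as np

rng = np.random.default_rng(0)

# Nonnegative "data" matrix: 20 samples x 10 features (synthetic).
V = rng.random((20, 10))

# Factor V ~ W H with W (20 x r) and H (r x 10), all entries nonnegative.
r, eps = 3, 1e-9
W = rng.random((20, r))
H = rng.random((r, 10))

for _ in range(200):
    # Lee-Seung multiplicative updates for the Frobenius objective.
    # Nonnegativity is preserved because every factor in the update is >= 0.
    H *= (W.T @ V) / (W.T @ W @ H + eps)
    W *= (V @ H.T) / (W @ H @ H.T + eps)

rel_err = np.linalg.norm(V - W @ H, 'fro') / np.linalg.norm(V, 'fro')
```

The multiplicative form is popular precisely because each update is a few dense matrix products, which parallelizes well, echoing the scalability argument made in the workshop description.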

Application areas

  • Information search and extraction from the Web
  • Text processing and information retrieval
  • Image processing and analysis
  • Genomics and Bioinformatics
  • Scientific computing and computational sciences
  • Social Networks

Important Dates

  •  June 10, 2008: Electronic submission of full papers
  •  June 19, 2008: Author notification
  •  June 27, 2008: Submission of camera-ready papers
  •  August 24, 2008: Workshop in Las Vegas, USA

Paper Submissions

The electronic submission web site for research papers is available at: http://www.easychair.org/conferences/?conf=dmmt08.

Please register at EasyChair first if you have not used EasyChair before.

Papers should be at most 10 pages long, single-spaced, in KDD conference format, in font size 10 or larger with 1-inch margins on all sides.

Workshop Organizers

General Chair

                    Hongyuan Zha, College of Computing, Georgia Institute of Technology

Workshop Co-Chairs:






Chris Ding

University of Texas at Arlington

Department of Computer Science and Engineering, 416 Yates Street, Arlington, TX 76019, USA



Tao Li

Florida International University

School of Computer Science, ECS 318, Miami, FL 33199, USA


(305) 348-6036

Shenghuo Zhu

NEC Laboratories America

10080 North Wolfe Road, Suite SW3-350, Cupertino, CA 95014, USA




Note: for inquiries please send e-mail to taoli AT cs.fiu.edu.

Program Committee Members

  • Tammy Kolda, Sandia National Labs

  • Jesse Barlow, Penn State University

  • Michael Berry, University of Tennessee

  • Yun Chi, NEC Laboratories America

  • Lars Eldén, Linköping University, Sweden

  • Christos Faloutsos, Carnegie Mellon University

  • Efstratios Gallopoulos, University of Patras

  • Joydeep Ghosh, University of Texas at Austin

  • Ming Gu, University of California, Berkeley

  • Michael Jordan, University of California, Berkeley

  • Yuanqing Lin, University of Pennsylvania

  • Huan Liu, Arizona State University

  • Michael Ng, Hong Kong Baptist University

  • Haesun Park, Georgia Tech

  • Wei Peng, Xerox Research

  • Robert Plemmons, Wake Forest

  • Alex Pothen, Old Dominion University

  • Yousef Saad, University of Minnesota

  • Horst Simon, Lawrence Berkeley National Laboratory

  • Fei Wang, Tsinghua University

  • Jieping Ye, Arizona State University

  • Kai Yu, NEC Laboratories America

  • Hongyuan Zha, Georgia Tech

  • Zhongyuan Zhang, Chinese Academy of Sciences