Instructions for the Project

List of Project Topics

  1. Image Processing
    1. Video denoising from Gaussian and impulse noise using low-rank matrix completion. See paper here. Low-rank matrix completion is a very exciting discovery in the field of sparse representations: given a low-rank matrix with missing entries, one can reconstruct the original matrix with high accuracy. It has applications in image processing and machine learning (recommender systems). You can also try implementing one more application of the low-rank matrix completion technique.
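
As a flavour of the idea, here is a minimal toy sketch of matrix completion by alternating projections (enforce the observed entries, then project onto rank-r matrices). This is only an illustration with arbitrary sizes and sampling rate, not the algorithm of the referenced paper.

```python
# Toy matrix completion by alternating projections. The rank, matrix
# size and 50% sampling rate are arbitrary choices for the demo.
import numpy as np

rng = np.random.default_rng(0)

m, n, r = 30, 30, 2
M = rng.standard_normal((m, r)) @ rng.standard_normal((r, n))  # rank-2 truth
mask = rng.random((m, n)) < 0.5            # True = entry observed

X = np.zeros((m, n))
for _ in range(500):
    X[mask] = M[mask]                      # enforce the observed entries
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    s[r:] = 0.0                            # project onto rank-r matrices
    X = U @ np.diag(s) @ Vt
X[mask] = M[mask]

rel_err = np.linalg.norm(X - M) / np.linalg.norm(M)
```

With enough observed entries relative to the degrees of freedom of a rank-r matrix, the missing entries are recovered accurately.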

    2. Edge detection and background subtraction in low-light images and video: using the Skellam distribution

    3. Sparse orthonormal transforms for image compression - this builds upon the PCA technique we studied in CS 663 and which we will revise this semester. This project is for the mathematically inclined!

    4. Denoising and deblurring of images under Poisson-Gaussian noise: "A Convex Approach for Image Restoration with Exact Poisson-Gaussian Likelihood", by Chouzenoux et al., SIAM Journal on Imaging Sciences, 2015.

    5. Poisson plug and play: Poisson Inverse Problems by the Plug-and-Play scheme

    6. Using machine learning for distinguishing between photorealistic and actual photographic images (or live and re-broadcast images)

    7. Some forensics: duplicated regions in natural images

  2. Tomography
    1. sLLE: Spherical locally linear embedding with applications to tomography

    2. Denoising tomographic projections prior to reconstruction: Sparsity based denoising for tomography.
    3. Follow-up journal article on the same topic.
    4. Prior image constrained compressed sensing (PICCS): A method to accurately reconstruct dynamic CT images from highly undersampled projection data sets


  3. Compressed Sensing
    1. We have seen the relative merits and demerits of mutual coherence and the RIC in class. The following papers present two measures that are intermediate: the measures are computable efficiently and yet better than coherence (though not as tight as RIC):
      • Tang and Nehorai, "Computable Performance Bounds on Sparse Recovery", IEEE Transactions on Signal Processing.
      • Tang and Nehorai, "Performance Analysis of Sparse Recovery Based on Constrained Minimal Singular Values", IEEE Transactions on Signal Processing.
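
To see why such intermediate measures matter, note that mutual coherence is trivially computable: it is the largest off-diagonal entry of the Gram matrix of the normalized columns, whereas the RIC requires examining all column subsets. A small sketch (matrix sizes arbitrary):

```python
# Mutual coherence of a sensing matrix: the largest absolute inner
# product between distinct normalized columns. Unlike the RIC, this
# costs only one matrix product.
import numpy as np

def mutual_coherence(A):
    """Largest |<a_i, a_j>| over distinct normalized columns of A."""
    A = A / np.linalg.norm(A, axis=0, keepdims=True)
    G = np.abs(A.T @ A)                    # Gram matrix of normalized columns
    np.fill_diagonal(G, 0.0)               # ignore the self inner products
    return float(G.max())

rng = np.random.default_rng(0)
A = rng.standard_normal((20, 50))          # a random 20 x 50 sensing matrix
mu = mutual_coherence(A)
```

For an orthonormal matrix the coherence is 0; for a fat random matrix it lies strictly between 0 and 1.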

    2. Snapshot Compressed Sensing: Performance Bounds and Algorithms

    3. Compressed sensing matrix design by optimizing the mutual coherence: "On Optimization of the Measurement Matrix for Compressive Sensing", and possibly also this paper. You may try to tweak the technique to design CS matrices for, e.g., the Hitomi architecture.
    4. Inferring mismatch in image representations:
      • here. (You have seen sparse signal representations in the DCT basis, where the frequencies of the cosine bases were aligned with a Cartesian grid. What if the signal is a sparse linear combination of cosines whose frequencies deviate slightly from the grid frequencies? Can you still model the signal well with a sparsity constraint? The paper proposes an alternating minimization algorithm for this.)
      • A. Fannjiang and H. Tseng, "Compressive Radar with Off-Grid Targets", Inverse Problems, 29, 2013.
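
The mismatch phenomenon is easy to demonstrate numerically: a cosine whose frequency sits exactly on the DCT grid has a single significant coefficient, while a slightly off-grid cosine smears its energy over many coefficients. A toy sketch (signal length, frequency, and the 0.3-bin offset are arbitrary choices):

```python
# Basis-mismatch demo: count DCT coefficients above 1% of the peak for
# an on-grid cosine versus a slightly off-grid one.
import numpy as np

N = 64
t = np.arange(N)
# Orthonormal DCT-II analysis matrix: rows are the cosine basis vectors.
C = np.sqrt(2.0 / N) * np.cos(np.pi * (t[None, :] + 0.5) * t[:, None] / N)
C[0] /= np.sqrt(2.0)

def num_big_coeffs(x, frac=0.01):
    """Number of DCT coefficients above frac * (largest coefficient)."""
    c = C @ x
    return int(np.sum(np.abs(c) > frac * np.abs(c).max()))

k = 7
on_grid = np.cos(np.pi * (t + 0.5) * k / N)            # aligned frequency
off_grid = np.cos(np.pi * (t + 0.5) * (k + 0.3) / N)   # slightly off grid

s_on, s_off = num_big_coeffs(on_grid), num_big_coeffs(off_grid)
```

The on-grid signal is 1-sparse in the DCT basis; the off-grid one is not, which is exactly the modelling gap the paper addresses.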

    5. Inferring representation basis directly from compressive measurements:
      • A variant of the famous KSVD algorithm: Compressive KSVD. KSVD takes a bag of image patches as input and returns a dictionary matrix such that sparse linear combinations of its columns reconstruct the original patches with high accuracy. It also turns out that the columns of this matrix, when reshaped into 2D arrays, resemble edge filters if the learning is done properly. We will learn this method in class. This paper is about inferring such a dictionary when the original patches are not available, and only their compressive measurements are.
      • Compressive sensing and PCA - a fascinating paper
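
To convey the flavour of the K-SVD update, here is a toy sketch specialized to sparsity level 1, so that sparse coding reduces to picking the single best atom per signal; real K-SVD uses a pursuit algorithm such as OMP for the coding step. All sizes and the synthetic data model below are arbitrary demo choices.

```python
# Toy K-SVD-style dictionary learning with sparsity 1: alternate between
# (a) coding each signal with its best single atom and (b) updating each
# atom as the top singular vector of the signals that use it.
import numpy as np

rng = np.random.default_rng(0)

d, K, n = 10, 4, 200                       # signal dim, # atoms, # signals
true_D = np.linalg.qr(rng.standard_normal((d, K)))[0]  # ground-truth atoms
Y = true_D[:, rng.integers(0, K, n)] * rng.uniform(1.0, 3.0, n)  # 1-sparse data

D = rng.standard_normal((d, K))            # random initial dictionary
D /= np.linalg.norm(D, axis=0)

for _ in range(20):
    # Sparse coding with sparsity 1: best single atom + coefficient.
    corr = D.T @ Y
    idx = np.abs(corr).argmax(axis=0)
    coef = corr[idx, np.arange(n)]
    for k in range(K):
        users = np.where(idx == k)[0]
        if users.size == 0:
            # Standard fix: reseed an unused atom with the worst-coded signal.
            worst = int(np.argmax(np.linalg.norm(Y - D[:, idx] * coef, axis=0)))
            D[:, k] = Y[:, worst] / np.linalg.norm(Y[:, worst])
            continue
        # K-SVD update: rank-1 SVD of the residual restricted to the
        # signals using atom k (with sparsity 1 that residual is just Y).
        U, s, Vt = np.linalg.svd(Y[:, users], full_matrices=False)
        D[:, k] = U[:, 0]
        coef[users] = s[0] * Vt[0]

rel_err = np.linalg.norm(D[:, idx] * coef - Y) / np.linalg.norm(Y)
```

With clean 1-sparse data the learned atoms align (up to sign) with the true ones and the reconstruction error becomes small.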

    6. Faster implementation of orthogonal matching pursuit: here and here.
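
For reference, a minimal (unoptimized) OMP looks as follows; the papers above speed up the repeated least-squares step, e.g. by incremental factorization updates. Problem sizes and sparsity below are arbitrary.

```python
# Minimal orthogonal matching pursuit: greedily grow the support, and
# re-solve least squares on it at every step (the costly part).
import numpy as np

def omp(A, y, k):
    """Greedy k-sparse recovery: y ~ A x with at most k nonzeros in x."""
    m, n = A.shape
    support, r = [], y.copy()
    for _ in range(k):
        j = int(np.argmax(np.abs(A.T @ r)))        # most correlated atom
        support.append(j)
        xs, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
        r = y - A[:, support] @ xs                 # update the residual
    x = np.zeros(n)
    x[support] = xs
    return x

rng = np.random.default_rng(0)
A = rng.standard_normal((50, 100)) / np.sqrt(50)
x0 = np.zeros(100)
x0[[3, 20, 77]] = [1.0, -2.0, 1.5]
y = A @ x0                                         # noiseless measurements
x_hat = omp(A, y, 3)
```

In the noiseless, well-conditioned regime above, OMP recovers the support and coefficients exactly.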

    7. How do you pick the appropriate number of measurements in compressed sensing, since the exact signal sparsity is typically unknown? How do you choose the regularization parameter in ISTA? The answer to both questions lies in the concept of cross-validation. Cross-validation in compressed sensing is explored in the following papers:
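
A minimal sketch of holdout cross-validation for choosing the ISTA regularization parameter: solve the problem on a subset of the measurements for each candidate λ, and pick the λ whose solution best predicts the held-out measurements. All sizes, the noise level and the λ grid are arbitrary demo choices.

```python
# Holdout cross-validation over the ISTA regularization parameter.
import numpy as np

def ista(A, y, lam, iters=500):
    """Plain ISTA for min_x 0.5 * ||Ax - y||^2 + lam * ||x||_1."""
    L = np.linalg.norm(A, 2) ** 2          # Lipschitz constant of the gradient
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        g = x - (A.T @ (A @ x - y)) / L    # gradient step
        x = np.sign(g) * np.maximum(np.abs(g) - lam / L, 0.0)  # soft threshold
    return x

rng = np.random.default_rng(0)
m, n, k = 80, 200, 5
A = rng.standard_normal((m, n)) / np.sqrt(m)
x0 = np.zeros(n)
x0[rng.choice(n, k, replace=False)] = rng.standard_normal(k)
y = A @ x0 + 0.01 * rng.standard_normal(m)

# Hold out some measurements (rows) for validation.
tr, va = np.arange(60), np.arange(60, 80)
lams = [1.0, 0.1, 0.01, 0.001]
best_lam, best_err = None, np.inf
for lam in lams:
    x = ista(A[tr], y[tr], lam)
    err = np.linalg.norm(A[va] @ x - y[va])  # prediction error on holdout
    if err < best_err:
        best_lam, best_err = lam, err
```

Overly large λ kills the signal and overly small λ fits the noise; the holdout residual picks an intermediate value without knowing the true sparsity.
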
    8. Classification, detection, source separation directly from compressive measurements (without reconstruction):
      • Davenport et al, "Signal Processing With Compressive Measurements", IEEE Journal of Selected Topics in Signal Processing
      • Davenport et al, "The smashed filter for compressive classification and target recognition" (see interesting experiments in "Compressive image acquisition and classification via secant projections")
      • See the issue of affine invariance in https://arxiv.org/pdf/1501.04367.pdf (Reconstruction-free action inference from compressive imagers)

    9. Gaussian mixture models for CS:
    10. Bahmani et al, "Greedy sparsity constrained optimization", Journal of Machine Learning Research 2013 (this is an extension of a greedy algorithm called CoSaMP (similar to OMP), but for non-linear regression problems).

    11. Blind compressed sensing: Aghagolzadeh et al, "Joint estimation of dictionary and image from compressive samples", IEEE Transactions on Computational Imaging, 2017

    12. Compressed sensing when the sensing matrix is not accurately known: Yang, Zhang and Xie, "Robustly Stable Signal Recovery in Compressed Sensing With Structured Matrix Perturbation", IEEE TSP 2012

    13. A different compressed sensing technique: Enhancing Sparsity by Reweighted ℓ1 Minimization
    14. Finding needles in compressed haystacks. This paper is about classification using a support vector machine directly in the compressed domain.