
Department of Engineering

Dr Ramji Venkataramanan



University Lecturer in Digital Communications and Signal Processing

Academic Division: Information Engineering

Research group: Signal Processing and Communications

Telephone: +44 1223 7 66767




Research interests

My research interests are broadly in communications and information processing for networks. There are two main themes: 1) designing low-complexity codes with near-optimal rates for important network communication models, and 2) understanding the fundamental limits of communication and compression in multi-terminal networks.

Below is a sample of projects I have worked on:

Optimal Source and Channel Coding via Sparse Linear Regression

Codes based on high-dimensional linear regression were recently introduced for communication over Gaussian channels. These codes achieve rates approaching the channel capacity with computationally efficient encoding and decoding. In recent work, we have used the sparse regression framework to design codes for data compression. These codes are the first to attain the optimal compression rate (the rate-distortion function) for Gaussian sources with low-complexity encoding and decoding algorithms. We have also shown that the source and channel codes constructed above can be combined to yield fast, rate-efficient codes for a variety of network problems.
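The basic idea behind sparse regression codes can be sketched in a few lines: a codeword is a sparse linear combination of the columns of a random design matrix, with one nonzero coefficient per section. The parameter values and equal power allocation below are illustrative assumptions, not the exact construction or decoder from the papers.

```python
import numpy as np

# Sketch of a sparse regression (SPARC) codeword. The codeword is
# x = A @ beta, where A is an n x (L*M) Gaussian design matrix and
# beta has exactly one nonzero entry in each of L sections of size M.
# All parameter choices here are assumptions for illustration only.

rng = np.random.default_rng(0)
n, L, M = 64, 8, 16                       # block length, sections, section size
A = rng.normal(0, 1 / np.sqrt(n), size=(n, L * M))

def encode(message, A, L, M, n):
    """Map a message (one index per section) to the codeword A @ beta."""
    beta = np.zeros(L * M)
    for l, idx in enumerate(message):
        beta[l * M + idx] = np.sqrt(n / L)  # equal power split across sections
    return A @ beta

message = rng.integers(0, M, size=L)      # L symbols, each in {0, ..., M-1}
x = encode(message, A, L, M, n)
rate = L * np.log2(M) / n                 # bits per channel use
print(f"rate = {rate:.2f} bits/channel use")
```

Each section carries log2(M) bits, so the rate is L*log2(M)/n; the codeword has unit average power since each column of A has variance 1/n entries.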

Codes for Deletion and Insertion Models

The problem of synchronization from insertions and deletions is important in several applications, such as file sharing, online editing, and distributed storage. My work in this area includes designing low-complexity codes to efficiently correct synchronization errors as well as computing fundamental limits, i.e., bounds on the capacity of channels that introduce deletions and insertions.
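An i.i.d. deletion channel, the simplest of these models, is easy to simulate, and a genie argument (revealing the deleted positions turns deletions into erasures) gives the simple capacity upper bound of 1 - d bits per use. The simulation below is a minimal sketch of the channel model, not of any particular code from this work.

```python
import random

def deletion_channel(bits, d, rng):
    """Pass a bit sequence through an i.i.d. deletion channel:
    each bit is deleted independently with probability d, and the
    receiver sees only the surviving bits, with no gap markers."""
    return [b for b in bits if rng.random() >= d]

rng = random.Random(42)
n, d = 100_000, 0.2
bits = [rng.randint(0, 1) for _ in range(n)]
out = deletion_channel(bits, d, rng)

# On average a fraction (1 - d) of the bits survive. A genie that
# reveals the deleted positions would turn the channel into an
# erasure channel, giving the upper bound C <= 1 - d bits/use.
print(len(out) / n)                          # close to 1 - d = 0.8
print(f"capacity upper bound: {1 - d} bits per channel use")
```

What makes the problem hard is visible in the output: unlike an erasure channel, the receiver does not know which positions were deleted, so the two sequences must be re-synchronized before any decoding can happen.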

Feedback & Feed-forward

Feedback is an important resource available in both wireless and wired networks, but exploiting it in a multi-terminal setting is not well understood. Broadly speaking, feedback induces correlation between the distributed transmitters and receivers in the network. We have developed coding schemes that effectively leverage this correlation in multiple-access and broadcast settings; these schemes significantly improve on the best-known rates for these channels. In my Ph.D. thesis, I also investigated the role of feed-forward in lossy data compression, for both point-to-point and multi-terminal models.
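A classical point-to-point illustration of what feedback buys is the Schalkwijk-Kailath scheme for the AWGN channel with noiseless feedback (this is a textbook scheme, not the multi-terminal constructions described above): the transmitter tracks the receiver's estimate via feedback and repeatedly sends a scaled version of the current estimation error, so the error variance shrinks by a factor sigma^2/(P + sigma^2) per channel use.

```python
import numpy as np

# Schalkwijk-Kailath-style iteration for the AWGN channel with
# noiseless feedback. The transmitter knows the receiver's current
# estimation error exactly (via feedback) and sends it scaled to
# power P; the receiver applies an MMSE correction. The error
# variance then decays geometrically: alpha -> alpha * s2/(P + s2).

rng = np.random.default_rng(0)
P, sigma2 = 1.0, 1.0                  # power constraint, noise variance
trials, steps = 50_000, 10

theta = rng.normal(0.0, 1.0, trials)  # message points, prior variance 1
est_err = -theta                      # receiver's initial estimate is 0
alpha = 1.0                           # predicted error variance

for _ in range(steps):
    x = np.sqrt(P / alpha) * est_err                 # scale error to power P
    y = x + rng.normal(0.0, np.sqrt(sigma2), trials) # AWGN channel use
    est_err = est_err - np.sqrt(P * alpha) / (P + sigma2) * y  # MMSE update
    alpha = alpha * sigma2 / (P + sigma2)            # predicted variance

print(f"empirical error variance: {est_err.var():.2e}")
print(f"predicted error variance: {alpha:.2e}")
```

The geometric decay corresponds to operating at the capacity (1/2)log2(1 + P/sigma2) of the channel, and is the kind of correlation-building that the multi-terminal schemes above generalize.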

Codes for Data Storage

Many non-volatile memory technologies, such as Phase Change Media and Flash, use 'rewrites': while storing data, it is possible to write multiple times to a memory cell until a desirable output is obtained. Since rewrites consume extra power and shorten the lifetime of the memory, there is a basic trade-off: what is the storage capacity of the memory subject to a fixed rewrite budget? Further, how do we design codes that optimally exploit the rewrite option? In collaboration with researchers from the Memory Technologies group at IBM, we have developed efficient coding schemes for some basic rewritable channel models.
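The trade-off can be seen in a toy model (an assumption for illustration, not the channel models from this work): each write of a cell produces a noisy output, and the writer rewrites until the output lands in a target interval. Narrowing the interval lets more levels be packed into a cell, i.e. more stored bits, but costs more writes on average.

```python
import random

def write_cell(target, width, noise, rng, max_writes=50):
    """Toy rewritable-cell model: each write of level `target`
    produces target + Uniform(-noise, noise); rewrite until the
    output is within `width` of the target. Returns writes used."""
    for attempt in range(1, max_writes + 1):
        out = target + rng.uniform(-noise, noise)
        if abs(out - target) <= width:
            return attempt
    return max_writes

rng = random.Random(1)
noise = 1.0
results = {}
for width in (1.0, 0.5, 0.25):        # tighter target -> more storable levels
    writes = [write_cell(0.0, width, noise, rng) for _ in range(10_000)]
    results[width] = sum(writes) / len(writes)
    print(f"target width {width}: avg writes {results[width]:.2f}")
```

With a success probability of width/noise per write, the average number of writes grows like noise/width, which is exactly the tension between storage density and rewrite budget described above.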

Teaching activity

  • 4F5: Advanced Communications and Coding
  • 1B Paper 6 (Communications)
  • Supervises 1B Paper 6 (Linear Systems, Signal and Data Analysis, Communications) for Trinity Hall students
  • Organises the biweekly Signal Processing and Communications Lab seminar

Other positions

Fellow of Trinity Hall


I am a University Lecturer (Assistant Professor) in Engineering at the University of Cambridge, where I am also a Fellow of Trinity Hall. I am part of the Signal Processing and Communications Lab in the Division of Information Engineering. I received my Ph.D. in EE (Systems) from the University of Michigan, Ann Arbor in 2008, and my undergraduate degree from the Indian Institute of Technology, Madras in 2002. Before joining the University of Cambridge in 2013, I held post-doctoral positions at Stanford University and Yale University.

My research interests are broadly in communications and information processing for networks and include network information theory, coding, statistical inference and learning.