Workshop on Resource-Constrained Learning in Wireless Networks

Sixth Conference on Machine Learning and Systems (MLSys 2023)

Thursday, June 8th, 2023, Miami Convention Center, Miami, Florida, Room 246

Zoom Link for virtual access to the workshop


Recent years have seen accelerating interest in applying artificial intelligence (AI) and, in particular, machine learning (ML) techniques to optimize, design, and automate communication networks. As the complexity of modern 5G and beyond networks grows, ML techniques promise new design methodologies and tools for architecting complex communication networks and automating network operation. At the same time, the increasingly powerful and ubiquitous sensing, computation, and communication capabilities of emerging networks are driving ML computations closer to the edge, enabling local processing of distributed datasets, a shift that promises pervasive and contextualized ML solutions at scale.

ML techniques are evolving rapidly, moving from traditional supervised learning to self-supervised learning, reinforcement learning, and artificial general intelligence. However, they face several challenges for wide-scale deployment in wireless systems. Wireless communication networks must operate with almost no access to labeled data and must be particularly robust to dynamically changing network environments that vary at extremely short timescales. Suitable ML solutions should thus (i) generalize well to new scenarios with minimal supervision and (ii) perform effectively on mobile and severely resource-constrained devices. Addressing these challenges requires effectively combining wireless domain knowledge with a deep understanding of ML methods to develop high-performance, practical solutions that optimize existing and future communication systems.

Exploiting synergies between applying ML techniques to optimize communication networks and distributing ML workloads across resource-constrained and unreliable networks requires a cross-disciplinary virtuous co-design cycle that could transform both wireless networking and ML technologies. Modern communication networks need to support diverse and mission-critical services over time-varying wireless channels, mandating automated approaches that can operate online, reliably, and in real-time. At the same time, learning at the wireless edge must address several challenges, including device heterogeneity and resource constraints, privacy concerns, limited data, and partially-observable environments.

This workshop seeks to bring ML and wireless networking experts together to identify inter-disciplinary approaches to evolve ML algorithms for and over communication networks that operate under constrained resources, including limits on time, labeled data, and computational capacity. The workshop will provide a unique opportunity to expose the MLSys community to the challenges and opportunities of integrating ML methods into resource-constrained communication networks. It will also highlight emerging trends in ML with limited resources and their implications for the design and operation of next-generation communication networks.



Speakers



Call for Papers


We are seeking original submissions on topics including, but not limited to:

  • Learning in wireless networks with limited training data
  • Multi-agent federated/distributed learning with low computational and communication resources
  • Compression of communicated data for network-wide task completion
  • Online learning with wireless latency constraints
  • Learning in wireless networks with privacy constraints
  • Few-shot learning and adaptation in wireless environments
  • Datasets and benchmarks for resource-constrained learning in wireless networks

All submissions must be anonymous and should not include any information that violates the double-blind review process, including citing authors' prior work or sharing links in a way that can reveal the identities of authors to potential reviewers. After submission and during the review period, authors are allowed to post their papers as technical reports on arXiv or other public forums. Submitted papers should be in a 2-column format with 10-point font and can be up to 5 pages long, not including references and appendices. Authors may use as many pages of references and appendices as they wish, but reviewers are not required to read the appendices. Submissions that do not conform to these instructions may be desk-rejected at the Program Committee's discretion to ensure a fair review process for all potential authors.


Key Dates


  • Paper Submission Open: February 13, 2023.
  • Paper Submission Deadline: March 28, 2023, 11:59pm Eastern Time (extended from March 14 and March 21).
  • Acceptance Notification: May 8, 2023.
  • Camera Ready Deadline: May 22, 2023, 11:59pm Eastern Time.
  • Workshop presentations: June 8, 2023.

Program Committee


  • Ahan Kak
  • Anum Ali
  • Carlee Joe-Wong
  • Dinesh Bharadia
  • Ghosan Gadi
  • Le Liang
  • Mark Eisen
  • Mustafa Akdeniz
  • Rose Qingyang Hu
  • Santiago Segarra
  • Tianyi Zhou
  • Ervin Moore
  • Nasser Aldaghri
  • Basak Guler
  • Christos Louizos
  • Ehsan Aryafar
  • Jan Schreck
  • Marius Arvinte
  • Morteza Hashemi
  • Richard Dorrance
  • Sagar Dhakal
  • Songze Li
  • Yanzhi Wang

Program


You can watch the recording of the workshop here.


Time Window (Eastern Time) Event Description
8:50 - 9:00 Opening Remarks (Ervin Moore)
9:00 - 9:30 Invited Talk (Hyeji Kim)
9:30 - 10:00 Invited Talk (Osvaldo Simeone)
10:00 - 10:30 Invited Talk (Alejandro Ribeiro)
10:30 - 10:45 Break
10:45 - 11:15 Invited Talk (Tara Javidi)
11:15 - 11:45 Invited Talk (Salman Avestimehr)
11:45 - 12:45 Lunch Break
12:45 - 13:00 Paper Presentation (DHA-FL: Enabling Efficient and Effective AIoT via Decentralized Hierarchical Asynchronous Federated Learning)
13:00 - 13:15 Paper Presentation (Scalable Feature Compression for Edge-Assisted Object Detection Over Time-Varying Networks)
13:15 - 13:30 Break
13:30 - 14:00 Invited Talk (Olga Galinina)
14:00 - 14:15 Break
14:15 - 14:30 Paper Presentation (Trained-MPC: A Private Inference by Training-Based Multiparty Computation)
14:30 - 14:45 Paper Presentation (Efficient and Light-Weight Federated Learning via Asynchronous Distributed Dropout)
14:45 - 15:00 Paper Presentation (Over-the-Air Federated TD Learning)
15:00 - 15:05 Closing Remarks (Ervin Moore)

Event Host



Organizers