Efficient Systems for Foundation Models

Workshop at the International Conference on Machine Learning (ICML) 2023.

➡️ ES-FoMo 2023 was a blast! Thank you for joining us at ICML in Hawaii, and stay tuned for a 2024 edition.

πŸ“ call for papers

We welcome submissions addressing emerging research questions and challenges in foundation model training and inference. We accept submissions spanning the entire spectrum of foundation models, from BERT-sized Transformers to large models with 100B+ parameters.

Beyond “standard” papers focusing on algorithms and evaluations, we encourage contributions that are more engineering-, systems-, or code-focused than what is commonly accepted at machine learning conferences, and that feature open codebases. Implementation details, performance benchmarks, and other nitty-gritty practicalities are welcome at this workshop.

Accepted works will be presented as posters during the workshop, and selected works will be featured as contributed talks.

📆 important dates

  • Submission deadline: 2023/05/31 (submit on 🤝 OpenReview);
  • Acceptance notification: 2023/06/19;
  • Camera-ready deadline: 2023/07/07.

❓ relevant topics

  • Training and inference:
    • Training and inference of large models;
    • Training on cloud/heterogeneous infrastructure;
    • Training and inference in bandwidth/compute/memory/energy-constrained settings;
    • Low-precision training and inference;
    • Decentralized/collaborative training and inference.
  • Algorithms for improved training and inference efficiency:
    • Model architectures for efficient inference and training;
    • Large batch training and optimization;
    • Compression algorithms for communication-efficient training.
  • Systems for foundation models:
    • System architectures for inference and training;
    • Programming languages and compilers for efficient training and inference;
    • Benchmarks for large model training and inference.

🥸 additional details

  • Formatting: all submissions must be in PDF format, following the ICML template (a minimal LaTeX skeleton is sketched after this list). Although this will not be enforced as a hard limit, we recommend that submissions be concise: up to four pages of main content, with unlimited additional pages for references and supplementary material. Supporting code may be provided either as a supplementary .zip or as a link to an anonymized repository.
  • Reviewing: reviewing will be double-blind, so authors should take care to adequately anonymize their submission, especially regarding supplementary code. Reviewers will be instructed to focus on correctness and relevance to the workshop.
  • Non-archival: this workshop is non-archival and will not result in proceedings; work presented at the workshop may also be submitted to other venues.
  • Dual submission: we welcome papers that may already have been accepted at ICML 2023 but are also relevant to this workshop.
  • Contact: reach out to esfomo.workshop@gmail.com.
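
For authors preparing a submission, here is a minimal sketch of a paper skeleton using the icml2023 style file from the official ICML template. This is only an illustration: the title, author name, affiliation key, and email below are placeholders, and the official template remains the authoritative reference for the required commands.

    \documentclass{article}

    % icml2023.sty ships with the official ICML template; without the
    % [accepted] option it typesets an anonymized submission.
    \usepackage{icml2023}

    % Running head shown at the top of each page (placeholder title).
    \icmltitlerunning{Efficient Training of an Example Model}

    \begin{document}

    \twocolumn[
    \icmltitle{Efficient Training of an Example Model}
    \begin{icmlauthorlist}
    \icmlauthor{Anonymous Author}{anon}
    \end{icmlauthorlist}
    \icmlaffiliation{anon}{Anonymous Institution}
    \icmlcorrespondingauthor{Anonymous Author}{anon@example.com}
    \icmlkeywords{efficiency, foundation models}
    \vskip 0.3in
    ]

    % Typesets the affiliation footnote and the ICML notice.
    \printAffiliationsAndNotice{}

    \begin{abstract}
    Main content is recommended to fit in four pages; references and
    supplementary material are unlimited.
    \end{abstract}

    \section{Introduction}
    % Up to four pages of main content go here.

    \end{document}

Compiling this skeleton with pdflatex, with icml2023.sty and its dependencies in the working directory, should produce an anonymized two-column PDF matching the expected format.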