Category: Releases
-
DeepSpeed4Science Initiative Release
The DeepSpeed4Science initiative is finally released! Check out our new system support for structural biology research, which addresses the memory explosion problem in Evoformer-centric protein structure prediction models, and our new Megatron-DeepSpeed framework release, which enables extremely long sequences for domain scientists through both systematic and algorithmic approaches.
-
DeepSpeed4Science Enables Very-Long Sequence Support via both Systematic and Algorithmic Approaches for Genome-scale Foundation Models
New Megatron-DeepSpeed with Long Sequence Support: Code and Tutorial
Model partner: Argonne National Lab
Introduction
As shown in Figure 1, GenSLMs, a 2022 ACM Gordon Bell award-winning genome-scale language model from Argonne National Lab, can learn the evolutionary landscape of SARS-CoV-2 (COVID-19) genomes by adapting large language models (LLMs) to genomic data. It is…
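Why long genomic sequences are hard to begin with: naive self-attention materializes a score matrix that grows quadratically with sequence length. A back-of-the-envelope sketch (the head count and fp16 element size here are illustrative assumptions, not GenSLMs' actual configuration):

```python
def attn_score_bytes(seq_len, n_heads=16, bytes_per_el=2):
    """Estimate memory for the [n_heads, seq_len, seq_len] attention
    score matrix of one layer, one batch element, in naive attention.

    n_heads and fp16 storage (2 bytes/element) are illustrative
    assumptions for this estimate.
    """
    return n_heads * seq_len * seq_len * bytes_per_el

# A 100k-token genomic sequence already needs ~320 GB for one layer's
# scores, far beyond a single GPU -- hence systematic (parallelism)
# and algorithmic (attention-variant) long-sequence support.
print(attn_score_bytes(100_000))  # 320000000000 bytes = 320 GB
```

This is why quadratic-memory attention, not parameter count, is the first wall hit when scaling sequence length.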
-
DS4Sci_EvoformerAttention: eliminating memory explosion problems for scaling Evoformer-centric structural biology models
DS4Sci_EvoformerAttention: Code and Tutorial
Model partner: OpenFold team, Columbia University
Introduction
OpenFold is a community reproduction of DeepMind’s AlphaFold2 that makes it possible to train or fine-tune AlphaFold2 on new datasets. Researchers have used it to retrain AlphaFold2 from scratch, producing new sets of model parameters, and to study the early training phase of AlphaFold2 (Figure…
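To see where the "memory explosion" in Evoformer-centric models comes from: AlphaFold2's triangle attention attends over the pair representation, so each residue pair (i, j) attends along a third residue axis k, producing an O(N³) logit tensor. A rough estimate (head count and fp16 storage are illustrative assumptions, not OpenFold's exact configuration):

```python
def triangle_attn_logit_bytes(n_res, n_heads=4, bytes_per_el=2):
    """Estimate memory for the [n_heads, n_res, n_res, n_res] logit
    tensor of one triangle-attention layer over the pair representation.

    n_heads and fp16 storage (2 bytes/element) are illustrative
    assumptions for this estimate.
    """
    return n_res * n_res * n_res * n_heads * bytes_per_el

# At 1000 residues the logits of a single layer already reach ~8 GB,
# which is the cubic blow-up a fused, memory-efficient attention
# kernel like DS4Sci_EvoformerAttention is designed to avoid
# materializing.
print(triangle_attn_logit_bytes(1000))  # 8000000000 bytes = 8 GB
```

The cubic growth means doubling the protein length multiplies this tensor's footprint by eight, which is why training on longer crops quickly exhausts GPU memory without a specialized kernel.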