Changelog for 0.0.29 (fairinternal/xformers#1275)
__original_commit__ = fairinternal/xformers@b6c5e6c
danthe3rd authored and xFormers Bot committed Dec 26, 2024
1 parent 8fa35b5 commit 56be3b5
Showing 2 changed files with 10 additions and 4 deletions.
12 changes: 9 additions & 3 deletions CHANGELOG.md
@@ -4,11 +4,17 @@
All notable changes to this project will be documented in this file.
The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.0.0/),
and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0.html).

-## [0.0.28.post3] - TBD
-### Fixed:
-- Creating a `LowerTriangularMask` no longer creates a CUDA tensor
+## [0.0.29] - 2024-12-27
+### Improved:
+- [fMHA] Creating a `LowerTriangularMask` no longer creates a CUDA tensor
+- [fMHA] Updated Flash-Attention to `v2.7.2.post1`
+- [fMHA] Flash-Attention v3 is now used by `memory_efficient_attention` by default when available, unless an operator is enforced via the `op` keyword argument. Switching from Flash2 to Flash3 can make transformer training ~10% faster end-to-end on H100s
+- [fMHA] Fixed a performance regression in the backward pass of the `cutlass` backend (facebookresearch/xformers#1176), which is mostly used on older GPUs (e.g. V100)
+- Fixed `swiglu` operator compatibility with `torch.compile` on PyTorch 2.6
+- Fixed activation checkpointing of SwiGLU when AMP is enabled (facebookresearch/xformers#1152)
+### Removed:
+- Following PyTorch, xFormers no longer builds binaries for conda; pip is now the only recommended way to install xFormers
+- Removed unmaintained/deprecated components in `xformers.components.*` (see facebookresearch/xformers#848)

## [0.0.28.post3] - 2024-10-30
Pre-built binary wheels require PyTorch 2.5.1
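For context on the fMHA entries above, here is a minimal sketch of how the new behavior surfaces in user code, based on the public `xformers.ops` API; the Flash2 operator pair passed to `op` is just one example of enforcing a backend explicitly.

```python
# Minimal sketch (assumes the public xformers.ops API as of 0.0.29).
import torch
from xformers.ops import fmha, memory_efficient_attention

# Shapes are [batch, seq_len, num_heads, head_dim].
q = torch.randn(1, 1024, 8, 64, device="cuda", dtype=torch.float16)
k, v = torch.randn_like(q), torch.randn_like(q)

# As of 0.0.29, constructing this mask is a lightweight operation:
# it no longer allocates a CUDA tensor.
bias = fmha.attn_bias.LowerTriangularMask()

# Default dispatch: Flash-Attention v3 is selected automatically on
# hardware where it is available (e.g. H100).
out = memory_efficient_attention(q, k, v, attn_bias=bias)

# Enforcing a specific operator bypasses the automatic selection;
# here the forward/backward pair is pinned to Flash-Attention v2.
out2 = memory_efficient_attention(
    q, k, v, attn_bias=bias,
    op=(fmha.flash.FwOp, fmha.flash.BwOp),
)
```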
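The two SwiGLU entries can be exercised along these lines; this is a hedged sketch, and the `SwiGLU` module with its `(in_features, hidden_features)` constructor is an assumption about the `xformers.ops` API rather than a signature confirmed by this commit.

```python
# Hedged sketch of the two SwiGLU fixes above (module name and
# constructor arguments are assumptions about xformers.ops).
import torch
from torch.utils.checkpoint import checkpoint
from xformers.ops import SwiGLU

mlp = SwiGLU(in_features=512, hidden_features=1024).cuda()
x = torch.randn(8, 512, device="cuda", requires_grad=True)

# torch.compile path (the PyTorch 2.6 compatibility fix).
compiled = torch.compile(mlp)
y = compiled(x)

# Activation checkpointing under AMP (facebookresearch/xformers#1152).
with torch.autocast("cuda", dtype=torch.bfloat16):
    z = checkpoint(mlp, x, use_reentrant=False)
z.sum().backward()
```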
2 changes: 1 addition & 1 deletion version.txt
@@ -1 +1 @@
-0.0.29
+0.0.30
