Flash Attention
Commit 1a2c3e8c, authored 1 year ago by Tri Dao
Bump to v2.4.2

Parent: 73df3be7
Branches and tags containing this commit: main, v2.5.9, v2.5.9.post1, v2.5.8, v2.5.7, v2.5.6, v2.5.5, v2.5.4, v2.5.3, v2.5.2, v2.5.1, v2.5.1.post1, v2.5.0, v2.4.3, v2.4.3.post1, v2.4.2
Changes: 2 changed files with 3 additions and 3 deletions

  flash_attn/__init__.py   +1 -1
  training/Dockerfile      +2 -2
flash_attn/__init__.py

-__version__ = "2.4.1"
+__version__ = "2.4.2"
 from flash_attn.flash_attn_interface import (
     flash_attn_func,
     ...
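For context on what this module exposes, flash_attn_func (re-exported here from flash_attn.flash_attn_interface) is the package's main attention entry point. A minimal usage sketch, assuming the 2.x calling convention with fp16/bf16 tensors of shape (batch, seqlen, nheads, headdim) on a CUDA device; it is an illustration, not part of this commit:

    import torch
    from flash_attn import flash_attn_func

    # Assumed shapes: (batch, seqlen, nheads, headdim), fp16 on a CUDA device.
    batch, seqlen, nheads, headdim = 2, 1024, 8, 64
    q = torch.randn(batch, seqlen, nheads, headdim, device="cuda", dtype=torch.float16)
    k = torch.randn(batch, seqlen, nheads, headdim, device="cuda", dtype=torch.float16)
    v = torch.randn(batch, seqlen, nheads, headdim, device="cuda", dtype=torch.float16)

    # causal=True applies standard autoregressive masking.
    out = flash_attn_func(q, k, v, dropout_p=0.0, causal=True)
    print(out.shape)  # (batch, seqlen, nheads, headdim)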
training/Dockerfile
@@ -85,11 +85,11 @@ RUN pip install transformers==4.25.1 datasets==2.8.0 pytorch-lightning==1.8.6 tr
 RUN pip install git+https://github.com/mlcommons/logging.git@2.1.0
 # Install FlashAttention
-RUN pip install flash-attn==2.4.1
+RUN pip install flash-attn==2.4.2
 # Install CUDA extensions for fused dense, layer norm
 RUN git clone https://github.com/HazyResearch/flash-attention \
-    && cd flash-attention && git checkout v2.4.1 \
+    && cd flash-attention && git checkout v2.4.2 \
     && cd csrc/layer_norm && pip install . && cd ../../ \
     && cd csrc/fused_dense_lib && pip install . && cd ../../ \
     && cd .. && rm -rf flash-attention
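The wheel pinned in the pip install line and the tag used for the CUDA-extension checkout are bumped separately, so a quick post-build check can confirm the installed package reports the expected version. A minimal sketch, not part of this commit:

    import flash_attn

    # Illustrative sanity check: after this commit, both the pinned wheel and
    # the checked-out tag should correspond to 2.4.2.
    expected = "2.4.2"
    assert flash_attn.__version__ == expected, (
        f"flash-attn version mismatch: got {flash_attn.__version__}, expected {expected}"
    )
    print(f"flash-attn {flash_attn.__version__} installed")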