
FlashAttention 2 — Making Transformers 800% Faster W/O Approximation - With Tri Dao of Together AI

Sep 16, 2023 · 1 min read

  • 1min Snip at 0:07:32

Author: Latent Space: The AI Engineer Podcast — CodeGen, Agents, Computer Vision, Data Science, AI UX and al…
Type: Podcast
Listen to episode (share.snipd.com)

