Framework

JAX

Composable transformations of Python+NumPy programs: differentiate, vectorize, JIT to GPU/TPU, and more.

#Google #Accelerated-Computing #NumPy #ML-Research

Overview

JAX is Autograd and XLA, brought together for high-performance machine learning research. It provides a familiar NumPy-like API but with the power to run on accelerators like GPUs and TPUs.
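A minimal sketch of the NumPy-like API (the array values here are illustrative; when a GPU or TPU is available, JAX places arrays on it by default):

```python
import jax.numpy as jnp

# Familiar NumPy-style operations, backed by XLA.
x = jnp.arange(6.0).reshape(2, 3)
y = jnp.dot(x, x.T)  # matrix product, same call shape as numpy.dot

print(y.shape)  # (2, 2)
```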

Features

  • grad: Automatic differentiation of Python functions.
  • jit: Just-in-time compilation to XLA for maximum performance.
  • vmap: Automatic vectorization for batch processing.
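The three transformations above compose freely. A small sketch, using a made-up squared-error function purely for illustration: differentiate it with `grad`, compile the result with `jit`, then batch it with `vmap`:

```python
import jax
import jax.numpy as jnp

# A plain Python+NumPy function (hypothetical example, not from JAX itself).
def loss(w, x):
    return jnp.sum((w * x - 1.0) ** 2)

# grad: derivative of loss with respect to its first argument, w.
dloss_dw = jax.grad(loss)

# jit: compile the gradient function to XLA.
fast_grad = jax.jit(dloss_dw)

# vmap: map the compiled gradient over a batch of x values,
# keeping w fixed (in_axes=(None, 0)).
batched_grad = jax.vmap(fast_grad, in_axes=(None, 0))

w = jnp.array([1.0, 2.0])
xs = jnp.array([[1.0, 1.0], [2.0, 2.0]])

print(batched_grad(w, xs).shape)  # (2, 2): one gradient per batch row
```

Because each transformation returns an ordinary function, the order of composition is up to the caller, e.g. `jax.jit(jax.vmap(jax.grad(loss), in_axes=(None, 0)))` works as well.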

Use Cases

  • High-performance physical simulations.
  • Modern deep learning research.
  • Composable numerical computing.