Releases · keras-team/keras
Keras 3.6.0
Highlights
- New file editor utility: `keras.saving.KerasFileEditor`. Use it to inspect, diff, modify and resave Keras weights files. See basic workflow here (a short usage sketch also follows below).
- New `keras.utils.Config` class for managing experiment config parameters.
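Below is a minimal sketch of the file-editor workflow from the first highlight. The `summary()` and `save()` method names and the file paths are assumptions based on the "inspect ... resave" description, not a definitive API reference.

```python
import keras

# Open an existing weights file for inspection/editing (hypothetical filename).
editor = keras.saving.KerasFileEditor("model.weights.h5")

# Print the structure of the stored weights (assumed method name).
editor.summary()

# ...rename, delete, or modify entries as needed, then resave
# (assumed method name).
editor.save("model.edited.weights.h5")
```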
BREAKING changes
- When using `keras.utils.get_file` with `extract=True` or `untar=True`, the return value will be the path of the extracted directory, rather than the path of the archive.
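For example, a call like the following (with a hypothetical URL) now returns the directory containing the extracted files rather than the archive path:

```python
import keras

# With extract=True, the return value is the extracted directory,
# not the downloaded .tar.gz archive itself (the URL is hypothetical).
data_dir = keras.utils.get_file(
    origin="https://example.com/datasets/flowers.tar.gz",
    extract=True,
)
print(data_dir)
```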
Other changes and additions
- Logging is now asynchronous in `fit()`, `evaluate()`, and `predict()`. This enables 100% compact stacking of `train_step` calls on accelerators (e.g. when running small models on TPU).
  - If you are using custom callbacks that rely on `on_batch_end`, this will disable async logging. You can force it back by adding `self.async_safe = True` to your callbacks. Note that the `TensorBoard` callback isn't considered async safe by default. Default callbacks like the progress bar are async safe.
- Added `keras.saving.KerasFileEditor` utility to inspect, diff, modify and resave Keras weights files.
- Added `keras.utils.Config` class. It behaves like a dictionary, with a few nice features (see the sketch after this list):
  - All entries are accessible and settable as attributes, in addition to dict-style (e.g. `config.foo = 2` or `config["foo"]` are both valid).
  - You can easily serialize it to JSON via `config.to_json()`.
  - You can easily freeze it, preventing future changes, via `config.freeze()`.
- Added bitwise numpy ops:
  - `bitwise_and`
  - `bitwise_invert`
  - `bitwise_left_shift`
  - `bitwise_not`
  - `bitwise_or`
  - `bitwise_right_shift`
  - `bitwise_xor`
- Added math op `keras.ops.logdet`.
- Added numpy op `keras.ops.trunc`.
- Added `keras.ops.dot_product_attention`.
- Added `keras.ops.histogram`.
- Allow infinite `PyDataset` instances to use multithreading.
- Added argument `verbose` in the `keras.saving.ExportArchive.write_out()` method for exporting TF SavedModel.
- Added `epsilon` argument in `keras.ops.normalize`.
- Added `Model.get_state_tree()` method for retrieving a nested dict mapping variable paths to variable values (either as numpy arrays or backend tensors, the latter being the default). This is useful for rolling out custom JAX training loops.
- Added image augmentation/preprocessing layers `keras.layers.AutoContrast` and `keras.layers.Solarization`.
- Added `keras.layers.Pipeline` class, to apply a sequence of layers to an input. This class is useful to build a preprocessing pipeline. Compared to a `Sequential` model, `Pipeline` features a few important differences (see the sketch after this list):
  - It's not a `Model`, just a plain layer.
  - When the layers in the pipeline are compatible with `tf.data`, the pipeline will also remain `tf.data` compatible, independently of the backend you use.
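The `keras.utils.Config` sketch referenced above, showing attribute-style access, JSON serialization, and freezing; the entry names are made up for illustration.

```python
import keras

config = keras.utils.Config()
config.learning_rate = 1e-3   # attribute-style assignment
config["batch_size"] = 32     # dict-style assignment also works

print(config.batch_size)      # 32
print(config.to_json())       # serialize the whole config to a JSON string

config.freeze()               # prevent any further changes to the config
```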
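And the `keras.layers.Pipeline` sketch referenced above; it assumes the class takes a list of layers as its constructor argument, and the specific layers chosen are just for illustration.

```python
import keras

# A preprocessing pipeline: a plain layer (not a Model) that applies
# its layers in sequence.
preprocess = keras.layers.Pipeline([
    keras.layers.AutoContrast(),
    keras.layers.Rescaling(1.0 / 255),
])

images = keras.random.uniform((8, 64, 64, 3), minval=0.0, maxval=255.0)
outputs = preprocess(images)
```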
New Contributors
- @alexhartl made their first contribution in #20125
- @Doch88 made their first contribution in #20156
- @edbosne made their first contribution in #20151
- @ghsanti made their first contribution in #20185
- @joehiggi1758 made their first contribution in #20223
- @AryazE made their first contribution in #20228
- @sanskarmodi8 made their first contribution in #20237
- @himalayo made their first contribution in #20262
- @nate2s made their first contribution in #20305
- @DavidLandup0 made their first contribution in #20316
Full Changelog: v3.5.0...v3.6.0
Keras 3.5.0
What's Changed
- Add integration with the Hugging Face Hub. You can now save models to the Hugging Face Hub directly from `keras.Model.save()` and load `.keras` models directly from the Hugging Face Hub with `keras.saving.load_model()` (see the sketch after this list).
- Ensure compatibility with NumPy 2.0.
- Add `keras.optimizers.Lamb` optimizer.
- Improve `keras.distribution` API support for very large models.
- Add `keras.ops.associative_scan` op.
- Add `keras.ops.searchsorted` op.
- Add `keras.utils.PyDataset.on_epoch_begin()` method.
- Add `data_format` argument to `keras.layers.ZeroPadding1D` layer.
- Bug fixes and performance improvements.
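A minimal sketch of the Hugging Face Hub round trip; the `hf://` path prefix and the repo name are assumptions for illustration, and uploading requires being authenticated with the Hugging Face Hub client.

```python
import keras

model = keras.Sequential([keras.layers.Dense(1)])
model.build((None, 4))

# Save the model straight to a (hypothetical) Hub repo.
model.save("hf://my-username/my-tiny-model")

# Load the .keras model back from the Hub.
restored = keras.saving.load_model("hf://my-username/my-tiny-model")
```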
Full Changelog: v3.4.1...v3.5.0
Keras 3.4.1
This is a minor bugfix release.
Keras 3.4.0
Highlights
- Add support for arbitrary, deeply nested input/output structures in Functional models (e.g. dicts of dicts of lists of inputs or outputs; see the sketch after this list).
- Add support for optional Functional inputs.
- Introduce `keras.dtype_policies.DTypePolicyMap` for easy configuration of dtype policies of nested sublayers of a subclassed layer/model.
- New ops:
  - `keras.ops.argpartition`
  - `keras.ops.scan`
  - `keras.ops.lstsq`
  - `keras.ops.switch`
  - `keras.ops.dtype`
  - `keras.ops.map`
  - `keras.ops.image.rgb_to_hsv`
  - `keras.ops.image.hsv_to_rgb`
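Here is a minimal sketch of a Functional model with a nested dict-of-dicts input structure; the layer choices and input names are purely illustrative.

```python
import keras

inputs = {
    "clip": {
        "image": keras.Input(shape=(32, 32, 3), name="image"),
        "mask": keras.Input(shape=(32, 32, 1), name="mask"),
    },
    "meta": keras.Input(shape=(8,), name="meta"),
}

# Pool the image-like inputs and concatenate them with the flat features.
features = keras.layers.Concatenate()([
    keras.layers.GlobalAveragePooling2D()(inputs["clip"]["image"]),
    keras.layers.GlobalAveragePooling2D()(inputs["clip"]["mask"]),
    inputs["meta"],
])
outputs = {"score": keras.layers.Dense(1, activation="sigmoid")(features)}

model = keras.Model(inputs=inputs, outputs=outputs)
model.summary()
```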
What's changed
- Add support for `float8` inference for `Dense` and `EinsumDense` layers.
- Add custom `name` argument in all Keras Applications models.
- Add `axis` argument in `keras.losses.Dice`.
- Enable `keras.utils.FeatureSpace` to be used in a `tf.data` pipeline even when the backend isn't TensorFlow.
- `StringLookup` layer can now take `tf.SparseTensor` as input.
- `Metric.variables` is now recursive.
- Add `training` argument to `Model.compute_loss()` (see the sketch after this list).
- Add `dtype` argument to all losses.
- `keras.utils.split_dataset` now supports nested structures in dataset.
- Bug fixes and performance improvements.
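A minimal sketch of overriding `compute_loss()` with the new `training` argument; the extra regularization term is hypothetical, and the exact default signature may differ slightly between Keras versions.

```python
import keras

class RegularizedModel(keras.Model):
    def __init__(self):
        super().__init__()
        self.dense = keras.layers.Dense(1)

    def call(self, inputs):
        return self.dense(inputs)

    def compute_loss(self, x=None, y=None, y_pred=None, sample_weight=None, training=True):
        loss = keras.ops.mean(keras.ops.square(y - y_pred))
        if training:
            # Apply an extra (hypothetical) weight penalty only during training,
            # not during evaluate()/predict().
            loss = loss + 0.01 * sum(
                keras.ops.sum(keras.ops.square(w)) for w in self.trainable_weights
            )
        return loss
```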
Full Changelog: v3.3.3...v3.4.0
Keras 3.3.3
This is a minor bugfix release.
Keras 3.3.2
This is a simple fix release that re-surfaces legacy Keras 2 APIs that aren't part of the Keras package proper, but that are still featured in `tf.keras`. No other content has changed.
Keras 3.3.1
This is a simple fix release that moves the legacy `_tf_keras` API directory to the root of the Keras pip package. This is done in order to preserve import paths like `from tensorflow.keras import layers` without making any changes to the TensorFlow API files.
No other content has changed.
Keras 3.3.0
What's Changed
- Introduce float8 training.
- Add LoRA to ConvND layers.
- Add `keras.ops.ctc_decode` for JAX and TensorFlow.
- Add `keras.ops.vectorize`, `keras.ops.select`.
- Add `keras.ops.image.rgb_to_grayscale`.
- Add `keras.losses.Tversky` loss.
- Add full `bincount` and `digitize` sparse support.
- Models and layers now return owned metrics recursively.
- Add pickling support for Keras models (see the sketch after this list). Note that pickling is not recommended; prefer the Keras saving APIs.
- Bug fixes and performance improvements.
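A minimal sketch of the pickle round trip; the tiny model is illustrative, and the native `.keras` saving path mentioned in the comment remains the recommended approach.

```python
import pickle
import keras

model = keras.Sequential([keras.layers.Dense(2)])
model.build((None, 4))

# Round-trip the model through pickle (now supported), even though
# model.save("model.keras") / keras.saving.load_model(...) is preferred.
blob = pickle.dumps(model)
restored = pickle.loads(blob)
```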
In addition, the codebase structure has evolved:
- All source files are now in `keras/src/`.
- All API files are now in `keras/api/`.
- The codebase structure stays unchanged when building the Keras pip package. This means you can `pip install` Keras directly from the GitHub sources.
New Contributors
- @kapoor1992 made their first contribution in #19484
- @IMvision12 made their first contribution in #19393
- @alanwilter made their first contribution in #19438
- @chococigar made their first contribution in #19323
- @LukeWood made their first contribution in #19555
- @AlexanderLavelle made their first contribution in #19575
Full Changelog: v3.2.1...v3.3.0
Keras 3.2.1
Keras 3.2.0
What changed
- Introduce a QLoRA-like technique for LoRA fine-tuning of `Dense` and `EinsumDense` layers (and thereby any LLM) in int8 precision (see the sketch after this list).
- Extend `keras.ops.custom_gradient` support to PyTorch.
- Add `keras.layers.JaxLayer` and `keras.layers.FlaxLayer` to wrap JAX/Flax modules as Keras layers.
- Allow `save_model` & `load_model` to accept a file-like object.
- Add quantization support to the `Embedding` layer.
- Make it possible to update metrics inside a custom `compute_loss` method with all backends.
- Make it possible to access `self.losses` inside a custom `compute_loss` method with the JAX backend.
- Add `keras.losses.Dice` loss.
- Add `keras.ops.correlate`.
- Make it possible to use cuDNN LSTM & GRU with a mask with the TensorFlow backend.
- Better JAX support in `model.export()`: add support for aliases, finer control over `jax2tf` options, and dynamic batch shapes.
- Bug fixes and performance improvements.
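A minimal sketch of the int8 + LoRA workflow from the first bullet; it assumes the `Model.quantize("int8")` and `enable_lora(rank=...)` APIs apply to this toy model, so treat it as illustrative rather than a definitive recipe.

```python
import keras

model = keras.Sequential([
    keras.layers.Dense(64, activation="relu"),
    keras.layers.Dense(8),
])
model.build((None, 16))

# Quantize the Dense kernels to int8 (assumed API: Model.quantize).
model.quantize("int8")

# Attach trainable low-rank LoRA adapters to the first Dense layer
# (assumed API: enable_lora); the frozen base weights stay in int8.
model.layers[0].enable_lora(rank=4)
```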
New Contributors
- @abhaskumarsinha made their first contribution in #19302
- @qaqland made their first contribution in #19378
- @tvogel made their first contribution in #19310
- @lpizzinidev made their first contribution in #19409
- @Murhaf made their first contribution in #19444
Full Changelog: v3.1.1...v3.2.0