CNTK v.2.1 Release Notes
This is a summary of changes and new features in the Microsoft Cognitive Toolkit version 2.1.
Highlights of this Release
- cuDNN 6.0 integration
- Support of Universal Windows Platform (UWP)
- Improvements in the CNTK Keras backend
- Performance improvements
- New manuals, tutorials and examples
- Bug fixes
Breaking changes
This release contains the following breaking changes:
- Updated ROI pooling to match the Caffe implementation. The signature has the following updates:
  - the parameters `pooling_type` and `spatial_scale` were added, and
  - the coordinates for the parameter `rois` are now absolute to the original image size
- C++ API: `NDShape::Unknown` was changed to `NDShape::Unknown()` to remove a static variable from the header.
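The new `rois` coordinate convention can be illustrated with a small NumPy sketch of max ROI pooling (the helper `roi_max_pool` below is hypothetical, not CNTK's kernel): ROI coordinates are given as absolute pixels of the original image and are projected into feature-map space via `spatial_scale`, matching the Caffe behavior.

```python
import numpy as np

def roi_max_pool(fmap, roi, output_shape, spatial_scale):
    """Toy max ROI pooling (illustrative only, not CNTK's implementation).

    fmap:          2-D feature map, shape (H, W)
    roi:           (x1, y1, x2, y2) in ABSOLUTE original-image pixels
    output_shape:  (ph, pw) pooled output size
    spatial_scale: ratio of feature-map size to original-image size
    """
    H, W = fmap.shape
    # Project the ROI from image coordinates into feature-map coordinates
    x1, y1, x2, y2 = [int(round(c * spatial_scale)) for c in roi]
    # Clamp to the feature-map bounds
    x1, y1 = max(x1, 0), max(y1, 0)
    x2, y2 = min(x2, W - 1), min(y2, H - 1)
    ph, pw = output_shape
    h, w = y2 - y1 + 1, x2 - x1 + 1
    out = np.empty((ph, pw), dtype=fmap.dtype)
    for i in range(ph):
        for j in range(pw):
            # Each output cell max-pools one roughly equal sub-window of the ROI
            ys = y1 + (i * h) // ph
            ye = y1 + ((i + 1) * h + ph - 1) // ph
            xs = x1 + (j * w) // pw
            xe = x1 + ((j + 1) * w + pw - 1) // pw
            out[i, j] = fmap[ys:ye, xs:xe].max()
    return out

# An 8x8 feature map derived from a 128x128 image => spatial_scale = 1/16
fmap = np.arange(64, dtype=np.float32).reshape(8, 8)
pooled = roi_max_pool(fmap, roi=(0, 0, 127, 127),
                      output_shape=(2, 2), spatial_scale=1.0 / 16)
print(pooled)
```

The full-image ROI (0, 0, 127, 127) maps onto the whole 8x8 feature map, so each of the four output cells is the maximum of one 4x4 quadrant.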
New and updated features
- Preview of a Reinforcement Learning framework for CNTK. See details here. Please be aware that this is a first preview; many changes are likely in the future
- More flexible Python-based user deserializer. See the manual for detailed information
- Caffe to CNTK model converter. See the Converter ReadMe. Examples illustrating the usage of the converter are available in CNTK Codebase.
- Reduction over multiple axes: the mean of the input tensor's elements can now be computed across a specified axis or a list of specified axes
- Support of Universal Windows Platform (UWP) in CNTK Evaluation library (see section below)
- New and improved features in the CNTK Keras backend (see section below)
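The multi-axis reduction feature above can be sketched with NumPy, whose `axis` argument accepts a tuple of axes analogously to the list of axes CNTK's reduction operations now accept:

```python
import numpy as np

# A 2 x 3 x 4 tensor of consecutive values
x = np.arange(24, dtype=np.float32).reshape(2, 3, 4)

# Mean across a single specified axis
m1 = np.mean(x, axis=0)       # result has shape (3, 4)

# Mean across a list of specified axes, computed in one call
m2 = np.mean(x, axis=(0, 1))  # result has shape (4,)

print(m1.shape, m2.shape)
```

Before this release, reducing over several axes required chaining one reduction per axis; a single call over a list of axes expresses the same computation directly.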
NVIDIA cuDNN 6.0 integration
GPU editions of CNTK Version 2.1 on Windows and Linux are shipped with the NVIDIA CUDA Deep Neural Network library (cuDNN) v.6.0. This improves CNTK performance with networks like ResNet 50 by about 10%.
If you build CNTK from source, you should also install NVIDIA cuDNN 6.0, since it is now the default for CNTK build and test on Windows and Linux.
CNTK Evaluation library. Support of Universal Windows Platform (UWP)
With this release we introduce support for the Universal Windows Platform (UWP). A new CNTK NuGet package, CNTK, UWP CPU-Only Build, is available for download. For further details see the description of Model Evaluation on Universal Windows Platform. To build the CNTK UWP evaluation library, see the description here.
CNTK backend for Keras
Version 2.1 introduces the following improvements for the CNTK backend for Keras:
- Stateful recurrent network support
- Support for recurrent layers with masking
- Fix for a batch normalization layer issue
Performance improvements
- ResNet 50 performance improvement (reduced memcpy and memset during training). Expect single-machine training speed to improve by about 8%
- Improvements in the CNTK reader through index caching
- We have successfully tested CNTK's ability to train ResNet 50 and Inception V3 with large minibatch sizes, as proposed in a recent paper. This confirms CNTK's minibatch-scaling capability not only for speech but for image tasks as well
New manuals, tutorials, examples and courses
Manuals
- How to train a model using the declarative and imperative APIs
- How to create user minibatch sources
- How to feed data
- Debugging CNTK programs
- How to use learners
- How to write a custom deserializer
Tutorials and Examples
- Training Acoustic Model with Connectionist Temporal Classification (CTC) Criteria
- Flapping Bird using Keras and Reinforcement Learning
- Faster R-CNN object detection
- Using CNTK V2 API in Azure WebAPI for model evaluation
- CVPR 2017 - July 26, 2017, Tutorial by Emad Barsoum, Sayan Pathak and Cha Zhang, Scalable Deep Learning with Microsoft Cognitive Toolkit. Presentation
Bug fixes
Version 2.1 provides fixes for the following bugs:
- Concurrency issue in parallel evaluations
- Intermittent reference error in `next_minibatch`
- Checkpointing of `bptt`
CNTK NuGet package
A new set of NuGet Packages (version 2.1.0) is provided with this release, including the new package CNTK, UWP CPU-Only Build (see section on Support of UWP above).
Acknowledgments
We thank the following community members for their contributions:
- @arturl
- @boeddeker
- @chivee
- @dadebarr
- @DGideas
- @frankibem
- @GeoffChurch
- @imriss
- @juice500ml
- @lakshayg
- @makrei
- @michhar
- @mikhail-barg
- @souptc
- @taehoonlee
- @vermorel
- @vmazalov
- @wsywl
- @yuxiaoguo
We apologize for any community contributions we might have overlooked in these release notes.