First of all, as far as I know, DeepStream only provides the tools needed to run inference in an optimized way on Nvidia hardware (GPUs, Jetson devices). In other words, it is not a training tool. For training you should use TLT (Transfer Learning Toolkit), which, as I understand it, is now called TAO (Train, Adapt, and Optimize).
To build DeepStream-based applications, you need the SDK. For deployment, however, the recommended route is the Docker images Nvidia publishes at https://ngc.nvidia.com/catalog/containers. In that case the SDK is not needed, since the image already contains everything required to run DeepStream applications.
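As a rough sketch, deploying with the NGC containers looks something like the following. Note that the exact image tag (here `6.2-triton`) depends on which DeepStream release you want, so check the NGC catalog for the current tags; the X11 forwarding flags are only needed if your pipeline renders to a display.

```shell
# Pull a DeepStream image from NGC (tag is an example; check the
# catalog page for the release you need).
docker pull nvcr.io/nvidia/deepstream:6.2-triton

# Run the container with GPU access. The X11 volume/env flags let
# on-screen sinks (e.g. nveglglessink) display on the host.
docker run --gpus all -it --rm \
    -v /tmp/.X11-unix:/tmp/.X11-unix \
    -e DISPLAY=$DISPLAY \
    nvcr.io/nvidia/deepstream:6.2-triton
```

Inside the container you get the DeepStream libraries, GStreamer plugins, and sample apps without installing the SDK on the host; only the Nvidia driver and the NVIDIA Container Toolkit need to be present on the host machine.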