IBM's edge solution enables developers to securely and autonomously deploy deep learning services to a wide range of Linux edge devices, including GPU-enabled platforms such as the NVIDIA Jetson TX2. Leveraging JetPack 3.2's Docker support, developers can build, test, and deploy containerized cognitive services with GPU access for vision and audio inference, analytics, and other deep learning workloads.
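Container GPU access on the TX2 under JetPack 3.x typically works by passing the Tegra device nodes and driver libraries through to the container. A minimal sketch follows; the exact device paths can vary between JetPack releases, and `my-inference-service` is a placeholder image name, not part of IBM's offering:

```shell
# Sketch: run a containerized inference service with TX2 GPU access.
# The device nodes below are typical Tegra GPU nodes on JetPack 3.x;
# verify them on your device (ls /dev/nvhost*) before use.
docker run -d \
  --device=/dev/nvhost-ctrl \
  --device=/dev/nvhost-ctrl-gpu \
  --device=/dev/nvhost-prof-gpu \
  --device=/dev/nvmap \
  --device=/dev/nvhost-gpu \
  --device=/dev/nvhost-as-gpu \
  -v /usr/lib/aarch64-linux-gnu/tegra:/usr/lib/aarch64-linux-gnu/tegra:ro \
  my-inference-service
```

When a service is managed through IBM's edge solution instead of run by hand, equivalent device and volume mappings are declared in the service's deployment configuration rather than on the command line.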
With the beta of IBM's edge solution, developers can register custom service containers in IBM's Cloud repository, deploy and manage those services across fleets of devices using the Watson IoT Platform, collect insights from each edge node, and analyze them with IBM Watson and Cloud services.
Watch this webinar replay to learn how to:
- Register for IBM Watson IoT Platform services, including IBM's edge solution.
- Enable custom TX2 deep learning services and multi-service patterns for deployment through IBM's edge solution (demo: AI and deep learning inference on the TX2, leveraging TensorFlow, OpenCV, Keras, TensorRT, and Watson-Intu).
- Manage and collect insights from multiple edge nodes using the Watson IoT Platform, and display aggregate statistics.