Unfamiliar Territory? LLM-based Road Rules Guide Simplifies Driving - DRIVE Labs Ep. 34

Adapting driving behavior to new environments, customs, and laws is a long-standing challenge in autonomous driving. LLaDA (Large Language Driving Assistant) is an LLM-based network that makes it easier to navigate unfamiliar places by providing real-time guidance on regional traffic rules in different languages, for both human drivers and autonomous vehicles. LLaDA will be powered by NVIDIA DRIVE Thor, which harnesses the generative AI capabilities of NVIDIA's Blackwell GPU architecture.
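The idea of injecting regional traffic rules into an LLM query can be sketched as follows. This is a minimal illustrative mock-up, not the actual LLaDA implementation: the function name, the rule table, and the prompt format are all assumptions for demonstration.

```python
# Hypothetical sketch: composing a locale-aware prompt for an LLM-based
# driving assistant. The rule notes and prompt layout are illustrative
# assumptions, not NVIDIA's actual LLaDA code.

# Minimal per-region traffic-rule notes (illustrative, not exhaustive).
REGIONAL_RULES = {
    "DE": "Keep right except to pass; some Autobahn sections have no general speed limit.",
    "JP": "Drive on the left; turning on a red light is not permitted.",
    "US-CA": "Right turn on red is allowed after a full stop unless signed otherwise.",
}

def build_guidance_prompt(region: str, situation: str, language: str = "English") -> str:
    """Compose a prompt asking an LLM for region-specific driving guidance."""
    rules = REGIONAL_RULES.get(region, "No local rule notes available.")
    return (
        f"You are a driving assistant. Local rules for {region}: {rules}\n"
        f"Current situation: {situation}\n"
        f"Respond in {language} with a short, actionable instruction."
    )

# Example: a driver from the US approaching an intersection in Japan.
prompt = build_guidance_prompt("JP", "Approaching a red light, planning a turn", "Japanese")
print(prompt)
```

In a real system, the returned prompt would be sent to the LLM, and the response could be voiced through a speech SDK such as NVIDIA Riva, as shown in the video.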

00:00:12 - Introducing LLaDA
00:00:36 - Multimodal neural interfaces integrate diverse types of data
00:00:53 - LLaDA can rapidly adapt to local traffic rules and customs
00:01:30 - The NVIDIA Riva speech SDK can process different languages
00:01:58 - LLaDA can also be applied to AV motion planning
00:02:28 - Accelerated by NVIDIA DRIVE Thor, built on the NVIDIA Blackwell architecture
00:03:02 - Visit our GitHub page and check out the LLaDA paper at CVPR 2024

Project page: https://boyiliee.github.io/llada/
Paper: https://arxiv.org/abs/2402.05932
Watch the full series here: https://nvda.ws/3LsSgnH
Learn more about DRIVE Labs: https://nvda.ws/36r5c6t
Follow us on social:
Twitter: https://nvda.ws/3LRdkSs
LinkedIn: https://nvda.ws/3wI4kue
#NVIDIADRIVE
Category: Hardware
Tags: NVIDIA, DRIVE Labs, CVPR