From fcbecca61c9f6ee14f0c31b7ef5196de47a009a0 Mon Sep 17 00:00:00 2001 From: honghanhh Date: Mon, 20 Oct 2025 23:24:41 +0200 Subject: [PATCH] fix(typo): correct typos and grammatical errors in course content --- units/en/unit0/1.mdx | 4 ++-- units/en/unit1/1.mdx | 1 - units/en/unit1/3.mdx | 2 +- units/en/unit2/5.mdx | 2 +- 4 files changed, 4 insertions(+), 5 deletions(-) diff --git a/units/en/unit0/1.mdx b/units/en/unit0/1.mdx index a041910..ae00518 100644 --- a/units/en/unit0/1.mdx +++ b/units/en/unit0/1.mdx @@ -6,7 +6,7 @@ This free course will take you on a journey, **from classical robotics to modern This course is based on the [Robot Learning Tutorial](https://huggingface.co/spaces/lerobot/robot-learning-tutorial), which is a comprehensive guide to robot learning for researchers and practitioners. Here, we are attempting to distill the tutorial into a more accessible format for the community. -This first unit will help you onboard. You’ll see the course syllabus and learning objectives, understand the structure and prerequisites, meet the team behind the course, learn about LeRobot and the surrounding Huggnig Face ecosystem, and explore the community resources that support your journey. +This first unit will help you onboard. You'll see the course syllabus and learning objectives, understand the structure and prerequisites, meet the team behind the course, learn about LeRobot and the surrounding Hugging Face ecosystem, and explore the community resources that support your journey. > [!TIP] > This course bridges theory and practice in Robotics! It's designed for students interested in understanding how machine learning is transforming robotics. Whether you're new to robotics or looking to understand learning-based approaches, this course will guide you step by step.
@@ -97,7 +97,7 @@ We would like to extend our gratitude to the following projects and communities: Contributions are **welcome** 🤗 -* If you _found a bug or error_, please [open an issue](https://github.com/huggingface/robotic-course/issues/new) and **describe the problem**. +* If you _found a bug or error_, please [open an issue](https://github.com/huggingface/robotics-course/issues/new) and **describe the problem**. * If you _want to improve the course_, you can contribute to the robotics community through LeRobot development. * If you _want to add content or suggest improvements_, engage with the robotics community and share your ideas. diff --git a/units/en/unit1/1.mdx b/units/en/unit1/1.mdx index 0aab393..823102a 100644 --- a/units/en/unit1/1.mdx +++ b/units/en/unit1/1.mdx @@ -29,7 +29,6 @@ Today's robotics researchers are moving away from the traditional approach of wr - Extract useful information from many types of sensors (cameras, touch sensors, microphones) using data-driven methods. - Work effectively without needing perfect models of how the world behaves. - Take advantage of the growing number of open robotics datasets that anyone can access and learn from. -- Work effectively without needing perfect models of how the world behaves. -You can watch [this video](https://www.youtube.com/watch?v=VEs1QYEgOQo) to get a better sense of the paradigm shift currently undergoing in robotics. +You can watch [this video](https://www.youtube.com/watch?v=VEs1QYEgOQo) to get a better sense of the paradigm shift currently underway in robotics. diff --git a/units/en/unit1/3.mdx b/units/en/unit1/3.mdx index 1a5a84d..145a9eb 100644 --- a/units/en/unit1/3.mdx +++ b/units/en/unit1/3.mdx @@ -31,7 +31,7 @@ Think of it as two layers: a compact on‑disk layout for speed and scale, and a Datasets are always organized into three main components: -- **Tabular Data**: Low-dimensional, high-frequency data such as joint states, and actions are stored in efficient memory-mapped files, and typically offloaded to the more mature `datasets` library by Hugging Face, providing fast with limited memory consumption. +- **Tabular Data**: Low-dimensional, high-frequency data such as joint states and actions are stored in efficient memory-mapped files, and typically offloaded to the more mature `datasets` library by Hugging Face, providing fast access with limited memory consumption.
-- **Visual Data**: To handle large volumes of camera data, frames are concatenated and encoded into MP4 files. Frames from the same episode are always grouped together into the same video, and multiple videos are grouped together by camera. To reduce stress on the file system, groups of videos for the same camera view are also broke into multiple sub-directories. +- **Visual Data**: To handle large volumes of camera data, frames are concatenated and encoded into MP4 files. Frames from the same episode are always grouped together into the same video, and multiple videos are grouped together by camera. To reduce stress on the file system, groups of videos for the same camera view are also broken into multiple sub-directories. -- **Metadata**: A collection of JSON files which describes the dataset's structure in terms of its metadata, serving as the relational counterpart to both the tabular and visual dimensions of data. Metadata include the different feature schema, frame rates, normalization statistics, and episode boundaries. +- **Metadata**: A collection of JSON files which describes the dataset's structure in terms of its metadata, serving as the relational counterpart to both the tabular and visual dimensions of data. Metadata includes the different feature schemas, frame rates, normalization statistics, and episode boundaries. As you browse a dataset on disk, keep these three buckets in mind—they explain almost everything you’ll see. diff --git a/units/en/unit2/5.mdx b/units/en/unit2/5.mdx index 47af0d5..08bb85a 100644 --- a/units/en/unit2/5.mdx +++ b/units/en/unit2/5.mdx @@ -15,7 +15,7 @@ First, a quick recap of the important concepts covered across the foundational u -**LeRobot Ecosystem:** You've gained a fundamental understanding of LeRobot's approach to robotics. This includes understanding the vision behind LeRobot as an end-to-end robotics library, aiming at integrating the different aspects of robotics altogether. You have also learned about the LeRobotDataset format, which handles the complexity of multi-modal robotics data, and got practical experience with loading and processing real robotics datasets for machine learning applications. +**LeRobot Ecosystem:** You've gained a fundamental understanding of LeRobot's approach to robotics. This includes understanding the vision behind LeRobot as an end-to-end robotics library, aiming to integrate the different aspects of robotics. You have also learned about the LeRobotDataset format, which handles the complexity of multi-modal robotics data, and gained practical experience with loading and processing real robotics datasets for machine learning applications.
-Next, you will learn how to synthetize autonomous control behaviors directly from data, and deploy them on real-world robots using lerobot. +Next, you will learn how to synthesize autonomous control behaviors directly from data, and deploy them on real-world robots using lerobot. **Classical Robotics Foundations:** We've examined the traditional approaches to robotics in detail, covering different types of robot motion including manipulation, locomotion, and mobile manipulation. You've learned about forward and inverse kinematics, differential kinematics, and feedback control systems. Most importantly, you've developed an understanding of why classical approaches, despite their mathematical rigor, struggle with the complexity and variability of real-world robotic applications.