GitHub: https://github.com/dsai-2024/
Email: [email protected]
Friday Presentations: 1PM CST
Zoom.us Meeting ID: xxx
Password: xxx
This repo was created to consolidate all the materials for the MVPS Data Science Summer Camp.
Contents:
Subscribe to this repo's notifications to stay up to date, as materials will change during the camp week.
This is the "Daily Repo" and will include new materials from teachers and students.
If you find an interesting, short, compelling video that complements or introduces these materials, let me know.
In the meantime, this list includes materials we have just covered.
We integrated OpenAI's Whisper with Spot - 2 minutes
We integrated ChatGPT with our robots - 2 minutes
The Confusion Matrix in Machine Learning - 9 minutes
Active Learning. The Secret of Training Models Without Labels - 7 minutes
Introduction To Autoencoders In Machine Learning - 13 minutes
How Realistic Are Today’s Robots? - 17 minutes
Robot Deception: How Tech Companies Are Fooling Us - 11 minutes
How to learn to code FAST using ChatGPT (it's a game changer seriously) - 22 minutes
Google’s NEW Prompting Guide is Incredible - 10 minutes
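One video above covers the confusion matrix; as a companion, here is a minimal sketch of tallying one by hand for binary labels. The labels and predictions are made up for illustration and are not taken from the video.

```python
def confusion_matrix(y_true, y_pred):
    """Return (tp, fp, fn, tn) counts for binary 0/1 labels."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)
    return tp, fp, fn, tn

# Toy example: 6 samples
y_true = [1, 0, 1, 1, 0, 0]
y_pred = [1, 0, 0, 1, 1, 0]
tp, fp, fn, tn = confusion_matrix(y_true, y_pred)
print(tp, fp, fn, tn)  # 2 1 1 2
```

From these four counts you can derive accuracy, precision, and recall, which is exactly how the video frames them.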
3Blue1Brown:
- But what is a convolution? - 23 minutes
- But what is a neural network? | Chapter 1, Deep learning - 17 minutes
- But what is a GPT? Visual intro to transformers | Chapter 5, Deep Learning - 27 minutes
- Attention in transformers, visually explained | Chapter 6, Deep Learning - 26 minutes
StatQuest with Josh Starmer:
- The Essential Main Ideas of Neural Networks - 17 minutes
- Neural Networks Pt. 2: Backpropagation Main Ideas - 17 minutes
1hr Talk Intro to Large Language Models
Building a RAG application from scratch using Python, LangChain, and the OpenAI API
Train Llama-3 8B on Any Dataset on Free Google Colab
FREE & PRIVATE ChatGPT: Run LLMs locally on your laptop with Ollama!
PyTorch vs TensorFlow in 2024 - Make the Right Choice
KothaEd: Flower Classification Project in Python Deep Learning Neural Network Model Project in Python
Karpathy:
- minGPT - A minimal PyTorch re-implementation of the OpenAI GPT (Generative Pretrained Transformer) training loop - karpathy/minGPT
- Let's build GPT: from scratch, in code, spelled out - We build a Generatively Pretrained Transformer (GPT), following the paper "Attention Is All You Need" and OpenAI's GPT-2 / GPT-3.
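The transformer videos above (3Blue1Brown's Chapters 5-6 and Karpathy's GPT build) all center on scaled dot-product attention, softmax(QK^T / sqrt(d_k)) V. A minimal NumPy sketch with made-up dimensions (4 tokens, head size 8), not taken from any of the videos:

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Compute softmax(Q K^T / sqrt(d_k)) V, the core transformer operation."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)               # (seq, seq) similarity scores
    scores -= scores.max(axis=-1, keepdims=True)  # shift for numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)  # softmax over keys: rows sum to 1
    return weights @ V                            # weighted average of value vectors

rng = np.random.default_rng(0)
Q = rng.standard_normal((4, 8))  # 4 tokens, d_k = 8
K = rng.standard_normal((4, 8))
V = rng.standard_normal((4, 8))
out = scaled_dot_product_attention(Q, K, V)
print(out.shape)  # (4, 8)
```

Each output row is a mixture of the value rows, with mixing weights set by how similar that token's query is to every key; the videos build multi-head attention and the full GPT stack on top of this one step.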
While every effort was made to include current materials, please refer to the following for prior versions: