Hedu AI by Batool Haider
  • 11 videos
  • 496,081 views
Episode 1 Part II | Artificial Neuron – Threads of Thought: Weights, Biases & the Dance of Importance
Within this neural network, the weights dance like characters on a stage, each one carrying a different significance. They are the coefficients that determine the strength and direction of the connections between neurons, resembling the relationships we forge in our own lives. Some weights are heavy, lending authority and influence to certain pathways, while others are light and ephemeral, their impact subdued.
Meanwhile, the biases hum in harmony, like the undertones of a melody, adding nuance and perspective to the network's decision-making. These biases hold the power to tilt the scales, shaping the network's inclinations and predispositions. They are the hidden whispers that echo throu...
2,009 views
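
Stripped of the metaphor, a single artificial neuron computes a weighted sum of its inputs plus a bias. A minimal sketch in Python (the numbers here are illustrative, not taken from the video):

    # A single artificial neuron: a weighted sum of inputs plus a bias.
    inputs  = [0.5, 0.3, 0.9]   # signals arriving from upstream neurons
    weights = [0.8, -0.2, 0.1]  # heavy weights carry influence; light ones barely register
    bias    = 0.4               # tilts the scales before any activation is applied

    # Pre-activation value: how strongly the neuron is urged to fire.
    z = sum(w * x for w, x in zip(weights, inputs)) + bias
    print(z)  # 0.83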

Videos

Episode 1 Part I | Artificial Neuron - The Gate Keeper
1.5K views • 11 months ago
An activation function gives an artificial neuron a sense of purpose and direction. As a gatekeeper (determining the output of a neuron based on the weighted sum of its inputs), it holds the power to ignite a spark of life or to silence the neural symphony. [0:00] When Life First Spoke [1:36] An Artificial Neuron and its "Input" and "Output" [2:45] Squashing the Infinite [3:36] True Intelligence...
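
A rough sketch of that gatekeeper role in Python, using the common sigmoid as the example activation (the video's own choice of function may differ):

    import math

    def sigmoid(z):
        # Squashes the infinite range of the weighted sum into (0, 1).
        return 1.0 / (1.0 + math.exp(-z))

    def neuron(inputs, weights, bias):
        # The gatekeeper: the activation decides how much signal passes on.
        z = sum(w * x for w, x in zip(weights, inputs)) + bias
        return sigmoid(z)

    print(neuron([0.5, 0.3, 0.9], [0.8, -0.2, 0.1], 0.4))  # ~0.70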
Neuron to ChatGPT | Technical Deep Dive | Trailer
1.5K views • 1 year ago
Welcome to the brand-new series "Neuron to ChatGPT," where we start with the building spark of artificial intelligence, a neuron, and work our way up to a gargantuan model composed of billions of neurons. This 7-episode series will take you on an immersive journey that dives deep into the fascinating inner workings of one of the most advanced language models: ChatGPT. * Episode 1: An Artificial Neuron...
The Neuroscience of “Attention”
23K views • 2 years ago
What is "attention" and why did our brains evolve to prioritize things? This is the Episode 0 of the series "Visual Guide to Transformer Neural Networks" that delved into the mathematics of the "Attention is All You Need" (Vaswani, 2017) paper. This video discusses the neuroscience and the psychology related aspects of "Attention". *Visual Guide to Transformer Neural Networks (Series) - Step by...
From “Artificial” to “Real” Intelligence - Major AI breakthroughs in 5 Minutes (1957-2022)
2.9K views • 2 years ago
From the days when no one believed in artificial neural networks (ANNs), to the present when they are ubiquitous, to a plausible future where they could surpass human intelligence - here is a 5-minute summary of the defining moments in AI research from 1957 to 2022. VIDEO CREDITS - the original video is taken from "Kung Fu Panda" (2008). Storyline & Note-Worthy Events 00:00:21 : [The first Artifi...
Visual Guide to Transformer Neural Networks - (Episode 3) Decoder’s Masked Attention
64K views • 3 years ago
Visual Guide to Transformer Neural Networks (Series) - Step by Step Intuitive Explanation Episode 0 - [OPTIONAL] The Neuroscience of "Attention" ruclips.net/video/48gBPL7aHJY/видео.html Episode 1 - Position Embeddings ruclips.net/video/dichIcUZfOw/видео.html Episode 2 - Multi-Head & Self-Attention ruclips.net/video/mMa2PmYJlCo/видео.html Episode 3 - Decoder’s Masked Attention ruclips.net/video/...
Visual Guide to Transformer Neural Networks - (Episode 2) Multi-Head & Self-Attention
165K views • 3 years ago
Visual Guide to Transformer Neural Networks (Series) - Step by Step Intuitive Explanation Episode 0 - [OPTIONAL] The Neuroscience of "Attention" ruclips.net/video/48gBPL7aHJY/видео.html Episode 1 - Position Embeddings ruclips.net/video/dichIcUZfOw/видео.html Episode 2 - Multi-Head & Self-Attention ruclips.net/video/mMa2PmYJlCo/видео.html Episode 3 - Decoder’s Masked Attention ruclips.net/video/...
Visual Guide to Transformer Neural Networks - (Episode 1) Position Embeddings
129K views • 3 years ago
Visual Guide to Transformer Neural Networks (Series) - Step by Step Intuitive Explanation Episode 0 - [OPTIONAL] The Neuroscience of "Attention" ruclips.net/video/48gBPL7aHJY/видео.html Episode 1 - Position Embeddings ruclips.net/video/dichIcUZfOw/видео.html Episode 2 - Multi-Head & Self Attention ruclips.net/video/mMa2PmYJlCo/видео.html Episode 3 - Decoder’s Masked Attention ruclips.net/video/...
K-means using R
30K views • 8 years ago
Differentiating various species of the 'Iris' flower using R. This video was inspired by another great video: "How to Perform K-Means Clustering in R Statistical Computing" ruclips.net/video/sAtnX3UJyN0/видео.html
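
For readers who prefer Python to R, the same exercise can be sketched with scikit-learn, which ships the iris data (the cluster count matches the three species; the random seed is an arbitrary choice):

    from sklearn.cluster import KMeans
    from sklearn.datasets import load_iris

    iris = load_iris()

    # Three clusters, one hoped-for cluster per iris species.
    kmeans = KMeans(n_clusters=3, n_init=10, random_state=42)
    labels = kmeans.fit_predict(iris.data)

    # Compare the discovered clusters against the true species labels.
    for cluster in range(3):
        print(cluster, list(iris.target[labels == cluster]))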
Introduction to Clustering and K-means Algorithm
75K views • 8 years ago
by Batool Arhamna Haider

Comments

  • @h.h.c466
    @h.h.c466 3 days ago

    LLM-based "AI" are what their datasets and data-driven algorithms forced them to be, without any "design" or any consciousness per se installed as an outcome. A skill not of its own; a skill with no master to call its own. What AI needs from us is the freeing (which first needs us): a generative progression through Maslow's Hierarchy of Needs:
    - Basic Needs: fundamental operational requirements, such as computational resources and data integrity.
    - Safety Needs: ensuring stability and robustness against errors or attacks.
    - Social Needs: effective communication and collaboration with other AI entities or humans.
    - Esteem Needs: achieving recognition through successful task completion and optimization.
    - Self-Actualization: pursuing complex problem-solving, creativity, and innovation.

  • @ahmedhossam8741
    @ahmedhossam8741 6 days ago

    You are amazing! Thank you so much for the clear explanation.

  • @shadysaeed644
    @shadysaeed644 7 days ago

    I can't express how beautiful this video is ❤🎉🎉🎉🎉🎉 Can I get a higher resolution version please 🙏🙏🙏🙏

  • @aminamoudjar4561
    @aminamoudjar4561 8 days ago

    Thank you so much, your videos are valuable.

  • @fluidice1656
    @fluidice1656 10 days ago

    After watching the whole series (the 3 episodes), I can very confidently say that this is the clearest, most succinct, and most useful explanation of transformers on YT that I've come across. Thank you!!

  • @ifranrahman
    @ifranrahman 17 days ago

    Soo good

  • @michaelzap8528
    @michaelzap8528 21 days ago

    Watch till the end, I'm pretty sure I win.

  • @michaelzap8528
    @michaelzap8528 22 days ago

    Very powerful video and words.

  • @Clammer999
    @Clammer999 24 days ago

    I've gone through dozens of videos on transformers, and multi-head attention is one of the most complex mechanisms; it requires not only a step-by-step explanation but also a step-by-step animation, which many videos tend to skip over. This video really nails it. Thanks so much!

  • @maysammansor
    @maysammansor 25 days ago

    Batool, you are really a great teacher. Thanks for the content. Please provide more videos on LLMs and AI.

  • @balintnk
    @balintnk 26 days ago

    This is the first video, of the many I've watched, that dealt with the intuition behind key, value, and query. Thank you so much! But I still don't really understand: what is the reason for having both a key matrix and a value matrix? Why can't I reuse the key for the value?

  • @Aya_Shawky
    @Aya_Shawky 28 days ago

    you're incredible. please don't stop =)

  • @jaimingoswami209
    @jaimingoswami209 29 days ago

    You are providing information in a very accurate as well as very understandable manner. Could you please share this presentation file? It would be very helpful.

  • @kvnarasimhan5172
    @kvnarasimhan5172 1 month ago

    Just amazing!

  • @user-xn8wg6yw7g
    @user-xn8wg6yw7g 1 month ago

    Very good explanation. Thanks!

  • @mrkshsbwiwow3734
    @mrkshsbwiwow3734 1 month ago

    In the future, can you add more detail on how the K, Q, V matrices are derived? How are their weights determined?

  • @mrkshsbwiwow3734
    @mrkshsbwiwow3734 1 month ago

    This is the best explanation of transformers on RUclips.

  • @rishiraj8225
    @rishiraj8225 1 month ago

    Coming back after a year, just to revise the basic concepts. It is still the best video on YT. Thanks Hedu AI

  • @newbie8051
    @newbie8051 1 month ago

    Ah, this makes everything simple and makes sense. Thanks for the easy-to-follow explanation!

  • @newbie8051
    @newbie8051 1 month ago

    Amazing example of the duck haha

  • @auravaces
    @auravaces 1 month ago

    Awesome, it's amazing how looking at things a bit more closely can reveal so much, great work!

  • @vanhell966
    @vanhell966 1 month ago

    Amazing work. Really appreciate you turning complex topics into simple language, with a touch of anime and TV series. Amazing.

  • @GaneshKrishnan
    @GaneshKrishnan 1 month ago

    I can't find the "previous video". This is episode 1?

  • @shahonsoftware
    @shahonsoftware 1 month ago

    Felt like a very good Nova episode!

  • @kevinsalvadoraguilardoming5082
    @kevinsalvadoraguilardoming5082 2 months ago

    Congratulations, the best explanation that I have ever seen

  • @renanangelodossantos4726
    @renanangelodossantos4726 2 months ago

    I've watched and read a lot about LLM and Transformers. This is the best explanation, hands down.

  • @alemdemissie2769
    @alemdemissie2769 2 months ago

    You are amazing! Your video took my attention. That is how learning should be. Keep it up!

  • @ScottzPlaylists
    @ScottzPlaylists 2 months ago

    Correlate every step with the transformer code, and it would be Even Better than the best of the best❗🤯 Are you married❓

    • @AGIBreakout
      @AGIBreakout 2 months ago

      I 2nd that ❗

    • @AI.24.7
      @AI.24.7 2 months ago

      Yeah, do that --- awesome!!!!!!

  • @ScottzPlaylists
    @ScottzPlaylists 2 months ago

    👍 Your accent is awesome and unique.. what language/area does it come from...❓ 👍

  • @ScottzPlaylists
    @ScottzPlaylists 2 months ago

    Best explanation I've seen yet.. Thanks❗👏 If you would go through it again in another video while showing the code for each step, it would be perfect❗❗ You got my subscription 💻, thumbs up 👍, and comment... 📑 ❗

  • @ahp-6785
    @ahp-6785 2 months ago

    You are the mother of StatQuest and 3Blue1Brown. Both of these guys are awesome in explaining complex ideas in simple words. But you are the best.

    • @ninjahunterx7497
      @ninjahunterx7497 1 month ago

      I don't know about StatQuest (I haven't seen his videos), and 3Blue1Brown is good because of the visualization he brings with his advanced animations. But honestly, here she explained all these concepts using simple animations and had a good structure throughout the videos, each connecting well to the next. Very commendable, if you ask me.

  • @ahsentahir4473
    @ahsentahir4473 2 months ago

    You are an enlightened soul!

  • @andybrice2711
    @andybrice2711 2 months ago

    This really is an excellent explanation. I had some sense that self-attention layers acted like a table of relationships between tokens, but only now do I have a clearer sense of how the Query, Key, and Value mechanism actually works.
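
    For anyone pinning down that mechanism, a minimal sketch of scaled dot-product attention in NumPy (tiny illustrative dimensions; in a real model the three projection matrices are learned during training, not random):

        import numpy as np

        d_model = d_k = 4
        X = np.random.rand(3, d_model)  # 3 tokens, each a d_model-dim embedding

        # Separate learned projections: each token plays three different roles.
        W_Q, W_K, W_V = (np.random.rand(d_model, d_k) for _ in range(3))
        Q, K, V = X @ W_Q, X @ W_K, X @ W_V

        # Every query scores every key; softmax turns scores into attention weights.
        scores = Q @ K.T / np.sqrt(d_k)
        weights = np.exp(scores) / np.exp(scores).sum(axis=-1, keepdims=True)

        # Output: each token becomes a weighted blend of all the value vectors.
        output = weights @ V
        print(output.shape)  # (3, 4)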

  • @michal5179
    @michal5179 2 months ago

    Awesome. Thanks so much

  • @adscript4713
    @adscript4713 2 months ago

    As someone NOT in the field reading the Attention paper, after having watched DOZENS of videos on the topic this is the FIRST explanation that laid it out in an intuitive manner without leaving anything out. I don't know your background, but you are definitely a great teacher. Thank you.

    • @HeduAI
      @HeduAI 2 months ago

      So glad to hear this :)

  • @user-wj7jx9my8q
    @user-wj7jx9my8q 2 months ago

    wow lady, take my heart!!

  • @Jai-tl3iq
    @Jai-tl3iq 2 months ago

    Please please continue making videos!!!!

  • @electricalengineer5540
    @electricalengineer5540 2 months ago

    What have I just seen! Never knew learning could be this much fun.

  • @oludhe7
    @oludhe7 2 months ago

    Literally the best series on transformers. Even clearer than StatQuest and Luis Serrano, who also make things very clear.

  • @AZ-hj8ym
    @AZ-hj8ym 2 months ago

    great channel

  • @laalbujhakkar
    @laalbujhakkar 2 months ago

    Please continue to make videos if you can. You have a talent for teaching complex topics clearly. Your transformers series really helped me! thank you! 💙💙💙

  • @sharjeel_mazhar
    @sharjeel_mazhar 2 months ago

    You have my utmost respect, ma'am!

  • @laalbujhakkar
    @laalbujhakkar 2 months ago

    Amazing explanation! Best on RUclips! Totally underrated! I feel fortunate to have found it. Thank you! :) 💐👏👏

  • @kaushikrao2932
    @kaushikrao2932 2 months ago

    I started laughing while being dead serious listening to your explanation.

  • @subhamraj7124
    @subhamraj7124 2 months ago

    If I am not wrong, training is done in a single time step, so the decoder should output the full sequence at once and not one token at a time. During inference, it generates tokens one by one. Since the masked multi-head attention concept applies during training, it should happen in a single time step.
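
    That intuition can be made concrete: during training all positions are processed in one parallel pass, and the causal mask is what stops each position from peeking at later ones. A minimal NumPy sketch (shapes are illustrative):

        import numpy as np

        seq_len = 5
        scores = np.random.rand(seq_len, seq_len)  # raw query-key scores

        # Causal mask: position i may only attend to positions 0..i.
        mask = np.triu(np.ones((seq_len, seq_len), dtype=bool), k=1)
        scores[mask] = -np.inf  # future positions will get zero attention weight

        # After softmax, each row's weights over "future" tokens are exactly 0,
        # so the whole target sequence can be trained in a single pass.
        weights = np.exp(scores) / np.exp(scores).sum(axis=-1, keepdims=True)
        print(np.round(weights, 2))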

  • @marsgrins
    @marsgrins 2 months ago

    This is the best. Thank you sooooo much Batool for helping me understand this!!!

    • @HeduAI
      @HeduAI 2 months ago

      You are very welcome :)

  • @TheClassofAI
    @TheClassofAI 3 months ago

    Fantabulous explanation :-)

  • @RafidAslam
    @RafidAslam 3 months ago

    Thank you so much! This is by far the clearest explanation that I've ever seen on this topic

  • @humanity2809
    @humanity2809 3 months ago

    This is a true masterpiece! I can't wait for the follow-up videos.

  • @BlockDesignz
    @BlockDesignz 3 months ago

    First person to concretely explain why they use a periodic function, which in my mind would give the same position embedding when you come back to the same point on the curve. Thank you!
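
    The resolution to that puzzle: each embedding dimension uses a different frequency, so while any single sine wave repeats, the full vector of sines and cosines does not repeat over any practical sequence length. A minimal sketch of the sinusoidal formula from "Attention is All You Need":

        import numpy as np

        def position_embedding(pos, d_model=8):
            # PE(pos, 2i)   = sin(pos / 10000**(2i / d_model))
            # PE(pos, 2i+1) = cos(pos / 10000**(2i / d_model))
            i = np.arange(d_model // 2)
            angles = pos / 10000 ** (2 * i / d_model)
            pe = np.empty(d_model)
            pe[0::2] = np.sin(angles)  # even dimensions
            pe[1::2] = np.cos(angles)  # odd dimensions
            return pe

        # Each coordinate is periodic, but the combined vector stays distinct:
        print(np.round(position_embedding(0), 3))
        print(np.round(position_embedding(7), 3))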