Two Minute Papers - 2021-05-14
❤️ Check out Lambda here and sign up for their GPU Cloud: https://lambdalabs.com/papers 📝 The paper "Learning and Exploring Motor Skills with Spacetime Bounds" is available here: https://milkpku.github.io/project/spacetime.html ❤️ Watch these videos in early access on our Patreon page or join us here on YouTube: - https://www.patreon.com/TwoMinutePapers - https://www.youtube.com/channel/UCbfYPyITQ-7l4upoX8nvctg/join 🙏 We would like to thank our generous Patreon supporters who make Two Minute Papers possible: Aleksandr Mashrabov, Alex Haro, Alex Serban, Andrew Melnychuk, Angelos Evripiotis, Benji Rabhan, Bryan Learn, Christian Ahlin, Eric Haddad, Eric Martel, Gordon Child, Haris Husic, Ivo Galic, Jace O'Brien, Javier Bustamante, John Le, Jonas, Kenneth Davis, Lorin Atzberger, Lukas Biewald, Matthew Allen Fisher, Mark Oates, Michael Albrecht, Nikhil Velpanur, Owen Campbell-Moore, Owen Skarpness, Ramsey Elbasheer, Robin Graham, Steef, Taras Bobrovytsky, Thomas Krcmar, Torsten Reil, Tybie Fitzhugh, Ueli Gallizzi. If you wish to appear here or pick up other perks, click here: https://www.patreon.com/TwoMinutePapers Meet and discuss your ideas with other Fellow Scholars on the Two Minute Papers Discord: https://discordapp.com/invite/hbcTJu2 Károly Zsolnai-Fehér's links: Instagram: https://www.instagram.com/twominutepapers/ Twitter: https://twitter.com/twominutepapers Web: https://cg.tuwien.ac.at/~zsolnai/
I am a dancer. You can't imagine how much insight this paper brings into the fact that every dancer has a slightly different style of movement for the same figure. Thank you!
You know it's a great day when you wake up to two minute papers
You are too kind. Thank you so much!
The best part of waking up... is two minute papers in your cup!
Oh yes! I cannot get enough of AI related content
There’s only one thing as good as waking up to 2MP, and that’s going to bed to 2MP.
Soo true
"This poor thing just finished writing a conference paper and is barely alive" 😂🤣💀
Someone wrote the script out of experience
the 'use more energy' ones cracked me up, especially the first one
I just did. Thank you very much for dropping by! 🙏
“Here I'm gonna mess with the energy engine to get enough momentum to bunny-hop over this wall and skip the car chase.”
I like demos more, to be honest; it's fun to play around with them rather than watch someone else do it with limited information.
Hahahaha!!! That's killer dude.
what are you doing here bucko
The future of AI NPCs in video games will be amazing!
As impressive as this algorithm was, I'm not gonna lie, I'm disappointed that the locomotion result at 4:44 wasn't the weeb run.
Every time I hear DOCTOR Károly Zsolnai-Fehér I do the happy walk.
X2
Imagine when AIs rise up against us and robots start cartwheeling towards us.
I have performed tens of thousands of backward flips and I can tell you the AI technique was waaaaay better than what it was learning from, and more likely to land cleanly. The motion it was learning from was rubbish work and would end up with bad knees, if not a faceplant, every time lol.
Some of the resulting movements are a little bit strange, but overall, and in principle, the results are still remarkable. Being able to combine different movements and create subtle variations will make crowd scenes and busy street scenes so much more believable. Is it possible that soon we will also be able to feed in written instructions and use sliders (as in the previous Two Minute Papers)? That would really be fabulous!
how? patreon?
I never thought of crowd scenes, but you are right!
My thoughts went to making robotics more lifelike. But I could also see this being applied to video game characters, to make motions less repetitive.
your own virtual actors!
Imagine a game where you can visibly see your character's skills improving as you level up.
I'm still a fan of that old Google video of the AI guy running through the maze with his li'l arms flailing. :-D
I remember when it was tough just getting an AI to walk in the first place! Look how far we have come!
Quite a lovely paper, I hope to see this in video games soon enough.
Something tells me that we might already have that wish fulfilled, just not to the same capacity.
Toribash involves physics based fighting with 3d stickmen, but it's more of a proof of concept than a polished game.
It's cool to see things like a split stance for jumping and movement. The "styled" looks more like a natural athlete.
Does this AI's ability to reduce energy during motion mean it could theoretically make machines with simpler motions, or make certain movements in sport more efficient and optimized?
Incredible paper! Imagine if the AI could learn the style of a parkour athlete based on his motions! I imagine that parkour video-games are going to be interesting! When you improve physical skills, your energy expenditure would be higher and movements more intense! What a time to be alive!!
Those are some awesome parameters! I thought the body volume parameter had realistic results. Figuring out those naturally good parameters could lead to realistic body language and emotion
The moment i saw this video, i knew Prof Yin Kangkang had a hand in it.
It reminded me so much of the time she asked in class: "mathematically, what is a bone?"
Today we get to find out, "mathematically, what is a style?"
Some of these simulations made me laugh so much. Thanks Károly, you've made my day 😆
I wish all papers would disclose the compute hardware used to train their models. I would love to try some (probably not this model) and have a play with them, but I only have a 1080. Compared to servers with multiple GPUs, it's not much, and some models take days to train. :(
Sounds like you should check out Lambda GPU Cloud!
@Sheepless honestly I simply don't want to learn how to do that. I'm human and most humans are lazy :P
Surely it's not that hard to disclose the hardware used.
@Ethan this paper does disclose the hardware and software they used, in the 'results' section.
@Sheepless yes! Yay i love it
I would love to work on something like this. Do you know what educational backgrounds the researchers have?
Man Doc, it's so great to hear someone so passionate and excited about science. It's truly inspiring and really motivated me to watch more content like this instead of the usual mindless crap for hours on end. Thanks a million bud, you truly are an inspiration to thousands of bright yet rusty minds and have really made an impact on me.
i want to see this AI blended with the one from a few months ago - where a simulation was used in the robot dog with no lidar or cameras
I always thought locomotion learning should be influenced by energy. While this is not the same, I'm glad it was considered at all.
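For what it's worth, one common way to "influence learning by energy" is to subtract an actuation-cost term from the tracking reward in a reinforcement learning setup. A minimal sketch below — the function name, weights, and exact form are my own illustration, not the paper's actual formulation:

```python
import numpy as np

def styled_reward(pose_error, joint_torques, w_track=1.0, w_energy=0.05):
    """Hypothetical reward shaping: reward staying close to the reference
    motion, but penalize squared joint torques so the controller can trade
    tracking fidelity for lower-effort movement 'styles'."""
    tracking = np.exp(-w_track * np.sum(np.square(pose_error)))  # in (0, 1]
    energy = w_energy * np.sum(np.square(joint_torques))          # effort cost
    return tracking - energy

# Raising w_energy pushes the learned motion toward lazier, lower-torque
# variants; lowering it allows more energetic, exaggerated movement.
```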
Woohoo... finally! I've been wondering for a long time when something like this would come about, considering most of us love to watch dances.
The styled movement and broken symmetry make it look more human. Surely this will make crowd movement in games or films look more natural, because you won't spot suspiciously synchronised steps, etc.
Was going to ask whether or not you could combine multiple references in order to increase the amount of possible "styles", but it seems that the developers went two steps further than that. Awesome.
Hi sir, I am new to your channel and really love your content. Can you make a video on how an AI like AlphaGo played against itself a million times? How is that even possible? It seems like it should take many years to do that.
I don't know if I'll get into machine learning anytime soon but I decided to start coding yesterday. I'm studying 3D animation in college. I decided to learn coding on my own so I could make my own plug ins and scripts if I need them.
I think once I'm comfortable with basic computer science I'll look into machine learning. I can see a lot of valuable artistic and pipeline potential for it.
This series helped me become a lot more comfortable with advanced computer science, and to feel less afraid of it. It made me feel capable of dipping my toes in to the more advanced topics. I actually talked about watching it when I interviewed for the undergrad research position I did this year. Can't say I'm useful yet, but its really exciting to be part of the research. I probably wouldn't have thought about / felt comfortable about joining research if not for this show. So thank you for that :D
This idea of spacetime is so simple.
Imagine the possibilities for animation blending in computer games by implementing something similar.
This could make programmed cut scenes so much easier if you have e.g. a rudimentary grabbing animation and physical target resolved with IK.
No need for a full on grabbing simulation to make it look good.
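The IK half of that idea is standard machinery: snap the hand to a physical target with a solver, and let the rough grabbing animation carry the style. A toy planar two-bone solver (textbook law-of-cosines, hypothetical names, not code from the paper) looks roughly like:

```python
import math

def two_bone_ik(l1, l2, tx, ty):
    """Solve a planar two-bone chain (e.g. shoulder-elbow-wrist) with bone
    lengths l1, l2 for a target (tx, ty) relative to the root joint.
    Returns (shoulder_angle, elbow_bend) in radians."""
    d = math.hypot(tx, ty)
    # clamp target distance to the reachable annulus of the chain
    d = max(min(d, l1 + l2 - 1e-9), abs(l1 - l2) + 1e-9)
    # interior elbow angle from the law of cosines; bend = pi - interior
    cos_elbow = (l1 * l1 + l2 * l2 - d * d) / (2 * l1 * l2)
    elbow = math.pi - math.acos(max(-1.0, min(1.0, cos_elbow)))
    # shoulder aims at the target, offset by the triangle's root angle
    cos_root = (l1 * l1 + d * d - l2 * l2) / (2 * l1 * d)
    shoulder = math.atan2(ty, tx) - math.acos(max(-1.0, min(1.0, cos_root)))
    return shoulder, elbow
```

In a game engine you would run something like this every frame on the arm joints while the body plays the rudimentary grab animation, so the hand lands exactly on the physical object.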
I have to wonder how hard this would be to transfer to other types of tasks. For instance, could someone use this on a GPT-like network to have it complete sentences within bounds (convey certain facts for instance) but with different styles (aggressive, consolatory, bored, etc.)
I can't wait for this to be implemented into video games.
That's a brilliant tech demo 😊 Did you answer the burning-tree-for-climate-models question and I missed it? I enjoy your breakdowns of papers quite a bit. It's nice to be kinda up to date with what's solved and what's not 😊👍
AI and ML are very inspiring fields but 2 minutes papers narrations make it super inspiring 🔥🔥🔥
Does anybody watch this and get excited about the applications to gaming? Imagine a character that navigates a diverse terrain by LEARNING how to traverse in different ways! Or even have different movement style based on the “energy level” of your character.
They should collab with the paper a while back which took keyframes for animation and made the in-between animations via AI. Then use this paper to add more style to those animations. Test idea: TMNT parkour over rooftops, using the one base animation for the sequence, and then this paper to do a style for each turtle (Mikey with more energy/freedom than say, Donnie).
Yes man, it inspires me as well. Thank you so much. The energy in your narration has a childlike excitement, which is very nice and gives me energy too.
There are very interesting and exciting combinations of medicine and machine learning so Nathan go for it.
Now we deserve a game like 'The Movies' again; with this paper it would be glorious.
Science is so fun, very entertaining !
I am imagining BMX tricks done by AI now... Perhaps 2 more papers down the line 😉
Can't wait for games to have NPCs with simulated lives and motions to achieve their needs. The NPCs would have limited access to a knowledge mainframe where all basic and game-relevant knowledge is stored, and based on their discoveries they would gain access to more of it, or maybe create new knowledge to add to the mainframe, or keep it exclusive to themselves (though I think having many copies of the same knowledge would be a waste of resources). Then you mix that with some sort of chatbot and voilà: interactive NPCs with AI.
When AI can recreate human emotion and inspiration
This may dangerously encourage us to live in a virtual world rather than the real one, once we can make everything more perfectly controlled than reality. We should, as a species, tread carefully here.
This is going to be huge in 3d-animated movies!
3:31 I started laughing when I saw the middle one :D I just thought it looked very funny for some reason :D
He's doing the DiCaprio walk!
perfect timing! I just got caught-up on your last two videos, and saw this one, too!
veggiet2009 - 2021-05-14
What fascinates me is that the "random style" actually does look noticeably less robotic. It would be cool to give a robot these tools to pre-simulate the simple motions it has to do: vacuuming, carrying objects. Whatever mundane tasks could, in theory, begin to look a lot more interesting for us humans to watch.
Francesco Rizzo - 2021-05-15
This, on a global scale, would mean a gigantic waste of energy (and pollution).
Let robots move like robots: precisely and as cheaply as possible.
veggiet2009 - 2021-05-16
@Francesco Rizzo if they're in a factory sure, but the robot that's straightening the products on shelves in a retail store every hour throughout the day could stand to have a little personality
TheChristmasCreeper - 2021-05-16
It's probably not a coincidence that one of the 3D models was Boston Dynamics "Atlas" robot.