How to Teach a Robot: From Moving Arms to Autonomy with Physical AI
Show notes
How do you actually teach an AI-powered robot?
For decades, robots in industry have followed one principle: You program every single step. Every movement. Every position. Every exception.
And if something changes, you start again.
That approach is reaching its limits.
As environments become less structured and processes more dynamic, the question shifts: How do you move from programming robots… to teaching them?
You'll gain insights into:
- how to physically guide a robot arm
- what a VR headset, a gripper replica, and a helmet camera have in common
- why data quality matters more than data quantity
- how close we really are to just talking to a robot and getting an answer
More about RobCo:
Website: https://www.rob.co
LinkedIn: https://www.linkedin.com/company/robco-therobotcompany/
Instagram: https://www.instagram.com/robco_therobotcompany/
Chapter markers
00:00 How do you actually teach an AI robot?
01:13 Traditional robot programming
03:08 RobFlow: no-code meets the factory floor
05:30 Overview: Five ways to teach a robot
06:21 Method 1: moving the arm by hand
08:23 Method 2: the leader arm and haptic feedback
10:41 Method 3: VR goggles as a teaching device
15:39 Method 4: the gripper replica in your hand
17:47 Method 5: motion capture and ego data
22:00 Rich data vs. massive data: What works better?
27:09 How far away is voice-controlled robotics?
31:10 Why humanoid hardware is still the bottleneck
35:42 Learning robots open a completely new dimension
39:00 We're using AI like a typewriter, what's next?
Show transcript
00:00:00: So welcome back. Today we are talking about how to teach an AI robot.
00:00:03: And I think that is one of the really fascinating topics, especially when you're talking about physical AI, because the way we're using AI right now is on a voice level.
00:00:12: And how do you teach a robot to do the things you want it to do?
00:00:17: We'll talk about the traditional way, whose concepts I think many of you will know, but the future ways are really interesting.
00:00:22: Clemens already brought some cool stuff with him.
00:00:25: There's a little box down there.
00:00:26: Be excited, there's real stuff in there to look at, to see how teaching is done.
00:00:30: I'm really excited, because for me... I always thought there were just some lines of code that get the robot to do things, but it's way more difficult than that.
00:00:41: And actually, if you're thinking about a process, there are many ways we have to think about right now, especially looking at physical AI.
00:00:48: That's why I am so excited to be here in our podcast with Clemens.
00:00:54: Our topic today is how to teach an AI robot, now and in the future.
00:00:59: RobTalk, the autonomous robotics podcast. Physical AI, no theory, just reality.
00:01:10: Welcome, Clemens.
00:01:10: Thank you for being here.
00:01:12: And thank you for having me.
00:01:13: Yeah, so let's jump right in. Tell us a little bit about the traditional ways of how to teach a robot, or how to program robots to do the things we want them to do. How was it until now?
00:01:28: I mean, industrial robots are an old technology.
00:01:31: So back then you would almost have an assembly language; every robot had its own language.
00:01:38: Some robots use something that looks like Pascal, if you know that.
00:01:42: So, languages from the eighties.
00:01:45: Meanwhile there is usually some graphical interface on top.
00:01:50: But yeah, robots are a very mature technology.
00:01:57: Mm-hmm. Is it more like, if you look at, for instance, Java, developing Java with an IDE, which has become more and more comfortable to code in?
00:02:06: Is it that way, or is it really... you have to know machine code?
00:02:10: Do we need to know how very specific coding works?
00:02:15: I mean, meanwhile you can do some high-level commands, so you just tell the robot where to go in certain places, but then you break down all the different pieces into small steps.
00:02:27: And then it is just programmed to repeat exactly what you told it to.
00:02:31: What's the most frustrating part about this?
00:02:34: It is cumbersome, because it's a very manual process, right? And so things can become quite elaborate in certain projects... and suddenly you have hundreds of steps that one robot has to perform for a particular task.
00:02:49: Basically you always have to code and test things forward, and if you change one thing, then you basically have to reprogram.
00:02:57: In the worst case, everything, right?
00:02:59: That's it, or something small changes in the environment... you've got to go in and correct it.
00:03:03: Yeah, that can take a lot of time.
00:03:06: So what is your approach to do this better? You have something at RobCo called RobFlow.
00:03:13: Tell us how that improves on the traditional way of teaching a robot.
00:03:20: It is just a very interactive, very graphical way to do this.
00:03:24: So we don't just have lists of things that you do.
00:03:26: You have a graphical flow, and the parts of the flow are pretty high level.
00:03:32: So people who use it say, wow, this is very simple, and you can go in.
00:03:38: You have all the parameters that you need, but also not more, and you can move the robot into a particular position and then take over those parameters, and so on.
00:03:49: I mean, RobCo started with the goal of making these things very simple, and this is as far as it gets with traditional robotics.
00:03:58: Cool!
00:03:59: And so when you're looking at the RobCo system, how does that make it so much better than the traditional way?
00:04:08: Like...
00:04:10: You have all the buttons in the right places, even if you don't know programming, and it's very graphical.
00:04:16: You get lots of feedback, you see what the robot looks like in a simulator, and so that makes it easy.
00:04:23: And who usually uses the RobFlow system?
00:04:25: Is it the customer, or is it you guys implementing the robot?
00:04:29: I think it's both!
00:04:30: So we have customers using it to program their flows directly. But then often enough, it is all embedded in a project where the goal is to implement a particular solution for a customer problem.
00:04:45: And there are professionals who come and do this kind of stuff, so we're also making our own lives more efficient.
00:04:54: Absolutely!
00:04:55: Have you seen situations where, because of systems like this, a robot is used instead of people saying, oh no, this is way too complicated, we'll use a human?
00:05:08: I think it's still an area where you need professionals, right? And so I think... we wish we could do it a hundred times easier.
00:05:20: That's also where our vision is, right? Eventually we want to just talk to the robot and have it understand its environment.
00:05:26: Yeah!
00:05:27: That was the next topic we're coming to: how to teach robots in the future.
00:05:30: Right?!
00:05:31: And obviously there are different teaching methods, and I would love to jump into those, because we already had a pre-call.
00:05:39: We talked about the different... there were five levels of teaching a robot.
00:05:43: Maybe you can take us through these methods?
00:05:45: Because they are going to be interesting in the future, right?
00:05:50: Right!
00:05:50: So there are actually many more. Five is just a short list of things that totally make sense, and I'm going through a couple of them, but I won't say this is the one, because that would be too confusing.
00:06:07: Because in the end there are probably a couple of them that we'll choose.
00:06:11: We have a certain idea of which ones will work best, but, you know, everybody in the industry is making their little bets on what works best.
00:06:19: So these will be ours as well!
00:06:21: But generally it's like this... you can start with, for example, moving the robot directly and just recording what changes in its settings, where the joint angles are.
00:06:37: Moving the robot arm, for instance, right?
00:06:40: So that would be the most direct way to do it.
00:06:44: It's pretty cumbersome, especially when you also want to use sensors.
00:06:48: Then you probably have a person in view who moves the robot arm.
00:06:55: You lose certain forces, because you're basically moving it as a human, which means that the robot itself cannot record them.
00:07:03: So there are certain pros and cons, but it's the most direct way of doing things.
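A minimal sketch of what this first, record-and-replay method could look like in code, assuming a hypothetical `robot` object with gravity compensation and joint read/command calls (illustrative only, not a real RobCo API):

```python
import time

RATE_HZ = 50  # how often we sample the demonstration

def record_demonstration(robot, duration_s):
    """Sample joint angles while a person moves the arm by hand."""
    robot.enable_gravity_compensation()  # arm becomes freely movable
    trajectory = []
    t_end = time.time() + duration_s
    while time.time() < t_end:
        trajectory.append(robot.read_joint_angles())  # e.g. six values, in radians
        time.sleep(1.0 / RATE_HZ)
    robot.disable_gravity_compensation()
    return trajectory

def replay(robot, trajectory):
    """Command the recorded joint angles back at the recording rate."""
    for q in trajectory:
        robot.command_joint_angles(q)
        time.sleep(1.0 / RATE_HZ)
```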
00:07:08: And then on the other extreme... you put up a video camera to film a human doing stuff, or you watch YouTube videos.
00:07:17: That's crazy.
00:07:18: So that's the range we can see now, with steps in between.
00:07:25: Yeah, which are you more interested in?
00:07:29: Let's start with the robot arm and then go up to some of the more difficult ones. I mean, I understand it is much easier to get the positioning right with the robot arm if you can move it to a position, press record, move it into the next one, press record, you know, moving to the next one, and maybe even give it the shape of the movement.
00:07:47: Because if you have to program that with traditional code, putting in every coordinate is obviously cumbersome. But then obviously it's kind of like...
00:07:56: That is the most basic way of teaching, I guess.
00:07:58: And it still takes a lot of time, if we're looking at AI right now and what we are envisioning.
00:08:05: Then you want to tell the robot, hey, do this for me... That's what we're all doing with AI right now.
00:08:10: We're telling the AI: hey, this is the result I want, please do that for me.
00:08:16: So the steps toward that are getting more interesting.
00:08:19: Okay, so one thing: you could use a second arm, right? And then copy those things over.
00:08:26: Now I've brought one.
00:08:29: Did you bring a second robot arm?
00:08:33: Actually, one that everybody can build themselves.
00:08:36: So cool.
00:08:36: So this is something that is open source, and it's used by a lot of students and people who just want to build a robot at home.
00:08:44: It's also relatively cheap.
00:08:45: But the only things it has are these small drives, where you can read the position from.
00:08:54: Obviously, there's also the expensive version, where you have a full replica of the robot arm that you're interested in, and it even has force feedback.
00:09:06: So... it goes to the extent that when you move this teacher arm, or leader arm, and you touch something with the follower arm that works in parallel, then you get this haptic feedback.
00:09:24: And this can either be a one-to-one replica, or it could be a smaller one.
00:09:29: There are also some special haptic devices which allow you to get this feedback without having the full arm; this here is the cheaper version.
00:09:37: This is good for prototyping and testing.
00:09:43: Here you basically have a full six-degree-of-freedom arm.
00:09:46: Interesting, so you take it in your hand and move it around.
00:09:50: Then the robot arm, which is the real or expensive one, will move in the same way, and you can guide it by doing this?
00:09:59: Exactly!
00:10:00: So the nice thing: there's no, what we call, embodiment gap.
00:10:05: You can read the exact positions from the robot arm that you want to move. But of course, it is still a little bit cumbersome.
00:10:17: It can work in some cases, but you also want to make sure that this leader arm is sturdy enough, because somebody might sit on it at some point, right?
00:10:30: So we don't want to break it every time.
00:10:31: So it's one option to do it, and probably not the only one that we would use.
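As a rough sketch, a leader-follower setup like the one described is essentially a loop that mirrors the leader's joint readings onto the follower and, in the haptic variant, feeds measured contact torques back into the leader's drives. The `leader` and `follower` APIs here are hypothetical stand-ins:

```python
import time

def teleoperate(leader, follower, scale=1.0, rate_hz=100):
    """Mirror a leader arm onto a follower arm, with optional haptics."""
    period = 1.0 / rate_hz
    while True:  # runs until the demonstration is stopped externally
        q_leader = leader.read_joint_angles()      # read the cheap hobby-servo drives
        q_target = [scale * q for q in q_leader]   # bridge a size mismatch, if any
        follower.command_joint_angles(q_target)
        # Haptic variant: when the follower touches something, push the
        # measured external torques back into the leader's drives so the
        # operator feels the contact.
        leader.apply_torques(follower.read_external_torques())
        time.sleep(period)
```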
00:10:37: Interesting.
00:10:38: Then I have another one.
00:10:41: Many people will know this.
00:10:42: Oh yeah, an Oculus, right?
00:10:44: This is a Meta Quest, but it could be any other brand.
00:10:49: There are a couple of these VR headset brands now, and you have this little controller.
00:10:54: These things are really small miracles from a technology point of view.
00:11:02: They don't cost that much, and they can find out where exactly in space this controller is. It has a bunch of buttons.
00:11:11: There's even a linear button that doesn't only know on or off; it has a bunch of values in between that I can read out.
00:11:19: Grip hard, grip loose, right?
00:11:21: So pressure sensitive?
00:11:23: Exactly. So if I have a pretty simple linear gripper, I could use that to manipulate it.
00:11:31: And with this combination, the glasses find out with their sensors where they are.
00:11:39: So they look at certain features of the room to find out whether your head has moved.
00:11:44: They have an IMU, so they know the accelerations; this way they really find out where the headset is in space, and then there is technology for finding out where this controller is relative to the headset.
00:11:58: This way you can now steer the gripper, or end effector, of your arm by just reading all these positions.
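A hedged sketch of one control tick of that idea: map the headset-tracked controller pose and the analog trigger value onto an end-effector target and a gripper width. The `vr` and `robot` objects are hypothetical, not a specific headset SDK:

```python
def vr_teleop_step(vr, robot, max_gripper_width=0.08):
    """One control tick: map tracked controller state onto the arm."""
    pose = vr.controller_pose_in_world()   # controller pose from headset tracking
    trigger = vr.trigger_value()           # analog trigger, value in [0, 1]
    # Only the end effector is specified, so inverse kinematics has to
    # reconstruct what the rest of the arm should be doing.
    joint_angles = robot.inverse_kinematics(pose)
    robot.command_joint_angles(joint_angles)
    # Map the analog trigger onto a simple linear gripper:
    # fully pressed means fully closed.
    robot.command_gripper_width((1.0 - trigger) * max_gripper_width)
```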
00:12:08: Um, disadvantages: now you have no feedback, really... if you run out of the workspace of the robot, you have to react to all these things.
00:12:20: If you crash into the table, you wouldn't even feel it here, but you would hear it or see it.
00:12:27: So that's another thing.
00:12:28: And of course you're restricted to just the end effector; you have to reconstruct what the rest of the arm should be doing.
00:12:34: Yeah, this is all possible!
00:12:36: You can use it pretty well.
00:12:38: And the third thing is, you have no tactile feedback as an operator.
00:12:46: You have to make up for that, because you're totally dependent on your visual system.
00:12:55: So we saw it works pretty well, but it's also not ideal in all the different use cases. And these are consumer devices; it's questionable if you want them on a factory floor, where you always get to the home page where you can download games and stuff.
00:13:15: Yeah, so that's one option.
00:13:18: That is interesting.
00:13:18: So you use your own software to basically show the arm in the VR goggles, and then you move it in that space, obviously with a view of all the objects you want to move?
00:13:32: It's very interesting!
00:13:33: I understand the limitation: it can only be replicated in a visual sense, but not a haptic one, since you could go out of that range and do something that's not possible, and it would still be recorded.
00:13:47: But if we record it out of bounds...
00:13:49: Yeah.
00:13:49: And this is where for the first time we get what would be called an embodiment gap, because you can also move this controller, for example, faster than the robot can move.
00:14:00: You don't have the same properties in terms of range, acceleration, and other forces.
00:14:08: So now you have to bridge that.
00:14:10: Yeah, so at what stage would you use it?
00:14:12: Like, probably as one of the things to create the movement, but not the only thing?
00:14:19: I think it really depends on the use case.
00:14:21: If you have relatively simple movements, it's perfectly usable. For other movements, we see that's probably not the most efficient way.
00:14:29: So the movements are maybe a bit slower than how you would do them as a human.
00:14:33: And you have the button, but the button is not very precise with the tactility you want.
00:14:41: If you need to lift something or grab something carefully, I think it's still difficult to do that with VR goggles.
00:14:50: You again have to look at it and find out how you would do this.
00:14:55: And then your feedback loop is your own vision system.
00:15:04: The VR headset has one advantage.
00:15:06: You can also use it to teleoperate the robot: by projecting a camera view onto the screen, you could be in a different room, maybe even in another part of the world, if you have a connection that is fast enough. And so maybe it's possible there to correct certain errors or investigate what's going wrong.
00:15:30: I mean, obviously that's very cool.
00:15:34: And then we have this device, also pretty interesting.
00:15:39: Here you have a replica of the gripper in your hand, and it also has some other sensors, like a camera and an IMU.
00:15:51: In this case it finds out where it is in space by having a beacon that sits somewhere in the room, so it localizes itself against that beacon.
00:16:02: Does it have a pressure sensor?
00:16:05: How strongly you're gripping, or just how far you're gripping?
00:16:09: Just how far. I think having pressure sensors in there is definitely something we would want at some point.
00:16:17: But here, for example, lifting this is now perfectly possible, and so whatever you can do with this gripper, you can eventually also do with the robot itself.
00:16:32: So basically what you're doing... I mean, this is really cool.
00:16:37: You have the beacon set up, and then you go to the object that you want to pick up and use the gripper.
00:16:41: The gripper knows exactly where it is in space and how much it has to grip, and sends that back to the software.
00:16:48: Then in the next iteration the robot arm knows at which position it needs to grip, and how much?
00:16:54: Yes, we still need to make sure it uses the same coordinates.
00:16:59: So what you have to do is have a second work table, or maybe move the robot away from this space, because now you operate there as a human.
00:17:09: So one small problem could be that the camera sees what you see.
00:17:14: Which might be different from when the robot itself gets into view: it has different information while it's running than while you were operating.
00:17:26: But I think this really makes clear what can actually be done by the robot.
00:17:33: Because if you can't do it with this thing, the robot also won't be able to do it.
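A sketch of how such a demonstration could be recorded and replayed, assuming hypothetical `handheld` and `robot` objects plus a calibrated transform from the beacon frame to the robot base, so both share coordinates as discussed above:

```python
import numpy as np

def record_grasp(handheld):
    """One demonstration sample from the handheld gripper replica."""
    return {
        "pose": handheld.pose_relative_to_beacon(),  # 4x4 transform, beacon frame
        "opening": handheld.jaw_opening(),           # gripper opening, in meters
    }

def replay_grasp(robot, grasp, beacon_to_base):
    """Replay a recorded grasp on the real arm."""
    # beacon_to_base is a fixed 4x4 transform calibrated once, so the
    # demonstration and the robot use the same coordinate frame.
    pose_in_base = beacon_to_base @ np.asarray(grasp["pose"])
    robot.move_end_effector_to(pose_in_base)
    robot.command_gripper_width(grasp["opening"])
```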
00:17:38: Interesting. Yeah, that's cool.
00:17:41: All right, and then?
00:17:44: So the more elaborate version of that is: if you have a complete replica of a human as a robot, a humanoid, then you could also use motion capture, for example.
00:17:58: But that becomes pretty elaborate.
00:18:00: So motion capture works by having markers on your body, maybe a special motion capture suit, and then having a bunch of cameras around you, between, let's say, twelve and twenty-four cameras.
00:18:15: And so whenever you move, it can directly record your position, with all the different body parts, at very high precision.
00:18:25: But that only makes sense if you really want to record whole-body human movement.
00:18:30: So when you think about the dancing robots, that's basically how those data sets were made.
00:18:36: I mean, there it makes sense, but for the factory floor you usually don't need that much.
00:18:40: Maybe in the future, when you're thinking of really elaborate setups; right now it's still pretty much arm movements.
00:18:50: Exactly.
00:18:51: So I think that's a bit overkill.
00:18:54: For industrial settings, there are trimmed-down versions of this, where you may only have a very accurate glove for the hands, which is a good data source because it could also have tactile sensors in the fingers.
00:19:11: But then, first of all, these are also very expensive.
00:19:14: And maybe also quite brittle.
00:19:17: And then the question is, okay, what do you do when you have this hand data?
00:19:20: So now we have another embodiment gap.
00:19:22: Either you employ a replica of human hands...
00:19:25: Yeah, which is a very hard problem, right?
00:19:28: Also imagine you want to have a hand that lives for more than a few days. And of course, you might not have the same senses on the hand as a human has.
00:19:44: Things like tactile feedback are still an open area of research.
00:19:52: And then you can go even further and say, let's not bother with such gloves, but just film the hands directly.
00:20:02: So one thing which became pretty popular is what we call ego data.
00:20:08: Okay, so you have a little camera on your forehead, and it films your hands while you're doing certain tasks. Then you run a pipeline of steps to understand, okay, what's actually being done?
00:20:21: So, for example, we have models which can turn pictures into language.
00:20:24: Right, everybody knows that.
00:20:26: Yeah, exactly.
00:20:27: Now there are connections between what somebody does and human language, and the other way around.
00:20:35: Then you can say, okay, now please do that, and so it has a connection with the physical world.
00:20:40: So that's actually a good segue into saying we want this level-five autonomy.
00:20:47: We just want to talk to the robot and instruct it, but it's a very long shot, right?
00:20:51: There are a ton of problems you need to solve there.
00:20:57: Also, you now only see the human hands.
00:20:59: You don't have any tactile information.
00:21:01: You don't know how hard the human grips.
00:21:03: You have occlusion, so fingers can be behind other objects.
00:21:09: So all of these things are actively being worked on.
00:21:12: People make huge bets and collect huge data sets around it.
00:21:17: And so the question is how much that will help us get toward physical AI.
00:21:23: I mean, that is a really interesting question.
00:21:26: So with the ego data, how much data do you need?
00:21:29: Because I'm pretty sure once is not enough, whereas with the second arm, one or two times already gives us lots of data, which makes it very accurate, because you're in the physical space. But head-worn data is easier to collect and understand.
00:21:45: A camera on top of a helmet is easy to collect data from.
00:21:48: But then you must need hundreds, maybe thousands, of iterations.
00:21:53: For a really precise understanding, and the GPU power behind that is also quite crazy.
00:21:58: Yeah, it's kind of a crazy field right now, because of course you can just take a really big hammer and take all of YouTube and try to learn things from there.
00:22:08: People are trying to do that.
00:22:10: They have made a lot of progress on this, but of course it's garbage in, garbage out.
00:22:17: What the model eventually has to learn is the whole distribution.
00:22:22: There are things, for example, like: how big is a pixel?
00:22:25: You cannot really learn anything that is below the size of a pixel.
00:22:33: Yeah, so it's still a question of either having small amounts of very rich and precise data, or huge amounts of not-so-precise data where you have to impute.
00:22:50: So in the end, it's going to be a basket of different data sources.
00:22:55: And then the interesting question is, okay, how much of that burden do you leave to the customer?
00:23:01: Yeah, and can you pre-train models such that... you have all these data sources behind an application, but only need a very small amount of data to implement a particular solution or problem?
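One way to picture that split, as a hedged sketch: a large policy pre-trained offline on the whole basket of data sources, then adapted on site with only a handful of customer demonstrations. All names here are illustrative, not a real training stack:

```python
def adapt_policy(pretrained_policy, customer_demos, epochs=10):
    """Fine-tune a broadly pre-trained policy on a few local demos."""
    for _ in range(epochs):
        for demo in customer_demos:          # e.g. only 10-50 demonstrations
            for observation, action in demo:
                # Behavior cloning: nudge the policy toward the
                # demonstrated action for this observation.
                loss = pretrained_policy.imitation_loss(observation, action)
                loss.backward()
                pretrained_policy.step_optimizer()
    return pretrained_policy
```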
00:23:17: I mean, what I could see happening, which makes a lot of sense, is giving the robot an overview.
00:23:25: Of what has been done, through the helmet camera, and then being very precise with the additional arm.
00:23:30: Like the arm you just showed, or maybe even the gripper, being very precise, and reducing the learning time on videos from a thousandfold to tenfold.
00:23:40: Maybe these combinations are going to be interesting; that solution is quite interesting as well.
00:23:45: Also, you know, the LLM models evolve and capture more data, more understanding.
00:23:51: And I think one of the hardest things we talked about is also teaching a model, an AI model, the real world.
00:23:58: I think that's really hard, right?
00:24:00: Yeah, but this is exactly where that comes in.
00:24:03: Right, I mean, you can also run things in simulation, but there's always a gap between simulation and reality.
00:24:11: So in the best case, you just have plenty of data from the real world.
00:24:17: Now, we know how a human learns about the physical world.
00:24:23: So if you think about the first year of a baby, that's maybe a thousand hours, and afterwards it knows all of Newtonian mechanics.
00:24:30: Yeah true!
00:24:31: And it knows all about different materials.
00:24:33: How they behave, that if something rolls behind a wall, it comes out on the other side, and so on.
00:24:40: That is the yardstick.
00:24:42: But there we see that it is possible; we just haven't found a way to get to exactly this level.
00:24:49: So right now we're talking about a hundred thousand, maybe ten thousand, times more data, because the models are just not as optimized as what a human does.
00:24:59: And to be honest, that baby has incredible sensors.
00:25:02: Yes, and it has incredible neuroplasticity.
00:25:07: So the learning centers are activated like crazy.
00:25:12: And that's when you see a baby doing things like dropping the ball five times, or even ten times, always so surprised, thinking, I can't believe this is happening, but then it understands and remembers.
00:25:24: Yeah, and it comes down to even empathy.
00:25:28: You watch a video, and for many people, if somebody gets hurt in the movie, you feel it yourself, right?
00:25:35: So we're kind of filling in the gaps when we see something.
00:25:41: And it's similar here: when you have a data source that is not as rich, for example with no tactile feedback...
00:25:46: You'd better have seen similar cases before with tactile feedback, because then the model can learn how to fill in these blanks.
00:25:54: And I thought the models were already much further along, but they need so much more data.
00:25:58: It's impressive.
00:26:01: The advancements are really fast, though, if you remember where these vision models were a few years ago, with Will Smith eating spaghetti, and the progression to how it is now, five years later.
00:26:15: It's an incredible advancement.
00:26:17: And I think we will see very similar advancements in the next three years.
00:26:22: Especially when talking about live recognition.
00:26:24: I think that's going to be also very interesting, when the models are so fast at recognition that they can do it live.
00:26:29: So it can basically process all of the data live; it doesn't even need to watch a video first to learn.
00:26:36: It's basically all happening at the same time.
00:26:37: I think that, with the accuracy we now see in the Will Smith spaghetti-eating videos, these things are going to be very interesting.
00:26:45: And then obviously compute cost goes down, and it gets even more interesting.
00:26:53: We're doing this as well, like learning while you're operating, but that's only the small tip of the iceberg, right?
00:27:02: So behind that is pre-training on tons and tons of data.
00:27:08: That was interesting.
00:27:09: So how far away are we then from talking to a robot?
00:27:12: Just telling it, hey, do this, do that, and then it executes, basically like ChatGPT or Gemini does right now.
00:27:20: I think this is a trillion-dollar question, right?
00:27:22: It could be anything from one year to ten years.
00:27:25: Wow. I would rather put it more toward the shorter end.
00:27:29: Cool, that's really interesting!
00:27:32: Thinking about having a robot understand what we want and then execute it for us... that is basically the dream we've all been waiting for, right?
00:27:43: It can definitely help solve things like the labor shortage.
00:27:49: Because the training must be there.
00:27:51: So if you take one step back, as I said, I've always wondered why humans are being used more than machines, for instance.
00:28:01: Because I thought that automation made sense.
00:28:03: But obviously, when it's really hard to do something with a robot, like, let's say, ten years ago, the programming is so hard.
00:28:11: Maybe the robot arm didn't even exist, or the machine didn't exist.
00:28:14: You just used humans because they were more available and cheaper.
00:28:17: Especially if we look at countries where labor is cheaper; then that would be used.
00:28:22: Because, well, you teach a human and they do it right ninety percent of the time, right?
00:28:28: But now we're coming to a situation where you buy a couple of robots and you can reteach them on a Monday morning. And, I'm being very optimistic here, we tell them on a Monday morning what we want to achieve, and they're like, okay, we understand, and they do that thing Monday afternoon already. And I think before, the planning in a factory was way different; you had to plan ahead.
00:28:49: It took months or weeks to get everything set up.
00:28:52: Yeah, so it's not going to be a big bang, like you described.
00:28:57: I think it is much more gradual, and it will also be restricted in the beginning in what kinds of use cases it can do.
00:29:07: But we'll have this gradual transition, from our perspective, between what is being developed in industrial robotics and what is coming out of humanoids.
00:29:20: So we're expanding from the industrial robotics space, you know, with whatever we can do, so we don't have to solve the whole problem in order to create value there.
00:29:29: Of course. I mean, if you're looking at the robot arm, for instance, getting that to do things with voice input...
00:29:36: I think that's going to come pretty fast, because it's obviously a limited space.
00:29:40: But with a humanoid robot, obviously, looking at ourselves, always taking us as the model, we always think about it being able to do everything.
00:29:48: A humanoid robot will probably not be able to do everything, but with voice input, that's going to be the expectation, that this must be possible. The question is, you know, is it?
00:30:02: Are there still huge challenges to get humanoid robots to do things and work like a human, or is that so far away?
00:30:10: So I think there's a fundamental limit to what machine learning can provide, and the limit in the end is the hardware itself.
00:30:18: So the hardware has changed little.
00:30:22: We can mimic human motions, for example.
00:30:25: But the human is such an incredible thing, right?
00:30:30: You have a tactile sensor that covers all of your body.
00:30:34: It gives signals in all sorts of directions.
00:30:39: It self-heals.
00:30:40: Yeah, it has the most incredible gripper that you can think of.
00:30:46: So I think, even if we solve all of the modeling parts, this is still where the limits are. And I think we shouldn't make the mistake of saying that just because something looks like a human, it can also cover everything. In the end, if you've ever seen a humanoid...
00:31:06: You realize pretty quickly it is a machine, and maybe it has a USB port or some very mundane things like that.
00:31:13: And then it loses a little bit of the magic, right?
00:31:16: So we want to make sure that the machines we build are useful, that we know exactly how they behave and how we can create value out of them. And then we can also do things that we have never done before and solve the problem that, you know, the workers retire and we don't find replacements anymore, right?
00:31:40: And so suddenly we can help somebody really solve very valuable problems.
00:31:45: Especially when you have experts working on certain problems, and let's say they're five years from retirement, but you put a helmet cam on them.
00:31:53: They are doing things, even in very different situations, that are very complex, let's say fixing complex machines or something.
00:32:00: You videotape that and you have that information available.
00:32:04: I think there's also value for a person who needs to fix the machine again in ten years, but at some point you can obviously teach robots to do it.
00:32:14: One of the things that is blowing my mind, as always... the way the world is creating information right now. Like YouTube, as mentioned: this huge library about basically any topic, also very specific topics where a video has ten views, but it shows, I don't know, how to fix something really obscure.
00:32:36: That information is only accessible to our human minds after a lot of research.
00:32:41: But through an LLM it's probably accessible within two minutes, and laid out in the version that fits the problem we have right now, because it can adapt and present it at a certain pace, for humans at least.
00:33:03: And so we can help the factory owners keep their factories running, even though the human talent might have retired.
00:33:10: Yes!
00:33:11: Or, if you think about it, my perfect example is always craftsmanship... we don't have enough people, and also things are not economically viable anymore.
00:33:28: In many cases, if it takes three or five times the amount of money to do something by hand, people might resort to, you know, the chain store again. I think if you have a way to teach your robot what you would do by hand, the economics change, and that will make certain products feasible which have been just impossible since mass production was introduced.
00:33:57: No, that makes absolute sense.
00:33:59: I do agree with that, thinking about creating cupboards or something.
00:34:05: Obviously mass production is now very cheap, but building something specific is expensive, so there could be a middle ground for the robot.
00:34:13: Yeah, so maybe you can teach your robots that in the future.
00:34:19: But it's also a slightly different goal, right?
00:34:22: Mass production means you want to repeat something to the micron level.
00:34:29: And now the goal could be different here.
00:34:32: Maybe we're not as precise anymore, but things are a little bit more individual, with maybe slight differences between every piece you create.
00:34:43: That's an exciting avenue!
00:34:47: That difference is very interesting, because this is what we see happening with LLMs right now too.
00:34:53: The same prompt, put in a thousand times, will generate different results, with variance. And if you know traditional computers, they don't have variance.
00:35:03: Code did not have variance. Well, it had bugs, but a bug would be repeatable.
00:35:10: Now what we see is variance in the outputs, which is sort of creative.
00:35:14: But yeah, it's also interesting to see, because that means... you have different setups: total, one hundred percent, micron-level precision, or maybe a mixture of price and precision.
00:35:30: And then you have the creativity of a human. It will all be interesting for the customers to decide what they want.
00:35:38: Yeah, I think it will create a third way of doing things. Before the Industrial Revolution everything was done by hand at very low productivity.
00:35:49: Then you had two hundred years of making everything mass-producible.
00:35:53: If we look around in the city, everything is mass-produced.
00:35:56: Like, if you put somebody from two hundred years ago into a city from today, they would... I mean, they would find it really weird.
00:36:03: Yeah, like, we don't really notice it anymore.
00:36:05: But, you know, every part of a building is somehow a module that comes out of a factory.
00:36:10: It's very weird.
00:36:12: And now I think we can, you know, create new products which are like a third way.
00:36:18: You know, having everything done by humans might be ideal, but it also requires a certain economy, right? Many people that earn very little.
00:36:28: So we don't want to go back to that state.
00:36:31: We can go beyond what we have today.
00:36:33: Absolutely agree. Yeah, what an interesting talk.
00:36:36: So just to wrap up: we talked about how interesting the different levels are.
00:36:41: There are different ways of teaching a robot how to understand its world, and I think that is very interesting.
00:36:48: We didn't talk about any vision sensors or anything yet, but... I think the understanding of where something is in the physical world never occurred when you just used AI with text.
00:37:00: This was a real problem, and it's obviously huge. Very interesting how we can tackle that.
00:37:04: Then understanding the baby as such a massive processor for the first one to three years, also very interesting.
00:37:11: There was also an example we just talked about, like grabbing a glass.
00:37:17: This one obviously is transparent, but if you grab a hard thing... you cannot see your fingers behind it!
00:37:23: That's something that a baby understands within, I think, maybe their first eighteen months?
00:37:28: Earlier, actually, six to eight months.
00:37:31: Yeah, exactly.
00:37:32: But we really have to teach it, because for a baby, for instance, it's really weird that its fingers are gone, and a robot doesn't understand it either.
00:37:39: For the robot, maybe it doesn't matter, right? But we have to teach it.
00:37:43: I think that's also really interesting.
00:37:45: And then the progress, obviously, like with the spaghetti video, gives me a lot of hope that we can at some point really give it tons of data and it will learn from that data, because of how much it has improved in five years.
00:37:57: I would say it's a thousandfold better now. The newest videos, where he and his son are sitting on a terrace eating spaghetti, actually look quite good.
00:38:05: Yeah
00:38:06: Um, yeah. Then obviously, for somebody who is running a factory or production centers...
00:38:13: I mean, this is a great time to be in, right?
00:38:15: Like, maybe as some final words, what is one thing they should understand about where we are now?
00:38:22: Don't expect that learning robots will solve the exact same problems that you would have solved with traditional robotics.
00:38:30: I think that's the wrong approach.
00:38:32: It opens up a completely new dimension of things that you can do and even think about automating.
00:38:40: If you can do things with traditional robotics, we have solutions for that too.
00:38:46: You'd better use those, because then it's exactly the type of thing that should be used.
00:38:50: But it is always a good starting point to just get in touch with RobCo and talk about what the solution could be, and whether it fits or not.
00:38:58: Maybe there are some new future solutions.
00:39:01: This is where I see us with AI and physical AI right now.
00:39:11: Like, our parents used the computer as a typewriter.
00:39:14: We changed from the typewriter to the computer, and they basically used the computer just like a typewriter. It took years for us to evolve into what we use right now, with a mobile phone that has everything on there... with games and creative design and 3D objects.
00:39:31: I mean, it's a typewriter that can do all this. And my feeling is we are using AI like: let me get correct text, which is faster. We're using it as just a little advancement.
00:39:43: But I think the really creative stuff is coming.
00:39:46: And also in physical AI, the really creative stuff.
00:39:48: When you look at some of the parts... that's what you're working on.
00:39:53: That amazes me.
00:39:54: Some concepts are going to change a lot.
00:39:58: Things will happen within one or two years that we haven't thought of yet.
00:40:02: Yeah, absolutely. From our age, we're always thinking about, okay, what gets lost, right?
00:40:09: But as soon as the next AI natives come about, they will think of it in a completely different way.
00:40:16: I love that!
00:40:17: Well, thank you for listening to RobTalk and this episode. And obviously, if you liked the episode, please subscribe and hit the like button.
00:40:25: We'd love that, and send some comments as well.
00:40:27: We always look at the comments and we'll answer them as well.
00:40:30: Otherwise, get in touch with RobCo, get in touch with Clemens, and talk to us about your needs in the robotics and physical AI space.
00:40:37: And obviously, if you want to work with us or work at RobCo, always send your applications our way as well.
00:40:44: I can only say thank you for the interesting discussion.
00:40:46: I love it when we get deeper into the insights and also learn so much about physical AI.
00:40:52: Thank you, Clemens!
00:40:54: See you again.
00:40:56: When is the next session?
00:40:58: Physical AI in RobTalk.
00:41:03: Subscribe on Spotify, Apple Podcasts, and YouTube.
00:41:10: See you on the factory floor!