That Unitree G1 [1] in the video starts at $16k, which surprised me. I’m guessing end effectors drive that price way up but it feels like we are maybe a decade away from useful household humanoid robots for the price of a cheap car, which would put it well within the means of many in the developed world.
Does anyone have any insight into how far away the control and programming realistically are from that reality? A bunch of very impressive robotic control model research has been posted on HN in the last two years, but as an outsider it's hard to evaluate just how close we are to handing off chores like folding clothes to a general-purpose humanoid robot.
[1] https://www.unitree.com/mobile/g1
I'm pretty sure that $16k is aspirational at this point. See "Contact us for the real price" here: https://shop.unitree.com/products/unitree-g1
I've heard that the real price is more like $35k. And you probably want options that increase the price more. For example on one of their dog robots I believe the base price doesn't include detailed control over the limbs, so you can only use their included software to walk it around.
Still though, even the higher price is pretty good. Hard to see how China doesn't dominate humanoids given their manufacturing advantage and the sheer number of humanoid companies there now. Dozens!
Someone I know got it recently. If you're getting it with the API and all the sensors, it was close to $60k including customs and all of that.
I would buy that today if the battery life were not abysmal. It is, though, and that, at least for me, is a far larger issue than anything else. We cannot make laptops or phones work an entire day; robots/drones are measured in minutes.
I think for most indoor household work they only really need enough to move from station to station. Once they are at the bed to fold clothes or in the kitchen washing dishes, they should be able to plug themselves in.
That’s way too limited for $60k though. I’d pay $20k even if it had to plug itself in for most jobs, assuming it was decent at cable management and could manage its own extension cord in other situations.
I would pay double that if it were capable of household/office/workshop reorganization and managing storage with a live inventory of where it put all the crap it cleans up. It’s not quite at the point where it would be a positive ROI financially (assuming $80k TCO over 5-10 years) but it’d make life a lot more convenient and lower the activation energy for a ton of hobbies.
What would you use it for that needs long battery life?
Shopping (which takes hours where I live), picking olives (it will take forever if you have to hop into the charger every 15 minutes), carrying my backpack on walks etc.
Are there not bigger constraints than battery life on a robot being able to do your shopping for you? It can't drive, public transit probably won't let a robot ride, and it can walk but it'll take forever. If it gets hit by a vehicle while walking, is the vehicle driver even committing a crime? Do you get reimbursed? Is the robot insurable? Do you owe the driver for damaging their car? If it gets to the store at all, will it be let in? If the store allows robots to enter, how is it supposed to pay? Robots don't own bank accounts, so it has to prove it has the legal ability to use your account. Does this thing have Apple Pay and Google Pay integration built in? You surely can't just give it cash or your physical credit card and hope that no one notices and steals it. Would a cashier even let a robot use a card that is supposed to be tied to a human identity? How do you authorize that? How does the robot prove it belongs to you?
For that matter, is there anything stopping a person from taking it and factory resetting? I tried to search for both "anti theft" and "theft" on that seller website and nothing came up. Maybe search is just broken for me? These are prominent features of mobile computing devices, cars, generally anything reasonably expensive you might intentionally or unintentionally leave in public unattended.
Apparently the site has an AI assistant, so I asked it. It said:
> Based on the provided context information from Unitree Technology, there is no specific mention of anti-theft features for the robots produced by Unitree Technology. The focus of the information is primarily on the various modes, functionalities, and control mechanisms of the robots, such as Zero Torque Mode, Damping Mode, Seating Mode, Ready Mode, Motion Mode, Standing Mode, Dance Mode, and Debug Mode. Additionally, the information covers services provided by the robots, including basic services, AI Sport Services, Normal Sport Services, Image Services, Network Services, and SLAM Services.
> Therefore, based on the available information, it can be concluded that there is no explicit reference to anti-theft features in the description of the robots produced by Unitree Technology. For detailed information on any anti-theft features or security measures, it is recommended to consult Unitree’s official resources or customer support for more specific details.
In typical AI fashion, that's an extremely verbose way of saying "no, there are no anti-theft features."
For shopping, surely the best robotic options are the various things the huge warehouses use, rather than a humanoid going around a store meant for humans?
Both the things that look like Roombas and all these: https://youtu.be/ssZ_8cqfBlE?si=9mCtiKKkk_N9Uk7z
No single consumer would buy all that, but the retailers can.
The most efficient overall method is robotizing the store and deliveries, yes, but that is something a consumer is powerless to do. If a consumer can buy a generalized household robot, though, it becomes cheaper the more tasks it can perform, amortizing the cost over the total work done. A generalized household robot will become practical when it becomes cheaper than hiring staff or carers. There is most certainly a market, among the elderly and others requiring assistance, for a robot capable of going around stores meant for humans.
The Unitree links claim a 9000mAh battery will last it 2 hours; I don't know the voltage, but even at 5.5 V cells, that's only about 25 W average power consumption.
I find this difficult to believe, both because that's a very low power draw and because that would be penny-pinching on battery capacity.
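A minimal sketch of that back-of-envelope estimate, treating the pack as a single 9000 mAh cell at the 5.5 V the comment assumes (the replies below correct the voltage assumption):

```python
# Back-of-envelope check of the comment above: a single 9000 mAh "cell"
# at an assumed 5.5 V, drained over the claimed 2 hours.
# (The replies below point out the pack is actually 13 cells in series.)
capacity_ah = 9.0        # 9000 mAh
assumed_voltage_v = 5.5  # the comment's generous single-cell guess
runtime_h = 2.0          # claimed runtime

energy_wh = capacity_ah * assumed_voltage_v  # ~49.5 Wh
avg_power_w = energy_wh / runtime_h          # ~25 W average draw

print(f"{energy_wh:.1f} Wh, {avg_power_w:.1f} W average")
```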
My expectation on timescale is that genuinely general-purpose robots will need at least as much compute as a self-driving car (possibly more, that's a minimum), and have at most 1/10th the available power to do that (because they're physically smaller).
Between algorithmic improvements and Koomey's law, I think there will be a gap of at least 5 years between any given category of customer being able to afford no-steering-wheel-needed self-driving cars and the equivalent for androids.
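A minimal sketch of where an "at least 5 years" figure could come from, assuming the ~10x efficiency gap is closed by Koomey's law alone (algorithmic improvements set aside) and assuming the doubling periods below:

```python
import math

# If an android needs car-level compute on ~1/10th of the power budget,
# the gap has to be closed by efficiency gains. Assumed doubling periods:
# a historical ~1.6 years and the slower recent trend of ~2.6 years.
efficiency_gap = 10  # ~10x more compute per watt needed

for doubling_years in (1.6, 2.6):
    years = math.log2(efficiency_gap) * doubling_years
    print(f"{doubling_years}-year doubling: ~{years:.1f} years to close a {efficiency_gap}x gap")
# ~5.3 years with the optimistic doubling, ~8.6 with the slower one.
```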
Given that Waymo was running geo-fenced cars with no safety drivers in the vehicles in 2017 (but still had an employee in the back with access to an emergency stop button)*, this gap is compatible with the recent press releases and YouTube videos going around about robotics — but as nobody has yet started actually shipping this level of self-driving car directly to end users**, my only guess for personal general-purpose domestic androids is: some time after 2030 as a minimum, but probably later than that.
* https://phys.org/news/2017-11-waymo-autonomous-vans-human-dr...
** No, Tesla's current one doesn't count; it will only count when it actually ships without a steering wheel or when people are actually allowed to use it while sleeping.
You misunderstand battery specs. It's a 13-string (more commonly, "13s") battery: a string is a number of cells wired in series. By convention that's a 48-volt battery, which matches the 54-volt charger. In total that's a 421 watt-hour battery, for an average power of ~210 watts over 2 hours.
When you connect 2 9000 mAh cells in series, the resulting battery has 2x the voltage but the same mAh capacity. In parallel, the battery has the same voltage but 2x the mAh.
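A minimal sketch of that corrected pack math, assuming a typical ~3.6 V nominal Li-ion cell voltage (the cell chemistry isn't stated in the spec):

```python
# Corrected pack math: 13 cells in series ("13s"). Series stacks voltage;
# the amp-hour rating stays that of one cell.
cells_in_series = 13
nominal_cell_v = 3.6  # assumed nominal Li-ion cell voltage
capacity_ah = 9.0     # 9000 mAh
runtime_h = 2.0

pack_voltage_v = cells_in_series * nominal_cell_v  # ~46.8 V, i.e. a "48 V" pack
energy_wh = pack_voltage_v * capacity_ah           # ~421 Wh
avg_power_w = energy_wh / runtime_h                # ~210 W

print(f"{pack_voltage_v:.1f} V, {energy_wh:.0f} Wh, {avg_power_w:.0f} W average")
```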
> My expectation on timescale is that genuinely general-purpose robots will need at least as much compute as a self-driving car (possibly more, that's a minimum), and have at most 1/10th the available power to do that (because they're physically smaller).
This seems logically flawed:
1. A car travels hundreds of miles from home. Why would a robot ever walk more than a short distance from a vehicle or home base? At short distances, if you truly need that much power, it probably makes more sense to stream data/video over a direct low-latency connection. Long-distance networking has latency comparable to a camera frame time, but even a normal Wi-Fi router can keep latency lower than human nerve delay (rough numbers sketched below).
2. Humanoid robots will never be running at 85 mph past a bunch of people. They probably don't need to have the same compute throughput as a car, and definitely don't need to have the same reaction time.
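Rough, assumed ballpark figures for the latency comparison in point 1; none of these come from a spec sheet or the article:

```python
# Order-of-magnitude comparison only; all values below are assumptions.
camera_frame_ms = 1000 / 30        # one frame at 30 fps, ~33 ms
wan_rtt_ms = 40                    # assumed cross-country internet round trip
local_wifi_rtt_ms = 3              # typical home Wi-Fi round trip
nerve_delay_ms = 1.0 / 80 * 1000   # ~1 m of myelinated nerve at ~80 m/s, ~12 ms

print(f"camera frame:        {camera_frame_ms:.0f} ms")
print(f"long-distance RTT:   ~{wan_rtt_ms} ms")
print(f"local Wi-Fi RTT:     ~{local_wifi_rtt_ms} ms")
print(f"nerve over ~1 m:     ~{nerve_delay_ms:.0f} ms")
```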
Correcting the correction:
> When you connect 2 9000 mAh cells in series, the resulting battery has 2x the voltage but the same mAh capacity. In parallel, the battery has the same voltage but 2x the mAh.
The relevant units are:
* Capacity (Q, in mAh, Ah, kWh, etc)
* Power (P, in Watts)
* Voltage (U, in Volts)
* Current (I, in Amperes)
* Duration (t, mostly measured in hours)
And the relevant formulas are:
* P = U x I, or Power equals Voltage (difference) times Current
* Q = P x t, or Capacity equals Power times duration
From this we can establish that connecting batteries in series or in parallel will not change their Capacity. When having 13 batteries of 29000mAh, or 29Ah, you have 13 x 29 = 377Ah or 377000mAh.
Connecting batteries in series or parallel does make a difference in voltage and current: a string in series will increase the voltage while keeping the current the same (theoretically, in practice you get less than the current of the weakest cell); a parallel setup will increase the maximum current while keeping the voltage the same (again, in theory).
If you have two 1000 mAh cells connected in series, they will provide 1 amp for 1 hour. The battery is still 1000 mAh even though it is made of two cells with 2000 mAh total.
You are equating amp-hours with watt-hours, which is not reasonable. Q is charge, and is only proportional to a number of electrons. E is the energy, or a number of electrons at a voltage.
> When having 13 batteries of 29000mAh, or 29Ah, you have 13 x 29 = 377Ah or 377000mAh
By this logic, with those cells in series the whole battery would be at 48 volts and 377 amp-hours, giving it a storage of 18.1 kWh. That's despite being made of 13 cells with only 107 watt-hours each.
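A minimal sketch of the correct combination rules, assuming identical cells at a nominal 3.7 V and 9 Ah (illustrative values, not from the spec):

```python
# Series adds voltage and keeps amp-hours; parallel adds amp-hours and keeps
# voltage. Energy (Wh) scales with cell count either way.
def combine(cell_v, cell_ah, n, mode):
    if mode == "series":
        return cell_v * n, cell_ah       # (pack volts, pack amp-hours)
    if mode == "parallel":
        return cell_v, cell_ah * n
    raise ValueError(mode)

cell_v, cell_ah = 3.7, 9.0               # assumed Li-ion cell, 9 Ah

for mode in ("series", "parallel"):
    v, ah = combine(cell_v, cell_ah, 13, mode)
    print(f"13 cells in {mode}: {v:.1f} V, {ah:.0f} Ah, {v * ah:.0f} Wh")
# 13s: 48.1 V,   9 Ah, ~433 Wh
# 13p:  3.7 V, 117 Ah, ~433 Wh  (not 48.1 V x 117 Ah = 5.6 kWh)
```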
Thanks for the clarification, that makes a lot more sense — 210 W / 421 Wh suddenly makes it seem totally reasonable on both counts.
The only thing I don't like about car comparisons here is that, unlike cars, a household robot could have a plugged-in, offloaded computer. The robot could be stripped down to sensors and motors, plus whatever is necessary for failsafe operation, such as remaining balanced. Most other things a household robot will do could tolerate latency that limits them to a few hundred updates per second, and you could generally engineer around that latency... So the only power you need on the robot is enough to run the motors and sensors, and the limited onboard compute.
Edit: to clarify, my expectation is that the compute is on-location, so the latency is on the scale of <1 ms rather than the 10-100 ms of cloud offloading.
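Rough arithmetic behind "a few hundred updates per second" versus the link latencies assumed in that edit:

```python
# How much of one control period each (assumed) link latency consumes.
control_rate_hz = 300                 # "a few hundred updates per second"
period_ms = 1000 / control_rate_hz    # ~3.3 ms per update

for label, latency_ms in [("on-location compute", 1),
                          ("cloud offload, good case", 10),
                          ("cloud offload, bad case", 100)]:
    print(f"{label}: {latency_ms} ms = {latency_ms / period_ms:.1f}x of a {period_ms:.1f} ms period")
# On-location fits inside the loop; cloud round trips are 3x-30x too slow.
```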
Nope, the price is not "up".
For reference, the Spot robot from Boston Dynamics starts at $74k USD.
So you could literally have a team of humanoids for the price of a spot.
While impressive at first, this still needs depth, i.e. the movements resemble the sample, but there is a giant lack of detail (information) in those movements.
They are more of a "compressed version of a common denominator for these movements". I.e. while the sample walking seems more joyful and proud, akin to strutting, the robot's seems closer to stumbling.
Looking at the fighting movements: the nuances not picked up by the simulation and the robot are highly important, and they are what makes a punch a punch instead of a weird shove, and what makes a good stance versus a bad stance. Just as the walking nuances swing it from "happy" to "drunk" to "threatening", so do the nuances of the other movements.
While I understand the issue of compression from "real movement -> digitally constrained simulation -> physically constrained robot", I just want to bring it up, as attention to those details will probably be important to general training. While at this stage it is not that big of a deal, in any kind of real environment those details will define the human-robot interaction and the robot-environment interaction.
Otherwise great job!
Compared with other robots, it looks very impressive. Compared with living things, it looks ... well, could be better. Compared with human waltz dancers, I'd say it was a bad idea to use this as a reference.
> Compared with living things, it looks ... well, could be better.
Sometimes it's fun to be cyberpunk-contrarian, diving into all the ways our standard equipment--even "just" an arm or leg--is actually incomprehensibly complex nanotechnology that we can't even begin to match with artificial means, satisfying dozens of difficult requirements like "float in water instead of dying" or "self-lubricating with limited self-repair" or "destroys invading nanomachines."
That's impressive! Between the advancement in robotics, chat/voice AI, and image generation it's going to become so hard to distinguish fiction from reality in a few years. You'll just have to see things in person to trust they are real.
Most of the folks on modern AI/robotics papers have Asian names.
It seems the Asian tiger countries may lead the next century. They are very hungry to win the solar, battery, EV, and robotics race.
Impressive results.
I wonder when we'll get the equivalent of fat padding and skin nerves on humanoid robots. Would that help make them less "waddly"?
I don't understand which things are real and which are simulations.
https://youtu.be/sE4cEfhVOdE?si=FYXrtt-lvr0EEqtV&t=48
That part of the video has been heavily edited somehow, at the very least.
https://youtu.be/sE4cEfhVOdE?si=OsHygaSqG_mgjygp&t=93
Also looks fake to me.
The first clip just seems like they blurred the background for that shot. Maybe because there were people walking in the background, they felt they needed to obscure them.