

Not only is this inaccurate, it doesn’t make sense anyway when you’re talking about a bipedal manufacturing robot.
Like motion capture, all you need to capture from remote operation of the unit is the operator’s input articulation, which is then translated into acceptable movements on the unit using input from its local sensors. The majority of these things (if using pre-captured operating data) are just trained on iterative scenarios and retrained for major environmental changes. They don’t run tele-operation live because it’s inherently dangerous and takes a lot of the local sensor inputs offline, for obvious reasons.
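For the “translated into acceptable movements” part, the unit-side logic is conceptually just a mapping plus local safety checks. Here’s a minimal Python sketch of that idea; the joint names, limits, and the `proximity_clear` flag are all hypothetical, not any vendor’s actual API:

```python
# Sketch: translating captured operator articulation into safe joint targets.
# All names (JOINT_LIMITS, OperatorFrame, proximity_clear) are hypothetical.

from dataclasses import dataclass

# Hypothetical per-joint angle limits (radians) enforced on the unit side.
JOINT_LIMITS = {
    "shoulder_pitch": (-1.5, 1.5),
    "elbow_pitch": (0.0, 2.4),
    "wrist_roll": (-3.1, 3.1),
}

@dataclass
class OperatorFrame:
    """One captured frame of operator articulation (joint name -> angle)."""
    joints: dict

def translate_frame(frame: OperatorFrame, proximity_clear: bool) -> dict:
    """Clamp operator input to the unit's joint limits; refuse the command
    entirely if a local sensor reports an obstruction."""
    if not proximity_clear:
        # Local sensor veto: the remote input is ignored this cycle.
        return {}
    targets = {}
    for name, angle in frame.joints.items():
        lo, hi = JOINT_LIMITS.get(name, (angle, angle))
        targets[name] = max(lo, min(hi, angle))
    return targets

# Example: the operator over-rotates the elbow; the unit clamps it to 2.4 rad.
frame = OperatorFrame(joints={"shoulder_pitch": 0.2, "elbow_pitch": 3.0})
print(translate_frame(frame, proximity_clear=True))
```

In the pre-captured case above, those same operator frames would just be logged and replayed as training episodes rather than executed live.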
OC is saying what robotics engineers have been saying about these bipedal “PR bots” for years: the power and effort it takes simply to make these things walk make them incredibly inefficient, and they make no sense in a manufacturing setting where they will just be doing the same repetitive tasks over and over.
Wheels move faster than legs, single-purpose mechanisms will be faster and less error-prone, and the actuation takes less time to train.

I mean…there have been plenty of people making PoCs showing Graphene isn’t really THAT secure; it’s probably just more obscure, up to a point. They’re pissed the cops have to work at it, but even somebody using Samsung or Google tools to properly sandbox certain data has much the same capability AFAIK.