It would perhaps be more accurate to say that this is a big leap forward compared to most existing off-the-shelf depth cameras for robotics. To address the iPhone specifically: you probably aren't going to mount iPhones on a bunch of production robots in the field.
Compared to the other alternatives in the robotics space (I've listed RealSense and Structure above, but there are others), there's something of a laundry list of potential pitfalls and issues that we've seen folks trip over again and again.
Calibration is a big one, and a large part of what we're doing with HiFi is launching it with its own automatic self-calibration process (no fiducials). There are some device failures that a process like this wouldn't be able to handle, but the vast majority of calibration problems in the field come from difficult tooling or requirements, a need to supply one's own calibration software, or a combination of hardware and software that makes the process painful. If I had a nickel for every time someone had to train a part-time operator to fix calibration in the field, I'd own Amazon.
Depth quality and precision are another big pitfall. There are folks out there today using RealSense on their robots, but many of them have told us they just don't rely on the on-board depth: it's too noisy, it warps flat surfaces, etc. Lots of little details you might not think about when just looking at a spec sheet! Putting our edge AI capabilities aside, the improved optics and compute available on the HiFi allow us to build a sensor that consistently provides good depth. That sounds like a baseline for this kind of tech, but there are plenty of counterexamples on the market today!
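If you want to put a rough number on the "warps flat surfaces" complaint, one quick sanity check is to point the camera at a flat wall, fit a plane to the returned points, and look at the residual. Below is a minimal, self-contained Rust sketch of that check; the synthetic "bowed wall" data and the units are assumptions standing in for a real depth frame, and it isn't tied to any particular SDK.

```rust
/// Fit a plane z = a*x + b*y + c to 3D points via least squares and
/// return the RMS out-of-plane residual -- a rough measure of how much
/// a depth sensor warps a surface that should be flat.
fn plane_fit_rms(points: &[(f64, f64, f64)]) -> Option<f64> {
    if points.len() < 3 {
        return None;
    }
    let n = points.len() as f64;
    // Accumulate sums for the 3x3 normal equations.
    let (mut sxx, mut sxy, mut syy, mut sx, mut sy) = (0.0, 0.0, 0.0, 0.0, 0.0);
    let (mut sxz, mut syz, mut sz) = (0.0, 0.0, 0.0);
    for &(x, y, z) in points {
        sxx += x * x;
        sxy += x * y;
        syy += y * y;
        sx += x;
        sy += y;
        sxz += x * z;
        syz += y * z;
        sz += z;
    }
    // Solve [[sxx sxy sx],[sxy syy sy],[sx sy n]] * [a b c]^T = [sxz syz sz]^T
    // with Cramer's rule.
    let det3 = |m: [[f64; 3]; 3]| {
        m[0][0] * (m[1][1] * m[2][2] - m[1][2] * m[2][1])
            - m[0][1] * (m[1][0] * m[2][2] - m[1][2] * m[2][0])
            + m[0][2] * (m[1][0] * m[2][1] - m[1][1] * m[2][0])
    };
    let d = det3([[sxx, sxy, sx], [sxy, syy, sy], [sx, sy, n]]);
    if d.abs() < 1e-12 {
        return None; // degenerate geometry, e.g. collinear points
    }
    let a = det3([[sxz, sxy, sx], [syz, syy, sy], [sz, sy, n]]) / d;
    let b = det3([[sxx, sxz, sx], [sxy, syz, sy], [sx, sz, n]]) / d;
    let c = det3([[sxx, sxy, sxz], [sxy, syy, syz], [sx, sy, sz]]) / d;
    // RMS residual in the same units as z (metres here).
    let mse = points
        .iter()
        .map(|&(x, y, z)| {
            let r = z - (a * x + b * y + c);
            r * r
        })
        .sum::<f64>()
        / n;
    Some(mse.sqrt())
}

fn main() {
    // Toy data: a wall 1 m away with a small bowl-shaped warp added,
    // standing in for points back-projected from a real depth frame.
    let mut points = Vec::new();
    for i in -20..=20 {
        for j in -20..=20 {
            let (x, y) = (i as f64 * 0.01, j as f64 * 0.01);
            let warp = 0.002 * (x * x + y * y); // mm-scale bowing
            points.push((x, y, 1.0 + warp));
        }
    }
    println!("RMS plane error: {:.4} m", plane_fit_rms(&points).unwrap());
}
```

On a genuinely flat return this residual should sit near the sensor's noise floor; a bowed or tilted reconstruction shows up immediately as a larger number.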
Software is the last big thing we really want to leap forward on. We don't have too much to say about our SDK today, but when we launch it we hope to make working with these sensors a lot easier. I work with RealSense quite a bit (I am the maintainer of realsense-rust), and quite honestly, what has been a solid hardware package for many years (until HiFi, I hope) is let down by how confusing librealsense2 is to use in any meaningful project.
Needless to say, I think HiFi stands on some solid merits of its own, and I'm not sure it can be directly compared to other 3D sensors (e.g. the one in an iPhone), mostly because the expected use case is so utterly different.
Appreciate the detailed response! It definitely seems like we've come a long way from when I first heard about people using Kinect cameras, and I look forward to whatever advancements you contribute!