MrVitaliy 12 hours ago
Has anyone tried using lidar to just measure the distance to the object?
rcxdude 9 hours ago
Apparently they used something similar in production on Avatar: stereo cameras for depth estimation, which allowed realtime depth compositing of CG characters onto the shots as they were being taken. That made it much easier to get everyone on the same page about the scene, especially with characters outside normal human proportions. But it wasn't good enough for the final shots.
wizzledonker 12 hours ago
That would require calibration with the camera, and even then the camera and lidar sensor can't occupy exactly the same position. I doubt the results would be better.
summarity 12 hours ago
Well, sort of. The industry tried to go well beyond that by capturing the entire light field: https://techcrunch.com/2016/04/11/lytro-cinema-is-giving-fil...
dgently7 7 hours ago
Per-pixel depth does not solve for semi-transparency.
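To make the limitation concrete, here is a minimal sketch (not from the thread, and the function name is hypothetical) of the kind of per-pixel depth compositing described above: each pixel gets exactly one depth value, so the composite is a hard nearer-wins test, with no way to express a semi-transparent surface partway between the layers.

```python
import numpy as np

def depth_composite(plate_rgb, plate_depth, cg_rgb, cg_depth):
    """Z-keying sketch: per pixel, whichever layer is nearer wins outright."""
    nearer = cg_depth < plate_depth               # one binary decision per pixel
    return np.where(nearer[..., None], cg_rgb, plate_rgb)

# 2x2 toy frame: live plate at depth 5.0 everywhere; CG element at depth
# 3.0 in the left column (in front) and 9.0 in the right column (behind).
plate   = np.full((2, 2, 3), 0.2)
plate_z = np.full((2, 2), 5.0)
cg      = np.full((2, 2, 3), 0.9)
cg_z    = np.array([[3.0, 9.0],
                    [3.0, 9.0]])

out = depth_composite(plate, plate_z, cg, cg_z)
# Left column shows the CG color, right column the plate. A pixel that is,
# say, 60% glass in front of the background has no single correct depth,
# so this scheme cannot blend it -- the semi-transparency problem.
```

Handling transparency properly needs per-pixel coverage/alpha in addition to depth (or multiple depth samples per pixel), which is why a plain depth map wasn't enough for final shots.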