Beginner’s guide to depth (Updated)


The Beginner’s guide to depth

[…] know what depth cameras are, understand some of the differences between types, or have some idea of what it’s possible to do with a depth camera. This post assumes the opposite – that you know nothing and are brand new to depth. In this post, we will cover a variety of types of […]


How-To: A Simple Way to 3D Scan an Environment

3D Scanning

[…] a tablet makes the process of scanning easier, but a laptop or even a desktop with a long enough USB cable could also be used. Once the camera is connected to the tablet (for this application, you must use a USB 3 cable and port), we open the software and select it within the settings window. From there, we can choose to create a new scan, and then simply select the “scan” icon to begin. As you move around, the pixels from the camera feed will turn white, yellow and green. Green means that area is fully scanned. For a fully watertight scan, you should spend more time moving around the small detail areas and filling in any holes.

When choosing a location, you can choose an outdoor space, but try to avoid extremely bright sunlight, as your resulting model may look inconsistent or you may have issues accurately scanning the space. If you are using the L515 for your scan, you should scan only indoor spaces for best results. As you can see from our scan, there were some issues with a shiny black table – this sort of material is not easy to capture with a depth camera, so whenever possible avoid anything that is very dark and reflective. As you move through the space, making as comprehensive a scan as possible, you can also use the camera on your host device to take a high-res RGB still that will be tied to the location you scanned it from – this can be used as a reference later if you plan on working on your model in a 3D package like Blender or Maya. Once you are happy with the scan, tap the scan button again to finish. From here, you should optimize the model. This may take a few minutes depending on the size of your scan. Once optimized, save the scan as a .ply file.

Step 2: Converting to OBJ. PLY files are point-cloud files and can be used as is – for example, by uploading your PLY to Sketchfab you can share the scan or view it in VR.
However, if you want a bit more flexibility in what you do with the file – PLY files are quite large – you may wish to convert it to a mesh. For this part of the process we are going to use Meshlab to convert the PLY to an OBJ. Open Meshlab, go to File > Import Mesh, and import the PLY file you exported in the last step.

[Figure: Meshlab showing the imported mesh.]

From here, we’re going to run a few operations to clean up the point cloud and turn it into a mesh. Depending on what your file looks like, you may want to spend some time playing around with the settings and deleting any extraneous vertices you didn’t want. Use the “Select Vertices” button at the top of the toolbar, and then select groups of vertices to delete using the “Delete the current set of selected vertices” tool.

[Figure: Cleaned-up mesh with the two tools highlighted.]

Next, go to Filters > Sampling > Poisson-disk Sampling. In the settings, make sure you have “Base Mesh Subsampling” selected, and change the number of samples to tens of thousands – I chose 35,000. The higher the number here, the more refined your final mesh will be. Try not to go too high, though: the number of triangles will affect how your final mesh performs in other programs and applications. In the layer menu to the right, you should now see your original point cloud and the Poisson sample. Delete the original mesh – we don’t need it anymore.

For the next step, go to Filters > Point Set > Compute normals for point sets. Change the neighbor number to 16 and run. This attempts to automatically determine which way each face will point. Now choose Filters > Remeshing, Simplification and Reconstruction > Surface Reconstruction: Ball Pivoting. Click the up arrow on the world unit box by the “Pivoting Ball Radius” once – it should autofill an appropriate value. Apply, and we should now have a mesh instead of a point cloud. If you don’t like the resulting mesh, you can go back and repeat these steps with slightly different parameters.
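The Meshlab steps above can also be scripted, which is handy if you have many scans to convert. Here is a minimal sketch using the pymeshlab Python package; the filter and parameter names below follow recent pymeshlab releases and may differ in your install, so treat them as assumptions and check them against `ms.filter_list()` or the pymeshlab documentation:

```python
# Sketch: automate the point-cloud-to-mesh steps with pymeshlab.
# NOTE: filter/parameter names vary between pymeshlab versions;
# verify against ms.filter_list() before relying on this.
import pymeshlab

ms = pymeshlab.MeshSet()
ms.load_new_mesh("scan.ply")  # the point cloud exported from the scanning app

# Filters > Sampling > Poisson-disk Sampling (~35,000 samples)
ms.apply_filter("generate_sampling_poisson_disk", samplenum=35000)

# Filters > Point Set > Compute normals for point sets (16 neighbors)
ms.apply_filter("compute_normal_for_point_clouds", k=16)

# Filters > Remeshing... > Surface Reconstruction: Ball Pivoting
# (with no radius given, pymeshlab picks a default, much like the
# autofilled value in the GUI)
ms.apply_filter("generate_surface_reconstruction_ball_pivoting")

ms.save_current_mesh("scan.obj")
```

As in the GUI workflow, if the resulting mesh has holes or artifacts, re-run with a different sample count or ball radius.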
To bring the color information along with us when we export from Meshlab, do the following. Run Filters > Texture > Parametrization: Trivial Per-Triangle. If you get an error, change the Inter-Triangle border value to 1. Next, run Filters > Texture > Transfer Vertex Color to Texture. You will be asked to save the project at this stage; do so, and use the suggested name to save the texture file […]
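The texture steps can be scripted in the same way. This is a sketch only: the exact pymeshlab names for the Parametrization and Transfer filters have changed across releases, so the names and parameters below are assumptions to be checked against `ms.filter_list()` for your version:

```python
# Sketch: bake per-vertex color into a texture before OBJ export,
# mirroring the Parametrization + Transfer steps above.
# NOTE: filter/parameter names below are assumptions; they vary by
# pymeshlab version, so confirm them with ms.filter_list().
import pymeshlab

ms = pymeshlab.MeshSet()
ms.load_new_mesh("scan_mesh.ply")  # the reconstructed mesh from the previous step

# Filters > Texture > Parametrization: Trivial Per-Triangle
# (raise the inter-triangle border if the filter errors out)
ms.apply_filter("compute_texcoord_parametrization_triangle_trivial_per_wedge",
                border=1)

# Filters > Texture > Transfer Vertex Color to Texture
ms.apply_filter("transfer_attributes_to_texture_per_vertex",
                textname="scan_color.png")

ms.save_current_mesh("textured_scan.obj")
```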


Saving lives with Everdrone and Intel® RealSense™ Technology

Everdrone with Intel RealSense depth cameras

Hundreds of thousands of people experience an out-of-hospital cardiac arrest every year. In Europe alone, that number is around 275,000. In the USA, more than 350,000 cardiac arrests occur outside of the hospital each year. While emergency responders do everything they can to respond to these life-threatening situations as rapidly as possible, survival […]


Turning homes into digital Smart Health Homes

[…] shifting, with a large cohort of retirees who want to stay in their own homes as long as possible. Meeting the increased demand for in-home health care is a challenge that Electronic Caregiver® aims to solve. Their solution, Addison Care™, is the technology that powers the company’s 3D, AI and voice-driven connected Virtual Caregiver™. […]


Which Intel RealSense device is right for you? (Updated June 2020)

Intel RealSense Stereo Depth and Tracking Devices

[…] depth and tracking hardware, or an experienced professional, it can still be challenging to determine which of the many Intel RealSense Products we have available are right for your project. In this post, we’re going to discuss the Intel RealSense depth cameras D415, D435 and D435i, as well as the Intel RealSense tracking camera […]
