Intel® RealSense™ Tracking Camera T265

Tracking Redefined
Introducing a new class of stand‑alone simultaneous localization and mapping device: the Intel® RealSense™ Tracking Camera T265, for use in robotics, drones and more.

With its small form factor and low power consumption, the Intel® RealSense™ Tracking Camera T265 has been designed to give you the tracking performance you want straight off the shelf: cross‑platform, developer‑friendly simultaneous localization and mapping for all your robotics, drone and augmented reality rapid prototyping needs.

Low power. Featuring highly optimized proprietary V‑SLAM algorithms running directly on the device, it operates at just 1.5 W.
Precision tracking. Extensively tested and validated for performance, providing under 1%¹ closed‑loop drift under intended use conditions.
Small and light. At 108 x 25 x 13 mm in size and weighing only 55 g, this device won't weigh your prototype down.

Tracking solution
What is SLAM?
SLAM, or Simultaneous Localization and Mapping, is a computational problem: how does a device construct or update a map of an unknown environment while simultaneously keeping track of its own location within that environment? Before the days of GPS, sailors would navigate by the stars, using their movements and positions to find their way across oceans. V‑SLAM uses a combination of cameras and Inertial Measurement Units (IMUs) to navigate in a similar way, using visual features in the environment to track its way around even unknown spaces with accuracy.

Inside the camera
The Intel® RealSense™ Tracking Camera T265 includes two fisheye lens sensors, an IMU and an Intel® Movidius™ Myriad™ 2 VPU. All of the V‑SLAM algorithms run directly on the VPU, allowing for very low latency and extremely efficient power consumption. The T265 has been extensively tested and validated for performance, providing under 1% closed‑loop drift under intended use conditions. It also offers sub‑6 ms latency between movement and the reflection of that movement in the pose. This is fast enough for even highly sensitive applications such as augmented and virtual reality.

Example use case
Robotics navigation, occupancy mapping and collision avoidance demo (video).

Simple and versatile prototyping
For developers working on robotics, drone or augmented reality systems, SLAM can be challenging to implement, requiring significant time and resources to add valuable environmental understanding. With the T265, developers now have precise and robust tracking that has been extensively tested in a variety of conditions and environments. This self‑contained tracking system is designed for simple integration: there is no need to re‑design your board; simply plug in the provided USB cable and start streaming pose data straight away, as in the sketch below. The T265 also features an easy mounting solution, with standardized mounting sockets on the rear of the camera.
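As a rough illustration of how little host code is involved, the sketch below uses the open‑source Intel RealSense SDK 2.0 Python wrapper (pyrealsense2) to open the T265 pose stream and print position and velocity. It follows the pattern of the SDK's published pose example; the loop length and printed fields are illustrative choices, and details may vary with SDK version.

```python
# Minimal sketch (assumes pyrealsense2 is installed and a T265 is connected):
# open the 6DoF pose stream and print position and velocity for a short run.
import pyrealsense2 as rs

pipe = rs.pipeline()
cfg = rs.config()
cfg.enable_stream(rs.stream.pose)  # pose stream reported by the T265

pipe.start(cfg)
try:
    for _ in range(100):
        frames = pipe.wait_for_frames()
        pose = frames.get_pose_frame()
        if pose:
            data = pose.get_pose_data()
            print("Frame #{}: position {}, velocity {}".format(
                pose.frame_number, data.translation, data.velocity))
finally:
    pipe.stop()
```

The same pose frames also carry rotation, acceleration and tracker-confidence fields, so the loop body can be extended without changing the setup.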
Part of the family
While there are many use cases for a stand‑alone T265, it is very much a part of the Intel® RealSense™ Technology family and has been designed to work flawlessly alongside our other devices. The T265 features an infrared cut filter over the lenses, allowing it to ignore the projected patterns from our D400 series depth cameras. This means that developers can easily use both devices together for advanced applications such as occupancy mapping, or collision avoidance and navigation in locations where GPS data isn't available.

Code samples
Get your project off the ground quickly with help from our code examples and tutorials.

Case study
Robust Visual-Inertial Tracking from a Camera that Knows Where it's Going

Tech Specs
- V‑SLAM, part of Intel® RealSense™ Technology: high‑precision visual‑inertial odometry and simultaneous localization and mapping algorithms.
- Intel® Movidius™ Myriad™ 2.0 VPU: visual processing unit optimized to run V‑SLAM at low power.
- Two fisheye lenses with a combined 163±5° FOV: the camera includes two OV9282 imagers with fisheye lenses for a combined, close to hemispherical 163±5° field of view, for robust tracking even with fast motion.
- BMI055 IMU: the inertial measurement unit allows accurate measurement of the device's rotation and acceleration, which feeds into the V‑SLAM algorithms.
- USB 3.1 Gen 1 Micro‑B: USB 2.0 and USB 3.1 are supported, for either pure pose data or a combination of pose and images.
- 108 x 24.5 x 12.5 mm: small form factor designed to mount on any device with ease.
- 2 x M3 0.5 mm pitch mounting sockets: securely attach the camera to your device with these standard mounting points on the rear of the camera.

Frequently Asked Questions

Is this a depth camera?
The Intel® RealSense™ T265 is not a depth camera. It features two fisheye lenses for feature detection, but does not compute dense depth. It is possible to use the image feed from the cameras to compute dense depth, though the results will be poor compared to other Intel® RealSense™ depth cameras, as the lenses are optimized for a wide tracking field of view rather than depth precision, and there is no texture projected onto the environment to aid with depth fill.

Can I run SLAM on Intel RealSense depth cameras? Why would I need the T265?
All SLAM solutions, and there are many good ones, are limited by the information they receive. It is possible to run host‑based SLAM using our D400 series depth cameras, ideally the D435i; however, these cameras are optimized for depth accuracy at the expense of field of view, so a D400 cannot see as much of the world as the T265. As such, a SLAM solution based on the information from a single D400 camera will get lost in certain situations where the T265 will not. Furthermore, the T265 is optimized for power and latency using its embedded VPU. For these reasons, the T265 will succeed in many use cases where D400‑based SLAM will fail. Additionally, when using the T265, no additional compute resources are required on the platform to perform the SLAM algorithms, which means that tracking with the T265 is platform independent, has a low integration cost, and can run on very low compute devices. For some use cases, SLAM on the D435i will be ideal, but for the highest quality tracking, choose the T265. For both quality depth and tracking, use the D415 and the T265 in parallel, as in the sketch below.
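One possible arrangement, sketched below under the assumption that both cameras are attached to the same host, is to give each device its own SDK 2.0 pipeline selected by serial number. The serial numbers and stream settings here are placeholders; substitute the values printed for your own devices.

```python
# Hypothetical sketch: run a D415 depth stream and a T265 pose stream in
# parallel, one pipeline per device, selected by serial number.
import pyrealsense2 as rs

ctx = rs.context()
for dev in ctx.query_devices():
    print(dev.get_info(rs.camera_info.name),
          dev.get_info(rs.camera_info.serial_number))

DEPTH_SERIAL = "000000000000"  # placeholder: D415 serial from the listing above
POSE_SERIAL = "111111111111"   # placeholder: T265 serial from the listing above

depth_cfg = rs.config()
depth_cfg.enable_device(DEPTH_SERIAL)
depth_cfg.enable_stream(rs.stream.depth, 640, 480, rs.format.z16, 30)

pose_cfg = rs.config()
pose_cfg.enable_device(POSE_SERIAL)
pose_cfg.enable_stream(rs.stream.pose)

depth_pipe = rs.pipeline(ctx)
pose_pipe = rs.pipeline(ctx)
depth_pipe.start(depth_cfg)
pose_pipe.start(pose_cfg)

try:
    for _ in range(100):
        depth = depth_pipe.wait_for_frames().get_depth_frame()
        pose = pose_pipe.wait_for_frames().get_pose_frame()
        if depth and pose:
            # e.g. pair a depth measurement with the camera's tracked position
            print("center distance {:.2f} m at position {}".format(
                depth.get_distance(320, 240), pose.get_pose_data().translation))
finally:
    pose_pipe.stop()
    depth_pipe.stop()
```

Because the T265's infrared cut filter lets it ignore the D400's projected pattern, the two streams can run simultaneously without interfering with each other.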
What platforms does the T265 support?
At launch, the T265 includes support for Windows and Ubuntu. The host API library is open source, so customers can port it to any platform they want; it has been run on Android successfully. The firmware that runs inside the T265 is independent of the host platform, so the performance of the T265 is also host independent.

What are the hardware requirements?
Since the Intel RealSense T265 computes all tracking data on the device, the only hardware requirements are a USB 2 or USB 3 connection that provides 1.5 W of power, along with enough memory and compute to boot the T265, receive the pose data and use it in your application.

Can custom algorithms be added to the Intel® Movidius™ Myriad™ 2?
The optimization of the algorithm on the VPU prevents it from being shared with other code or uses. Furthermore, the thermal envelope of the T265 is only large enough for the current workload.

What is wheel odometry and how does it help the T265 navigate?
Wheel odometry is the use of sensors to measure how much a wheel turns; it can be used to estimate changes in a wheeled robot's position. The T265 has wheel odometry support built in, allowing it to use data from these sensors to refine the position estimate of the robot. Providing robotic wheel odometer or velocimeter data over USB to the T265 will make the tracking much more robust on wheeled robots, which otherwise can experience many tracking failures. We consider odometer input to be a requirement for robust tracking on wheeled robots; a sketch of feeding this data through the SDK follows below.
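As a rough, hypothetical sketch of how odometer input can be supplied through the SDK 2.0 Python wrapper, the snippet below loads an odometer calibration and then sends a linear velocity reading to the device's wheel‑odometer interface. The calibration file name, odometer index and velocity value are placeholders, and the method names follow the SDK's T265 wheel‑odometry sample from memory, so they should be verified against your installed SDK version.

```python
# Hypothetical sketch: supply wheeled-robot velocity to the T265 through the
# SDK's wheel-odometer interface (pyrealsense2). File name, odometer index and
# velocity are placeholders; check method names against your SDK version.
import pyrealsense2 as rs

pipe = rs.pipeline()
cfg = rs.config()
cfg.enable_stream(rs.stream.pose)

# Locate the T265's pose sensor and treat it as a wheel odometer. A calibration
# describing where the odometer sits relative to the camera must be loaded first.
profile = cfg.resolve(rs.pipeline_wrapper(pipe))
tm2 = profile.get_device().as_tm2()
wheel_odometer = tm2.first_pose_sensor().as_wheel_odometer()

with open("calibration_odometry.json") as f:            # placeholder file name
    calib = [ord(c) for c in f.read()]
wheel_odometer.load_wheel_odometery_calibration(calib)   # spelled this way in the SDK

pipe.start(cfg)
try:
    for frame_num in range(100):
        # Placeholder reading: robot driving forward at 0.1 m/s in its odometer frame.
        v = rs.vector()
        v.z = -0.10
        wheel_odometer.send_wheel_odometry(0, frame_num, v)  # odometer index 0

        pose = pipe.wait_for_frames().get_pose_frame()
        if pose:
            print("position:", pose.get_pose_data().translation)
finally:
    pipe.stop()
```

The velocity is interpreted in the odometer's own frame as described by the calibration file, which is why loading that calibration is a prerequisite for sending odometry.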
Can the T265 re‑localize after being kidnapped?
The Intel® RealSense™ T265 can re‑localize after kidnapping, provided there are some features in view that are in its internal map. In a completely new environment, it will continue to provide relative pose data until an absolute position can be re‑established.

Can the T265 work indoors and outdoors?
Yes, the T265 can work both indoors and outdoors. Just like a person, it can be blinded by light that shines straight into its eyes, and it can't see in absolute darkness.

Can the T265 work in low light conditions?
While well‑lit environments are preferable, the T265 performs well at light levels as low as 15 lux, and depending on the exact structure of the light it can sometimes continue to work at even lower light levels.

What is the ideal operating environment for the T265?²
An ideal operating environment for the T265 has a reasonable number of fixed, distinct visual features in view. It will perform poorly if the entire field of view contains moving, near‑field objects such as people. In cases where indoor crowds are expected, it is advised to point the camera upwards, where it can use features on the ceiling to navigate.

Do multiple T265 devices interfere with each other?
No, you can use as many T265 devices in a space as you like.

Start developing with Intel RealSense SDK 2.0
Our open‑source SDK 2.0 offers a variety of wrappers supporting popular programming languages and platforms.

Getting started is fast and easy
Our How‑Tos and video tutorials cover everything from basics to advanced techniques. Now that you've got your Intel® RealSense™ tracking camera, start using it by following our getting started guide.

1. Under 1% drift observed in repeated testing in multiple use cases and environments. AR/VR use cases were tested with the T265 mounted on the head in indoor living and office areas with typical indoor lighting, including sunlight entering the room. Wheeled robot use cases were tested with wheel odometer data integrated, again in indoor office and home environments.
2. Sufficient visibility of static tracked visual features is required; the device will not work in smoke, fog, or other conditions where the camera is unable to observe visual reference points.