10 years of news and resources for members of the IEEE Signal Processing Society
For our September 2018 issue, we cover recent patents granted in the area of simultaneous localization and mapping (SLAM), spanning both algorithm and hardware development.
Patent no. 9,886,037 relates to methods and apparatus that use a visual sensor and dead-reckoning sensors to perform simultaneous localization and mapping (SLAM). These techniques can be used in robot navigation. Advantageously, such visual techniques can autonomously generate and update a map. Unlike laser rangefinders, visual sensors are economically practical in a wide range of applications and can be used in relatively dynamic environments, such as environments in which people move. One embodiment further uses multiple particles to maintain multiple hypotheses with respect to localization and mapping. Moreover, one embodiment maintains the particles in a relatively computationally efficient manner, thereby permitting the SLAM processes to be performed in software on relatively inexpensive microprocessor-based computer systems.
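The multi-hypothesis idea can be illustrated with a minimal particle filter sketch (function names, the motion model, and the noise values are illustrative assumptions, not the patent's actual implementation):

```python
import math
import random

def motion_update(particles, d, dtheta, noise=0.05):
    """Propagate each particle's pose hypothesis (x, y, heading) through
    a dead-reckoning step of distance d and rotation dtheta, with noise."""
    out = []
    for x, y, th in particles:
        th2 = th + dtheta + random.gauss(0, noise)
        out.append((x + d * math.cos(th2), y + d * math.sin(th2), th2))
    return out

def resample(particles, weights):
    """Draw a new particle set proportionally to the weights using
    low-variance resampling, which runs in O(n) per update."""
    n = len(particles)
    step = sum(weights) / n
    r = random.uniform(0, step)
    out, c, i = [], weights[0], 0
    for m in range(n):
        u = r + m * step
        while u > c:
            i += 1
            c += weights[i]
        out.append(particles[i])
    return out
```

The O(n) low-variance resampler is one reason such filters stay cheap enough to run on modest embedded processors, as the patent summary emphasizes.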
In patent no. 9,846,042 an indoor localization system uses Visual Simultaneous Localization and Mapping (VSLAM) aided by gyroscope sensor information. Indoor environments pose several challenges that could cause a vision-only system to fail due to tracking errors. Investigation revealed significant feature loss in a vision-only system when traversing plain walls, windows, and staircases. The addition of a gyroscope helps handle such difficult conditions by providing additional rotational information. A portable system consisting of an inertial measurement unit (IMU) and a stereo camera has been developed for indoor mapping. The images and gyroscope rates acquired by the system are stored and post-processed using the Gyroscope-Assisted Scalable Visual Simultaneous Localization and Mapping algorithm.
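As a rough sketch of how gyroscope rates can supply a rotation prior when visual tracking weakens (the confidence-weighted blend and all names here are illustrative assumptions, not the patented algorithm):

```python
def integrate_gyro(rates, dt):
    """Integrate z-axis gyroscope rates (rad/s) over fixed timestep dt to
    obtain a heading change between two camera frames."""
    return sum(r * dt for r in rates)

def fuse_heading(visual_dtheta, gyro_dtheta, visual_conf):
    """Blend the visual rotation estimate with the gyro prior.
    visual_conf in [0, 1] drops toward 0 when few features are tracked
    (e.g. facing a plain wall), so the gyro term dominates there."""
    return visual_conf * visual_dtheta + (1 - visual_conf) * gyro_dtheta
```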
Patent no. 9,807,365 proposes a method and system for registering data by first acquiring the data from a scene with a sensor at different viewpoints, then extracting from the data three-dimensional (3D) points and 3D lines, along with descriptors associated with the 3D points and the 3D lines. A first set of primitives represented in a first coordinate system of the sensor is selected, wherein the first set includes at least three 3D points. A second set of primitives represented in a second coordinate system is selected, wherein the second set includes any combination of 3D points and 3D lines totaling at least three primitives. Then, using the first set of primitives and the second set of primitives, the 3D points are registered with each other, and the 3D points with the 3D lines, to obtain registered primitives, which are used in a simultaneous localization and mapping (SLAM) system.
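The point-to-point part of such a registration step is commonly solved in closed form with the Kabsch/SVD method. A minimal NumPy sketch (the function name is illustrative; this covers only the point case, not the patent's hybrid point/line registration) aligns two sets of at least three non-collinear 3D points:

```python
import numpy as np

def register_points(P, Q):
    """Estimate the rigid transform (R, t) that maps point set P onto Q,
    i.e. Q ~= P @ R.T + t, via the Kabsch/SVD closed-form solution."""
    P, Q = np.asarray(P, float), np.asarray(Q, float)
    cp, cq = P.mean(axis=0), Q.mean(axis=0)          # centroids
    H = (P - cp).T @ (Q - cq)                        # cross-covariance
    U, _, Vt = np.linalg.svd(H)
    # Correct a possible reflection so R is a proper rotation (det = +1).
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ D @ U.T
    t = cq - R @ cp
    return R, t
```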
In patent no. 9,678,509 techniques are presented that optimize performance of simultaneous localization and mapping (SLAM) processes for mobile devices, typically a mobile robot. In one embodiment, erroneous particles are introduced into the particle filtering process of localization. Monitoring the weights of the erroneous particles relative to the particles maintained for SLAM provides verification that the robot is localized and detection when it is no longer localized. In another embodiment, cell-based grid mapping of a mobile robot's environment also monitors cells for changes in their probability of occupancy. Cells with a changing occupancy probability are marked as dynamic, and updating of such cells in the map is suspended or modified until their individual occupancy probabilities have stabilized. In another embodiment, mapping is suspended when it is determined that the device is acquiring data about its physical environment in a way that would incorporate distortions into the map, for example when the robotic device is tilted.
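The dynamic-cell idea can be sketched as follows; the flip counter, the 0.5 swing threshold, and the class name are illustrative assumptions, not the patent's criteria:

```python
class DynamicAwareGrid:
    """Occupancy grid that flags cells whose probability keeps swinging
    as dynamic and suspends their map updates until further notice."""

    def __init__(self, flip_limit=3):
        self.prob = {}        # cell -> latest accepted occupancy probability
        self.flips = {}       # cell -> count of large probability swings
        self.dynamic = set()  # cells with suspended updates
        self.flip_limit = flip_limit

    def observe(self, cell, p):
        if cell in self.dynamic:
            return            # updates suspended for dynamic cells
        old = self.prob.get(cell)
        if old is not None and abs(p - old) > 0.5:
            self.flips[cell] = self.flips.get(cell, 0) + 1
            if self.flips[cell] >= self.flip_limit:
                self.dynamic.add(cell)   # mark dynamic, keep last stable value
                return
        self.prob[cell] = p
```

A cell seen alternately occupied and free (a person walking through) is frozen after a few swings, while cells observed consistently keep updating.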
Patent no. 9,674,507 discloses a system, apparatus, and method for monocular visual simultaneous localization and mapping that handle general 6DOF and panorama camera movements. A 3D map of an environment containing features with finite or infinite depth, observed in regular or panorama keyframes, is received. The camera is tracked in 6DOF from finite, infinite, or mixed feature sets. Upon detection of a panorama camera movement towards unmapped scene regions, a reference panorama keyframe with infinite features is created and inserted into the 3D map. When panoramic camera movement extends toward unmapped scene regions, the reference keyframe is extended with further dependent panorama keyframes. Panorama keyframes are robustly localized in 6DOF with respect to finite 3D map features. Localized panorama keyframes contain 2D observations of infinite map features that are matched with 2D observations in other localized keyframes. The 2D-2D correspondences are triangulated, resulting in new finite 3D map features.
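The final triangulation step corresponds to standard linear (DLT) triangulation of a 2D-2D correspondence from two localized keyframes; a minimal NumPy sketch (illustrative name, assuming normalized image coordinates and 3x4 projection matrices):

```python
import numpy as np

def triangulate(P1, P2, x1, x2):
    """Linear (DLT) triangulation: recover the finite 3D point seen at
    normalized image coordinates x1 in camera P1 and x2 in camera P2."""
    A = np.vstack([
        x1[0] * P1[2] - P1[0],
        x1[1] * P1[2] - P1[1],
        x2[0] * P2[2] - P2[0],
        x2[1] * P2[2] - P2[1],
    ])
    # The homogeneous 3D point is the null vector of A (last row of V^T).
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]
    return X[:3] / X[3]
```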
Patent no. 9,651,388 introduces a system and method for simultaneous localization and mapping (SLAM), comprising an improved Geometric Dilution of Precision (GDOP) calculation, a reduced set of feature landmarks, the use of Inertial Measurement Units (IMU) to detect measurement motion, and the use of one-time use features and absolute reference landmarks.
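For reference, a textbook (position-only, 2D) dilution-of-precision computation from unit direction vectors to landmarks looks like the following; the patent's improved GDOP calculation is not detailed in this summary, so this is only the standard baseline it builds on:

```python
import numpy as np

def gdop(landmarks, pos):
    """Dilution of precision at pos given landmark positions: build the
    geometry matrix H of unit line-of-sight vectors and return
    sqrt(trace((H^T H)^-1)). Large values mean poorly conditioned
    geometry, a natural criterion for pruning a landmark set."""
    H = []
    for lm in landmarks:
        d = np.asarray(lm, float) - np.asarray(pos, float)
        H.append(d / np.linalg.norm(d))
    H = np.asarray(H)
    return float(np.sqrt(np.trace(np.linalg.inv(H.T @ H))))
```

Two landmarks at right angles give the best-conditioned two-landmark geometry; nearly collinear landmarks inflate the value sharply.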
In patent no. 9,576,183 apparatuses and methods for fast visual simultaneous localization and mapping are described. In one embodiment, a three-dimensional (3D) target is initialized immediately from a first reference image and prior to processing a subsequent image. In one embodiment, one or more subsequent reference images are processed, and the 3D target is tracked in six degrees of freedom. In one embodiment, the 3D target is refined based on the one or more processed subsequent images.
As introduced in patent no. 9,534,899, Vector Field SLAM is a method for localizing a mobile robot in an unknown environment from continuous signals such as WiFi or active beacons. Disclosed is a technique for localizing a robot in relatively large and/or disparate areas. This is achieved by using and managing more signal sources to cover the larger area. One feature analyzes the complexity of Vector Field SLAM with respect to area size and number of signals, and then describes an approximation that decouples the localization map in order to keep memory and run-time requirements low. A tracking method for re-localizing the robot in areas already mapped is also disclosed. This allows the robot to resume after it has been paused or kidnapped, e.g. picked up and moved by a user. Embodiments of the invention can be incorporated into commercial low-cost products, including robots for the autonomous cleaning of floors.
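The measurement model behind Vector Field SLAM predicts the expected signal value at a pose by interpolating learned grid-node values; a minimal bilinear-interpolation sketch (unit grid spacing, node layout, and names are illustrative assumptions):

```python
import math

def expected_signal(nodes, x, y):
    """Bilinearly interpolate the expected continuous-signal value at
    (x, y) from the four surrounding grid-node values.
    nodes: dict mapping integer grid coordinates (i, j) to a learned
    signal value at that node (unit grid spacing assumed)."""
    i, j = math.floor(x), math.floor(y)
    fx, fy = x - i, y - j     # fractional position inside the cell
    return ((1 - fx) * (1 - fy) * nodes[(i, j)]
            + fx * (1 - fy) * nodes[(i + 1, j)]
            + (1 - fx) * fy * nodes[(i, j + 1)]
            + fx * fy * nodes[(i + 1, j + 1)])
```

Because each pose only touches the four nodes of its cell, the map naturally decouples into local pieces, which is what keeps memory and run-time low over large areas.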
If you have an interesting patent to share when we next feature patents related to SLAM, or if you are especially interested in a signal processing research field that you would want to be highlighted in this section, please send email to Csaba Benedek (benedek.csaba AT sztaki DOT mta DOT hu).
Title: Systems and methods for using multiple hypotheses in a visual simultaneous localization and mapping system
Inventors: Karlsson; L. Niklas (Pasadena, CA), Pirjanian; Paolo (Glendale, CA), Goncalves; Luis Filipe Domingues (Pasadena, CA), Di Bernardo; Enrico (Padua, IT)
Issued: February 6, 2018
Assignee: iRobot Corporation (Bedford, MA)
Title: Gyroscope assisted scalable visual simultaneous localization and mapping
Inventors: Babu; Benzun Wisely (Worcester, MA), Cyganski; David (Holden, MA), Duckworth; R. James (Worcester, MA)
Issued: December 19, 2017
Assignee: Worcester Polytechnic Institute (Worcester, MA)
Title: System and method for hybrid simultaneous localization and mapping of 2D and 3D data acquired by sensors from a 3D scene
Inventors: Cansizoglu; Esra (Malden, MA), Taguchi; Yuichi (Arlington, MA), Ramalingam; Srikumar (Cambridge, MA)
Issued: October 31, 2017
Assignee: Mitsubishi Electric Research Laboratories, Inc. (Cambridge, MA)
Title: Method and apparatus for simultaneous localization and mapping of mobile robot environment
Inventors: Sofman; Boris (Pittsburgh, PA), Ermakov; Vladimir (Santa Clara, CA), Emmerich; Mark (San Jose, CA), Alexander; Steven (Fremont, CA), Monson; Nathaniel David (Mountain View, CA)
Issued: June 13, 2017
Assignee: Neato Robotics, Inc. (Newark, CA)
Title: Monocular visual SLAM with general and panorama camera movements
Inventors: Pirchheim; Christian (Graz, AT), Schmalstieg; Dieter (Graz, AT), Reitmayr; Gerhard (Vienna, AT)
Issued: June 6, 2017
Assignee: QUALCOMM Incorporated (San Diego, CA)
Title: System and method for improved simultaneous localization and mapping
Inventors: Chapman; Mark D. (Central City, IA), Hamilton; John S. (Vinton, IA), Hight; Dalayr W. (Cedar Rapids, IA)
Issued: May 16, 2017
Assignee: Rockwell Collins, Inc. (Cedar Rapids, IA)
Title: Fast initialization for monocular visual SLAM
Inventors: Reitmayr; Gerhard (Graz, AT), Mulloni; Alessandro (Vienna, AT)
Issued: February 21, 2017
Assignee: QUALCOMM Incorporated (San Diego, CA)
Title: Re-localization of a robot for SLAM
Inventors: Gutmann; Jens-Steffen (Pasadena, CA), Fong; Philip (Los Angeles, CA), Munich; Mario E. (Sierra Madre, CA)
Issued: January 3, 2017
Assignee: iRobot Corporation (Bedford, MA)