Simultaneous Localization And Mapping (SLAM) is the robotics term for computationally constructing or updating a map of an unknown environment while at the same time keeping track of the agent's own location within that map.
Wikipedia has a more in-depth explanation, with links to several well-known algorithms such as the particle filter and the extended Kalman filter (EKF).
"SLAM algorithms are tailored to the available resources, hence not aimed at perfection, but at operational compliance. Published approaches are employed in self-driving cars, unmanned aerial vehicles, autonomous underwater vehicles, planetary rovers, newly emerging domestic robots and even inside the human body." (Wikipedia)
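To make the EKF mention above a little more concrete, here is a toy sketch of the predict/update cycle at the heart of EKF-based SLAM, assuming a 2D robot pose and a single landmark held in one joint state vector. All names, noise values, and the one-landmark setup are illustrative assumptions, not the pipeline used in the examples below; real systems track many landmarks and also handle data association and loop closure.

```python
# Toy EKF-SLAM sketch: joint state mu = [x, y, theta, lx, ly]
# (robot pose plus one landmark). Illustrative only.
import numpy as np

def wrap(a):
    """Wrap an angle to [-pi, pi)."""
    return (a + np.pi) % (2 * np.pi) - np.pi

def predict(mu, P, v, w, dt, Q):
    """Prediction step: velocity motion model applied to the robot pose."""
    x, y, th = mu[0], mu[1], mu[2]
    mu = mu.copy()
    mu[0] = x + v * np.cos(th) * dt
    mu[1] = y + v * np.sin(th) * dt
    mu[2] = wrap(th + w * dt)
    F = np.eye(5)                       # Jacobian of the motion model
    F[0, 2] = -v * np.sin(th) * dt
    F[1, 2] = v * np.cos(th) * dt
    P = F @ P @ F.T
    P[:3, :3] += Q                      # motion noise only affects the pose
    return mu, P

def update(mu, P, z, R):
    """Update step: fuse a range/bearing observation z = (r, phi) of the landmark."""
    dx, dy = mu[3] - mu[0], mu[4] - mu[1]
    q = dx * dx + dy * dy
    r = np.sqrt(q)
    z_hat = np.array([r, wrap(np.arctan2(dy, dx) - mu[2])])
    H = np.array([                      # Jacobian of the measurement model
        [-dx / r, -dy / r,  0.0,  dx / r,  dy / r],
        [ dy / q, -dx / q, -1.0, -dy / q,  dx / q],
    ])
    y_res = z - z_hat
    y_res[1] = wrap(y_res[1])
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)      # Kalman gain
    mu = mu + K @ y_res
    mu[2] = wrap(mu[2])
    P = (np.eye(5) - K @ H) @ P
    return mu, P

# Start with a confident pose and a very uncertain landmark estimate.
mu = np.array([0.0, 0.0, 0.0, 2.0, 1.0])
P = np.diag([0.01, 0.01, 0.01, 10.0, 10.0])
Q = np.diag([0.05, 0.05, 0.02])         # assumed motion noise
R = np.diag([0.1, 0.05])                # assumed measurement noise
mu, P = predict(mu, P, v=1.0, w=0.1, dt=0.1, Q=Q)
mu, P = update(mu, P, z=np.array([2.2, 0.45]), R=R)
print(mu)
```

Each observation of a landmark tightens both the landmark estimate and the robot pose, which is why the two are estimated jointly rather than separately.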
Golan Levin recently posted an impressive example of real-time visual SLAM running on an iPhone inside a car, without GPS:
Very impressive: real-time simultaneous localization & modeling (SLAM) on iPhone: https://t.co/ks2PvYV7ZP (via @atduskgreg)
— Golan Levin (@golan) December 12, 2014
13thLab (recently acquired by Oculus) has other examples of densified SLAM maps built from car-mounted iPhones.
Reference: Wikipedia, "Simultaneous localization and mapping (SLAM)"