SLAMcore was formed by a team of five co-founders, all of whom worked at Imperial College, brought together by the world-renowned computer vision expert Professor Andrew Davison (SLAMcore Director and Chair). The team has a strong mix of both academic excellence (20+ years) and industrial experience (15+ years). Last year SLAMcore officially became an Imperial College spin-out and hit the road looking for funding, so that the founders could focus on SLAMcore full-time. This led them, through connections at both an academic and a start-up level, to collaborate with the Toyota Research Institute (TRI), an organisation set up by Toyota to push forward promising technology in North America. Before long, TRI invested in SLAMcore and brought in several other investors, including Sparks in Japan and Amadeus Capital.
SLAMcore specialises in the algorithms that take information from sensors and turn it into an accurate position within a 3D space. This creates two outputs: a geometrically consistent 3D map and a position within that map. The process is called simultaneous localisation and mapping (SLAM): starting with nothing, a platform with onboard sensors can accurately locate itself relative to the world around it in real time. Previously, you could not know where you were without an accurate map to compare your position against, but you could not create that map without knowing accurately where you were. However, 20 years ago, Davison showed that you could solve this problem with a single camera, as long as it was moving, thanks to a fundamental principle known as parallax. This allows researchers to see how objects are moving relative to each other: things close to you seem to move faster than things further away, a fact that can be exploited to estimate the relative distance of objects.
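To make the parallax idea concrete, here is a minimal sketch (not SLAMcore's actual algorithms, and with made-up camera numbers): when a camera translates sideways, the apparent image shift of a point is inversely proportional to its depth, so nearby objects appear to move faster, and an observed shift can be inverted to recover depth.

```python
# Toy illustration of parallax: the apparent image shift (disparity)
# of a point is inversely proportional to its depth, so nearby
# objects appear to move faster. All numbers are hypothetical.

def disparity(focal_length_px, baseline_m, depth_m):
    """Image shift (in pixels) of a point at depth_m when the
    camera translates sideways by baseline_m."""
    return focal_length_px * baseline_m / depth_m

def depth_from_disparity(focal_length_px, baseline_m, disparity_px):
    """Invert the relation: recover depth from an observed shift."""
    return focal_length_px * baseline_m / disparity_px

f = 500.0   # focal length in pixels (hypothetical camera)
b = 0.1     # the camera moved 10 cm sideways

near = disparity(f, b, depth_m=2.0)    # 25.0 px of apparent motion
far = disparity(f, b, depth_m=20.0)    # 2.5 px of apparent motion
print(near, far)                       # the near point moves 10x faster
print(depth_from_disparity(f, b, near))  # recovers 2.0 m
```

This inverse relationship between apparent motion and distance is why a single moving camera is enough to begin estimating the 3D structure of a scene.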
“We are looking to provide this solution and already have a significant number of people working on developing algorithms that can solve this problem quickly and efficiently, with a real focus on performance on affordable hardware,” says Nicholson. “There are a number of solutions out there, but they will all fail in certain circumstances. When it comes to cars, failure is not an option. This is an issue that will really hold back the industry if we cannot overcome it.” Sensor technology has been one of the main issues in the development of autonomous vehicles, both through its performance in difficult conditions and the cost of the specialist hardware currently required. SLAMcore has developed systems with various sensors and pioneered a solution with a new type of sensor called the Event Camera, which has the potential to address many of the existing problems. “We cannot rely on solutions based on a single type of sensor, as there will always be cases where an individual one fails,” Nicholson adds. “The answer is to use various sensors and stitch the information together to make a single version of the truth, through sophisticated probabilistic modelling.”
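The “single version of the truth” Nicholson describes can be illustrated with the simplest case of probabilistic fusion (this is an illustrative sketch with invented numbers, not SLAMcore's modelling): two noisy estimates of the same quantity are combined by weighting each with the inverse of its uncertainty, as in a one-dimensional Kalman update, so the fused estimate is more certain than either sensor alone.

```python
# Minimal sketch of probabilistic sensor fusion: combine two noisy
# Gaussian estimates of the same quantity by inverse-variance
# weighting. Hypothetical numbers throughout.

def fuse(mean_a, var_a, mean_b, var_b):
    """Fuse two Gaussian estimates of the same quantity.
    The result always has lower variance than either input."""
    w_a = var_b / (var_a + var_b)   # trust a more when b is noisy
    w_b = var_a / (var_a + var_b)
    mean = w_a * mean_a + w_b * mean_b
    var = (var_a * var_b) / (var_a + var_b)
    return mean, var

# e.g. a camera-based and an IMU-based estimate of position
camera = (10.0, 4.0)   # mean 10 m, variance 4 (less confident)
imu = (12.0, 1.0)      # mean 12 m, variance 1 (more confident)

mean, var = fuse(*camera, *imu)
print(mean, var)  # 11.6, 0.8 -- pulled toward the more certain sensor
```

The key property is that even a noisy sensor still improves the fused estimate, which is why combining several imperfect sensors can beat relying on any single one.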
“We have been pioneering localisation and mapping solutions with a new type of sensor called the Event Camera, which has similarities to a normal camera but processes the light it captures in a very different way,” Nicholson said. A standard camera has an array of pixels that are exposed to light anything from 10 to 1,000 times a second. With the Event Camera, instead of all of the pixels being exposed at once, the individual pixels are completely independent, each measuring the amount of light it is receiving and only transmitting information when there is a change. “This is similar to how our eyes work,” Nicholson explains, “with no data being generated unless there is movement, eliminating the flood of useless data produced by a normal camera, which continues to transmit regardless of movement. It also provides additional benefits in difficult lighting conditions or when the platform is moving at high speed.”
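A single event-camera pixel can be modelled very simply (this is a toy sketch with an assumed contrast threshold, not a real sensor model): the pixel remembers the last brightness it reported and fires an event only when the new brightness differs from it by more than a threshold, so an unchanging scene generates no data at all.

```python
# Toy model of one event-camera pixel: it remembers the last
# log-intensity it reported and emits a +1/-1 event only when the
# change exceeds a contrast threshold. Threshold value is assumed.
import math

class EventPixel:
    def __init__(self, threshold=0.2):
        self.threshold = threshold  # log-intensity contrast threshold
        self.last_log = None

    def observe(self, intensity):
        """Return +1/-1 on a sufficient brightness change, else None."""
        log_i = math.log(intensity)
        if self.last_log is None:       # first reading: just latch it
            self.last_log = log_i
            return None
        if abs(log_i - self.last_log) > self.threshold:
            event = 1 if log_i > self.last_log else -1
            self.last_log = log_i       # re-latch at the new level
            return event
        return None                     # no change, no data

pixel = EventPixel()
readings = [100, 100, 100, 150, 150, 90]
events = [pixel.observe(i) for i in readings]
print(events)  # [None, None, None, 1, None, -1]
```

Note that the three identical readings at the start produce nothing: that silence is exactly the data saving Nicholson describes.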
The problem that automakers such as Tesla have faced is that, with a bright light and a dark object in the same scene, standard cameras are unable to see what is going on, which has led to accidents. To overcome this, developers can set a very low exposure to capture the information in the light areas, but this means the dark areas are underexposed and information is lost. Alternatively, they can set the exposure to record the dark areas, but then the light areas are overexposed. Because every pixel on the Event Camera is independent, information can be processed even when there are light and dark objects in the same scene. Nicholson also explained that if the frame rate of a standard camera is too slow, each picture starts to blur into the next, which rules out traditional computer vision because of motion blur. “With the Event Camera, you can easily detect much faster motion, such as a vibrating guitar string or a desk fan spinning at full speed.” Longer term, the Event Camera could deliver much lower-powered solutions. “As data is only generated with movement, there is a huge potential for power savings.”
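The exposure trade-off itself is easy to demonstrate with a sketch (the scene values and sensor range here are invented for illustration): a frame camera applies one exposure setting to every pixel, so in a scene containing both a very bright light and a dark object, one end of the range is always crushed or saturated.

```python
# Sketch of the global-exposure trade-off. A frame camera scales
# every pixel by the same exposure, then clips to the sensor's
# range; no single setting preserves both extremes of the scene.
# Scene values and the 0-255 sensor range are hypothetical.

def expose(scene_intensities, exposure, full_well=255):
    """Simulate one global exposure: scale, then clip to range."""
    return [min(full_well, round(i * exposure)) for i in scene_intensities]

scene = [5, 40, 200, 5000]   # dark object ... very bright light

low = expose(scene, exposure=0.05)   # keeps the light, crushes the dark
high = expose(scene, exposure=4.0)   # keeps the dark, saturates the light
print(low)    # [0, 2, 10, 250]  -- the dark object reads as 0
print(high)   # [20, 160, 255, 255] -- the bright regions are clipped
```

An event-camera pixel sidesteps this because it responds to relative change at its own operating point rather than sharing one exposure with the rest of the array.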
SLAMcore’s long-term business model sees the company becoming a major player in the automotive localisation and mapping industry. Nicholson explains that the company has to prove its technology and get it running as efficiently as possible in order to implement the software within autonomous vehicles. “Due to regulations, you need extremely high performance with no bugs or failures in the product,” he says. “Mapping and localisation in real time with autonomous cars, using sensors like ours, is critical. In the short to medium term, you will not be able to avoid having a number of sensors around the vehicle, but what we can do is make sure that you don't have to buy ridiculously expensive hardware.” SLAMcore’s technology works with most camera systems, even a cheap webcam, and the Inertial Measurement Units (IMUs) it uses are the same as the ones found in your phone. The Event Camera is still an expensive sensor, but when manufacturing is scaled up it has the potential to reach a similar price to standard cameras. Ultimately, this sensor combination could be a much cheaper alternative to LIDAR systems.
The UK punches well above its weight in terms of the quality of its academic work in science and technology. However, Nicholson tells me that we are not always so good at turning these ideas into businesses. This is where the Toyota Research Institute comes in, being a global business that knows how to grow, develop and lead. “The relationship with Toyota sprouted at the academic level, something that is now becoming common as companies start to appreciate how important it is to have good academic minds working within their corporation,” he says. TRI brought Professor John Leonard of the Massachusetts Institute of Technology (MIT), one of the leading figures in computer vision, specifically for cars, onto its team. Leonard and Davison had been colleagues in academia and, Nicholson tells me, had a lot of respect for each other’s research, which contributed to the collaboration with TRI. Off the back of this relationship, Jim Adler, Vice President at TRI, came over to London to organise the deal and agree how TRI would work with SLAMcore moving forwards.
“Toyota, and the automotive industry in general, are very good at getting one company’s software working on another’s hardware,” Nicholson stresses. “This is essential for an algorithms company like us that is trying to get software onto hardware, and it is why I think robotics is such an interesting - and difficult - space. A lot of start-ups approach it from the same standpoint as an application or service, but as soon as things leave the computer and are put onto real hardware, such as a car, they become so much more expensive and complicated. For example, if you look at the supply chain for a vehicle, there are hundreds of components from different companies that go into a single car, so it is important that we have the expertise of an automaker to help us navigate the complex landscape: who to talk to, how to protect ourselves and how to use our money as efficiently as possible.” Through the partnership, TRI has allowed SLAMcore the freedom to act independently while, at the same time, providing appropriate guidance and support when required. This is a prime example of how collaboration can benefit both sides of the fence.
“In the future, we see ourselves being the ‘go-to’ company for any OEM or integrator that is looking to have a product which moves autonomously or needs to project information in real time. If you want to have the best performing products then you will need the best possible algorithms, and that’s what we aim to provide,” Nicholson adds. “We don’t want to make a car or a drone; we want to provide solutions to all of the people that work in that space and feed into the wider supply chain. Long term, we are working with processor companies to have as much of our algorithms as possible running in silicon rather than software. Dedicated SLAM processors, or ‘SLAM on a chip’, is our ultimate goal, and we believe we have the expertise and the team required to deliver this vision.”