Deep Science: Dog detectors, Mars mappers and AI-scrambling sweaters

Research papers come out at far too rapid a rate for anyone to read them all, especially in the field of machine learning, which now affects (and produces papers in) practically every industry and company. This column aims to collect the most relevant recent discoveries and papers, particularly in but not limited to artificial intelligence, and explain why they matter.

This week in Deep Science spans the stars all the way down to human anatomy, with research concerning exoplanets and Mars exploration, as well as understanding the subtlest habits and most hidden parts of the body.

Let’s proceed in order of distance from Earth. First is the confirmation of 50 new exoplanets by researchers at the University of Warwick. It’s important to distinguish this process from discovering exoplanets among the huge volumes of data collected by various satellites. These planets had been flagged as candidates, but no one had yet determined whether the data was conclusive. The team built on previous work that ranked planet candidates from least to most likely, creating a machine learning agent that could make precise statistical assessments and say with conviction: here is a planet.
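
The core idea, planet validation by probability threshold rather than by ranking, can be sketched in a few lines. This is an illustrative toy, not the Warwick team's actual model: the hypothetical `posterior_planet_prob` function applies Bayes' rule over two hypotheses, and the 99% cutoff is an assumed threshold for "confirmed."

```python
# Illustrative sketch (not the Warwick team's actual model) of probabilistic
# planet validation: a candidate is "confirmed" only when the posterior
# probability that the signal is a real planet, rather than a false positive
# such as a background eclipsing binary, clears a strict threshold.

def posterior_planet_prob(likelihood_planet: float,
                          likelihood_false_positive: float,
                          prior_planet: float = 0.5) -> float:
    """Bayes' rule over two hypotheses: planet vs. false positive."""
    num = likelihood_planet * prior_planet
    den = num + likelihood_false_positive * (1.0 - prior_planet)
    return num / den

def validate(candidates, threshold=0.99):
    """Keep only candidates whose planet probability exceeds the threshold."""
    confirmed = []
    for name, lp, lfp in candidates:
        if posterior_planet_prob(lp, lfp) > threshold:
            confirmed.append(name)
    return confirmed

candidates = [
    ("candidate-a", 0.9, 0.001),   # strong planet-like signal
    ("candidate-b", 0.5, 0.4),     # ambiguous -- stays a candidate
]
print(validate(candidates))  # only candidate-a clears the 99% bar
```

The point of the threshold approach is that it yields a definite statistical claim per candidate, where a pure ranking can only say one candidate looks better than another.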

“A prime example when the additional computational complexity of probabilistic methods pays off significantly,” said the university’s Theo Damoulas. It’s an excellent example of a field where marquee announcements, like the Google-powered discovery of Kepler-90i, represent only the earliest results rather than a final destination, emphasizing the need for further study.

In our own solar system, we are getting to know our neighbor Mars quite well, though even the Perseverance rover, currently hurtling through the void in the direction of the red planet, is, like its predecessors, a very resource-limited platform. With a small power budget and years-old radiation-hardened CPUs, there’s only so much in the way of image analysis and other AI-type work it can do locally. But scientists are preparing for when a new generation of more powerful, efficient chips makes it to Mars.

Lidar helps uncover an ancient, kilometer-long Mayan structure

Lidar is fast becoming one of the most influential tools in archaeology, revealing in a few hours what might otherwise have taken months of machete wielding and manual measurement. The latest such discovery is an enormous Mayan structure, more than a kilometer long, 3,000 years old, and seemingly used for astronomical observations.

Takeshi Inomata of the University of Arizona is the lead author of the paper describing the monumental artificial plateau, published in the journal Nature. This unprecedented structure — by far the largest and oldest of its type — may remind you of another such discovery, the “Mayan megalopolis” found in Guatemala two years ago.

Such huge structures, groups of foundations, and other evidence of human activity may strike you as obvious. But when you’re on the ground they’re not nearly as obvious as you’d think — usually because they’re covered by both a canopy of trees and thick undergrowth.

“I have spent thousands of hours of fieldwork walking behind a local machete-wielding man who would cut straight lines through the forest,” wrote anthropologist Patricia McAnany, who was not involved in the research, in a commentary that also appeared in Nature. “This time-consuming process has required years, often decades, of fieldwork to map a large ancient Maya city such as Tikal in Guatemala and Caracol in Belize.”

You can see an aerial view of the site below. If you didn’t know there was something there, you might not notice anything more than some slightly geometric hills.

[embedded content]

Lidar detects the distance to objects and surfaces by bouncing lasers off them. Backed by powerful computational techniques, it can see through the canopy and find the level of the ground beneath, producing a detailed height map of the surface.
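
One common idea behind "seeing through" the canopy can be sketched simply: many pulses partially penetrate the foliage, so taking the lowest return in each grid cell approximates the bare-earth surface. This is a minimal, assumption-laden sketch (the `bare_earth_grid` function and its point format are invented for illustration); real archaeological pipelines use far more robust ground-classification algorithms.

```python
# Minimal sketch of bare-earth extraction from a lidar point cloud: keep the
# lowest return per grid cell as the ground estimate. Real pipelines classify
# points much more carefully, but the principle is similar.

def bare_earth_grid(points, cell_size=1.0):
    """points: iterable of (x, y, z) returns. Returns {cell: lowest z}."""
    ground = {}
    for x, y, z in points:
        cell = (int(x // cell_size), int(y // cell_size))
        if cell not in ground or z < ground[cell]:
            ground[cell] = z
    return ground

# Two returns over the same cell: one off the canopy, one off the ground.
points = [(0.2, 0.4, 27.5),   # canopy return, ~27 m up
          (0.6, 0.1, 3.1)]    # ground return beneath it
print(bare_earth_grid(points))  # {(0, 0): 3.1}
```

Rasterizing those per-cell ground heights is what produces the hillshaded terrain maps in which buried platforms stand out as geometric anomalies.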

In this case the researchers picked a large area of the Tabasco region of Mexico, on the Guatemalan border, known to have been occupied by early Mayan civilization. A large-scale, low-resolution lidar scan of the area produced some leads, and smaller areas were then scanned at higher resolution, producing the images you see here.

What emerged was an enormous ceremonial center now called Aguada Fénix, the largest feature of which is an artificial plateau more than 10 meters tall and 1.4 kilometers in length. It is theorized that these huge plateaus, of which Aguada Fénix is the oldest and largest, were used to track the movement of the sun through the seasons and perform various rites.

The high-resolution lidar map also helped accelerate other findings. For example, owing to the lack of statues or sculptures honoring contemporary leaders, the community that built Aguada Fénix “probably did not have marked social inequality” comparable to others in the 1,000-800 B.C. timeframe (established by carbon dating). That such an enormous project could have been accomplished without the backing and orders of a rich central authority — and at a time when Mayan communities were thought to be small and not yet stationary — could upend existing doctrine regarding the development of Mayan culture.

All because of advances in laser scanning technology that most think of as a way for self-driving cars to avoid pedestrians. You can read more about Aguada Fénix in Nature and this National Geographic article.

Volvo to use Luminar’s lidar in production vehicles to unlock automated driving on highways

Volvo Cars will start producing vehicles in 2022 that are equipped with lidar and a perception stack — technology developed by Silicon Valley startup Luminar that the automaker will use to deploy an automated driving system for highways.

For now, the lidar will be part of a hardware package that consumers can add as an option to their Volvo vehicle, starting with the second-generation XC90. Volvo will combine Luminar’s lidar with cameras, radar, software and back-up systems for functions such as steering, braking and battery power to enable its highway pilot feature.

Volvo, which is known for making its advanced safety features standard, sees a bigger opportunity in its partnership with Luminar. The Swedish automaker said Luminar will help it improve advanced driver assistance systems and may lead to all of its second-generation Scalable Product Architecture (SPA2) vehicles coming with lidar as a standard feature.

Luminar and Volvo didn’t reveal how much this highway pilot package might cost. Luminar has previously said its Iris lidar unit will cost less than $1,000 per unit for production vehicles seeking full autonomy and about $500 for a version used for more limited purposes like driver assistance.

The announcement is a milestone for Luminar and its whiz founder Austin Russell, who burst onto the autonomous vehicle startup scene in April 2017 after operating for years in secrecy. It also makes Volvo the first automaker to equip production vehicles with lidar — the light detection and ranging sensor that measures distance using laser light to generate a highly accurate 3D map of the world around the car.

Luminar’s Iris lidar sensors — which TechCrunch has described as about the size of a really thick sandwich and one-third smaller than previous iterations — will be integrated into the roof. The sensor’s tucked-away placement is a departure from the bucket-style spinning lidars that have become synonymous with autonomous vehicle development.

Image Credits: Volvo

Shipping a vehicle with the proper hardware and perception stack doesn’t mean customers will be able to let their Volvo take over driving on highways from the get-go. The software, which Zenuity is developing, is not yet complete, Volvo CTO Henrik Green said.

The software will be activated wirelessly once it is verified to be safe in individual geographic locations. Volvo will continue to expand the capability of the software, such as pushing up the maximum speed a vehicle can travel while driving autonomously. This hardware-first, continual-software-update strategy is similar to Tesla’s; that company has sold an automated driving package to consumers for years that has improved over time but still does not allow for so-called “full self-driving.”

“Soon, your Volvo will be able to drive autonomously on highways when the car determines it is safe to do so,” Green said. “At that point, your Volvo takes responsibility for the driving and you can relax, take your eyes off the road and your hands off the wheel. Over time, updates over the air will expand the areas in which the car can drive itself. For us, a safe introduction of autonomy is a gradual introduction.”

A turning point for lidar

Lidar sensors are considered by many automakers and tech companies an essential piece of technology to safely roll out autonomous vehicles. In the past 18 months, as the timeline to deploy commercial robotaxi fleets has expanded, automakers have turned back to developing nearer term tech for production vehicles.

“It’s a very isolated problem to solve and becomes a lot more solvable in a safe way than trying to solve autonomous driving through the inner city of Los Angeles or San Francisco,” Green said. “By narrowing the use case to those particular highways, we can bring safe autonomy into vehicles for personal use in the timeframe we’re talking about.”

Advanced driver assistance systems, or ADAS, which were pushed aside in the pursuit of fully autonomous vehicles, have become a darling once again. This has prompted a pivot within the industry, particularly among lidar companies. Dozens of lidar startups once grappling to become the supplier of choice for fully driverless vehicles are now hawking their wares for use in regular old passenger cars, trucks and SUVs. Some, such as Luminar, have developed perception software as well in an effort to diversify their business and offer a more appealing package to automakers.

The companies will deepen their collaboration to ensure Luminar’s lidar technology is validated for series production. Volvo Cars said it has signed an agreement to possibly increase its minority stake in Luminar.

Luminar built its lidar from scratch, a lengthy process that it says has resulted in a simpler design and better performance. The company made a leap forward in April 2018 with the introduction of a new lidar unit that performs better, is cheaper and is able to be assembled in minutes rather than hours. Luminar also acquired Black Forest Engineering as part of its strategy to improve the quality along with efficiency. And it opened a 136,000-square-foot manufacturing center in Orlando, Florida, where it does all of its engineering and development as well as the mass manufacturing.

The startup has continued to improve its lidar as well as attract investors. Luminar announced last year it had raised $100 million, bringing its total to more than $250 million. The company unveiled a perception platform and its compact Iris lidar unit, which will now go into the Volvo.

“This is really kind of the holy grail that we’ve been working towards for the entire course of the business,” Russell said.

As autonomy stalls, lidar companies learn to adapt

Lidar sensors are likely to be essential to autonomous vehicles, but if there are none of the latter, how can you make money with the former? Among the industry executives I spoke with, the outlook is optimistic as they unhitch their wagons from the sputtering star of self-driving cars. As it turns out, a few years of manic investment does wonders for those who have the wisdom to apply it properly.

The show floor at CES 2020 was packed with lidar companies exhibiting in larger spaces, seemingly in greater numbers than before. That seemed at odds with reports that 2019 had been a sort of correction year for the industry, so I met with executives and knowledgeable types at several companies to hear their take on the sector’s transformation over the last couple of years.

As context, 2017 was perhaps peak lidar, nearing the end of several years of nearly feverish investment in a variety of companies. It was less a gold rush than a speculative land rush: autonomous vehicles were purportedly right around the corner and each would need a lidar unit… or five. The race to invest in a winner was on, leading to an explosion of companies claiming ascendancy over their rivals.

Unfortunately, as many will recall, autonomous cars seem to be no closer today than they were then, as the true difficulty of the task dawned on those undertaking it.

Source: Gadgets – TechCrunch

Baraja’s unique and ingenious take on lidar shines in a crowded industry

It seems like every company making lidar has a new and clever approach, but Baraja takes the cake. Its method is not only elegant and powerful, but fundamentally avoids many issues that nag other lidar technologies. But it’ll need more than smart tech to make headway in this complex and evolving industry.

To understand how lidar works in general, consult my handy introduction to the topic. Essentially, a laser emitted by the device skims across or otherwise very quickly illuminates the scene, and the time it takes for the laser’s photons to return allows the device to determine, quite precisely, the distance of every spot it points at.
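
The arithmetic behind that time-of-flight measurement is simple enough to write down: the pulse travels to the target and back, so range is half the round-trip time multiplied by the speed of light. A minimal sketch (the function name is mine, not any vendor's API):

```python
# Time-of-flight arithmetic behind any pulsed lidar: a pulse travels out and
# back, so the range is half the round-trip time times the speed of light.

C = 299_792_458.0  # speed of light in a vacuum, m/s

def range_from_round_trip(t_seconds: float) -> float:
    """Distance to the target given the round-trip time of one pulse."""
    return C * t_seconds / 2.0

# A return after ~667 nanoseconds corresponds to a target about 100 m away.
print(round(range_from_round_trip(667e-9), 1))
```

The nanosecond scale of those round trips is why lidar timing electronics, and the pulse rates Baraja discusses below, matter so much.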

But to picture how Baraja’s lidar works, you need to picture the cover of Pink Floyd’s “Dark Side of the Moon.”

GIFs kind of choke on rainbows, but you get the idea.

Imagine a flashlight shooting through a prism like that, illuminating the scene in front of it — now imagine you could focus that flashlight by selecting which color came out of the prism, sending more light to the top part of the scene (red and orange) or middle (yellow and green). That’s what Baraja’s lidar does, except naturally it’s a bit more complicated than that.

The company has been developing its tech for years with the backing of Sequoia and Australian VC outfit Blackbird, which led a $32 million round late in 2018 — Baraja only revealed its tech the next year and was exhibiting it at CES, where I met with co-founder and CEO Federico Collarte.

“We’ve stayed in stealth for a long, long time,” he told me. “The people who needed to know already knew about us.”

The idea for the tech came out of the telecommunications industry, where Collarte and co-founder Cibby Pulikkaseril thought of a novel use for a fiber optic laser that could reconfigure itself extremely quickly.

“We thought if we could set the light free, send it through prism-like optics, then we could steer a laser beam without moving parts. The idea seemed too simple — we thought, ‘if it worked, then everybody would be doing it this way,’ ” he told me, but they quit their jobs and worked on it for a few months with a friends and family round anyway. “It turns out it does work, and the invention is very novel and hence we’ve been successful in patenting it.”

Rather than send a coherent laser at a single wavelength (1550 nanometers, well into the infrared, is the lidar standard), Baraja uses a set of fixed lenses to refract that beam into a spectrum spread vertically over its field of view. Yet it isn’t one single beam being split but a series of coded pulses, each at a slightly different wavelength that travels ever so slightly differently through the lenses. It returns the same way, the lenses bending it the opposite direction to return to its origin for detection.

It’s a bit difficult to grasp this concept, but once one does it’s hard to see it as anything but astonishingly clever. Not just because of the fascinating optics (which I’m partial to, if it isn’t obvious), but because it sidesteps a number of serious problems other lidars are facing or about to face.

First, there are next to no moving parts in the entire Baraja system. Spinning lidars like the popular early devices from Velodyne are largely being replaced by ones using metamaterials, MEMS and other methods that don’t have bearings or hinges that can wear out.

Baraja’s “head” unit, connected by fiber optic to the brain.

In Baraja’s system, there are two units, a “dumb” head and an “engine.” The head has no moving parts and no electronics; it’s all glass, just a set of lenses. The engine, which can be located nearby or a foot or two away, produces the laser and sends it to the head via a fiber-optic cable (and some kind of proprietary mechanism that rotates slowly enough that it could theoretically work for years continuously). This means the system is not only very robust physically, but its volume can be spread out wherever is convenient in the car’s body. The head itself can also be resized more or less arbitrarily without significantly altering the optical design, Collarte said.

Second, the method of diffracting the beam gives the system considerable leeway in how it covers the scene. Different wavelengths are sent out at different vertical angles; a shorter wavelength goes out toward the top of the scene and a slightly longer one goes a little lower. But the band of 1550 +/- 20 nanometers allows for millions of fractional wavelengths that the system can choose between, giving it the ability to set its own vertical resolution.

It could for instance (these numbers are imaginary) send out a beam every quarter of a nanometer in wavelength, corresponding to a beam going out every quarter of a degree vertically, and by going from the bottom to the top of its frequency range cover the top to the bottom of the scene with equally spaced beams at reasonable intervals.

But why waste a bunch of beams on the sky, say, when you know most of the action is taking place in the middle part of the scene, where the street and roads are? In that case you can send out a few high frequency beams to check up there, then skip down to the middle frequencies, where you can then send out beams with intervals of a thousandth of a nanometer, emerging correspondingly close together to create a denser picture of that central region.

If this is making your brain hurt a little, don’t worry. Just think of Dark Side of the Moon and imagine if you could skip red, orange, and purple, and send out more beams in green and blue — and since you’re only using those colors, you can send out more shades of green-blue and deep blue than before.
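
The steps above can be sketched in code. Everything numeric here is illustrative: the linear wavelength-to-angle map, the field of view, and the step sizes are assumptions for the sketch, not Baraja's actual optics. The point is only that choosing which wavelengths to fire sets the vertical sampling pattern — sparse toward the sky, dense near the horizon.

```python
# Sketch of wavelength steering with adaptive vertical resolution. The band,
# field of view, and linear dispersion model are all invented for
# illustration; they are not Baraja's real parameters.

BAND_LOW, BAND_HIGH = 1530.0, 1570.0   # nm, i.e. 1550 +/- 20
FOV_TOP, FOV_BOTTOM = 15.0, -15.0      # degrees (assumed field of view)

def angle_for_wavelength(wl_nm: float) -> float:
    """Linear wavelength-to-angle map: shorter wavelengths aim higher."""
    frac = (wl_nm - BAND_LOW) / (BAND_HIGH - BAND_LOW)
    return FOV_TOP + frac * (FOV_BOTTOM - FOV_TOP)

def scan_pattern():
    """Coarse 1 nm steps at the band edges, fine 0.1 nm steps mid-band."""
    wavelengths = []
    wl = BAND_LOW
    while wl <= BAND_HIGH:
        step = 0.1 if 1545.0 <= wl < 1555.0 else 1.0
        wavelengths.append(wl)
        wl = round(wl + step, 3)
    return [(wl, angle_for_wavelength(wl)) for wl in wavelengths]

pattern = scan_pattern()
near_horizon = [a for _, a in pattern if abs(a) <= 4]
print(len(pattern), len(near_horizon))  # most beams cluster mid-scene
```

Shrinking the mid-band step from 0.1 nm toward a thousandth of a nanometer is, in this toy model, exactly the "denser picture of that central region" described above.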

Third, the method of creating the spectrum beam protects against interference from other lidar systems. It is an emerging concern that lidar systems of the same type could inadvertently send or reflect beams into one another, producing noise and hindering normal operation. Most companies are attempting to mitigate this by some means or another, but Baraja’s method avoids the possibility altogether.

“The interference problem — they’re living with it. We solved it,” said Collarte.

The spectrum system means that for a beam to interfere with the sensor it would have to be both a perfect frequency match and come in at the precise angle at which that frequency emerges from and returns to the lens. That’s already vanishingly unlikely, but to make it astronomically so, each beam from the Baraja device is not a single pulse but a coded set of pulses that can be individually identified. The company’s core technology and secret sauce is the ability to modulate and pulse the laser millions of times per second, and it puts this to good use here.
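
The coded-pulse filter can be sketched as a simple code-matching check. The codes and the matching rule below are invented for illustration (Baraja's actual modulation scheme is proprietary); the sketch just shows why a stray pulse fails even when wavelength and angle happen to line up.

```python
# Illustrative sketch of coded-pulse interference rejection (the codes and
# matching rule are made up): a return is accepted only if its pulse pattern
# matches the code the sensor actually transmitted on that beam.

def matches(expected_code, received, threshold=0.9):
    """Accept when the fraction of agreeing pulse slots clears the threshold."""
    agree = sum(1 for e, r in zip(expected_code, received) if e == r)
    return agree / len(expected_code) >= threshold

our_code = [1, 0, 1, 1, 0, 0, 1, 0, 1, 1]

own_return = list(our_code)                     # echo of our own beam
interference = [1, 1, 0, 1, 0, 1, 1, 0, 0, 1]   # someone else's pulses

print(matches(our_code, own_return))    # True  -- accepted
print(matches(our_code, interference))  # False -- rejected as interference
```

Layered on top of the already-unlikely wavelength-and-angle coincidence, a per-beam code makes accidental interference astronomically improbable, which is the claim Collarte is making.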

Collarte acknowledged that competition is fierce in the lidar space, but not necessarily competition for customers. “They have not solved the autonomy problem,” he pointed out, “so the volumes are too small. Many are running out of money. So if you don’t differentiate, you die.” And some have.

Instead companies are competing for partners and investors, and must show that their solution is not merely a good idea technically, but that it is a sound investment and reasonable to deploy at volume. Collarte praised his investors, Sequoia and Blackbird, but also said that the company will be announcing significant partnerships soon, both in automotive and beyond.

Source: TechCrunch