“The physical world is where we have most of our problems, because it is so complex and fast moving that things are beyond our perception to fully understand,” says Brandon Barbello, a cofounder who is also Archetype’s COO. “We put sensors in all kinds of things to help us, but sensor data is too difficult to interpret. There’s a potential to use AI to understand that sensor data—then we can finally understand these problems and solve them.”

When I visited Archetype’s founding team of five, currently working out of a cramped room in the Palo Alto office of its lead funder, venture capital firm Venrock, they showed me some illuminating demos that, they assured me, only hinted at Newton’s vast potential impact. They placed a motion sensor inside a box and prompted Newton to imagine that the container was an Amazon package with fragile cargo that should be carefully monitored. When the box was dropped, the display running the model broke the news that the package might be damaged. One can easily imagine a shipment of vaccines, tracked with motion, temperature, and GPS sensors, being monitored to verify that it will arrive at full effectiveness.
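To make that demo concrete, here is a minimal sketch of the kind of sensor-to-language pipeline it implies: raw accelerometer readings are scanned for a freefall-then-impact pattern and translated into a plain-language alert. Archetype has not published Newton’s API, so the function names, thresholds, and wording below are purely illustrative.

```python
# Hypothetical sketch of the box-drop demo: detect a likely drop from raw
# accelerometer magnitudes and turn it into a plain-language status report.
# Archetype has not published Newton's API; thresholds and names are invented.

import math

def detect_drop(samples, freefall_g=0.3, impact_g=3.0):
    """Flag a drop: a stretch of near-freefall followed by a hard impact.

    samples: list of (x, y, z) accelerometer readings in g.
    """
    magnitudes = [math.sqrt(x * x + y * y + z * z) for x, y, z in samples]
    saw_freefall = False
    for m in magnitudes:
        if m < freefall_g:
            saw_freefall = True          # sensor is in near-freefall (~0 g)
        elif saw_freefall and m > impact_g:
            return True                  # freefall followed by a sharp spike = impact
    return False

def describe(samples, context="an Amazon package with fragile cargo"):
    if detect_drop(samples):
        return f"Alert: {context} appears to have been dropped; contents may be damaged."
    return f"{context} is being handled normally."

# Simulated readings: resting (~1 g), freefall (~0 g), impact spike, resting.
readings = [(0, 0, 1.0)] * 5 + [(0, 0, 0.05)] * 4 + [(0.5, 0.2, 4.2)] + [(0, 0, 1.0)] * 5
print(describe(readings))
```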

One key use case is using Newton “to talk to a house or chat with a factory,” says Barbello. Instead of needing a complex dashboard or custom-built software to make sense of the data from a home or industrial facility wired with sensors, you can have Newton tell you what’s happening in plain language, ChatGPT style. “You’re no longer looking sensor by sensor, device by device, but you actually have a real-time mirror of the whole factory,” Barbello says.
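Archetype hasn’t detailed how Newton does this, but a toy version of “chatting with a factory” might look like the sketch below, which collapses a few hypothetical sensor streams into one plain-language status line rather than a per-sensor dashboard. The sensor names, readings, and summary wording are invented.

```python
# A hedged sketch of "chatting with a factory": fuse a few named sensor
# streams into one plain-language status line instead of a per-sensor dashboard.
# Sensor names, readings, and phrasing are illustrative only.

from statistics import mean

def summarize_factory(streams):
    """streams: dict mapping sensor name -> recent numeric readings."""
    lines = []
    for name, readings in streams.items():
        avg = mean(readings)
        trend = "rising" if readings[-1] > readings[0] else "steady or falling"
        lines.append(f"{name}: average {avg:.1f}, currently {trend}")
    return "Factory status right now: " + "; ".join(lines) + "."

snapshot = {
    "conveyor motor temperature (°C)": [61.2, 63.8, 66.5, 70.1],
    "line 2 vibration (mm/s)": [2.1, 2.0, 2.2, 2.1],
    "packaging throughput (units/min)": [118, 117, 119, 118],
}
print(summarize_factory(snapshot))
```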

Archetype’s AI model Newton takes in data from different sensors, combines it, and converts it into plain-language descriptions of what’s happening in the physical world.

Courtesy of Archetype

Naturally, Amazon—owner of some of the world’s most digitally sophisticated logistics operations—is one of Archetype’s backers, through its Industrial Innovation Fund. “This has the potential to further optimize the flow of goods through our fulfillment centers and improve the speed of delivery for customers, which is obviously a big goal for us,” says Franziska Bossart, who heads the fund. Archetype is also exploring the health care market. Stefano Bini, a professor in UC San Francisco’s Department of Orthopaedic Surgery, has been working with sensors that can assess a patient’s recovery after knee replacement surgery. Newton might help him in his quest for a single metric, perhaps drawn from multiple sensors, that “can literally measure the impact of any intervention in health care,” he says.
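A speculative sketch of that “single metric” idea, assuming a handful of wearable-sensor signals: each signal is normalized and weighted into one 0–100 recovery score. The features, ranges, and weights below are invented for illustration and are not Bini’s or Archetype’s actual method.

```python
# A speculative sketch of a single recovery metric built from several
# sensor-derived signals after knee replacement surgery. All features,
# normalization targets, and weights are invented for illustration.

def recovery_score(daily_steps, knee_flexion_deg, gait_symmetry, pain_report):
    """Return a 0-100 score from a handful of wearable-sensor signals."""
    def clamp01(v):
        return max(0.0, min(1.0, v))

    features = {
        "activity": clamp01(daily_steps / 8000),     # steps vs. a healthy target
        "mobility": clamp01(knee_flexion_deg / 120),  # range of motion vs. ~120 degrees
        "gait":     clamp01(gait_symmetry),           # 0 = limping, 1 = symmetric
        "comfort":  clamp01(1 - pain_report / 10),    # self-reported pain, 0-10 scale
    }
    weights = {"activity": 0.3, "mobility": 0.3, "gait": 0.25, "comfort": 0.15}
    return 100 * sum(weights[k] * v for k, v in features.items())

print(f"Recovery score: {recovery_score(5200, 95, 0.8, 3):.0f}/100")
```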

Another early Archetype client is Volkswagen, which is running initial tests of Archetype’s model. Surprisingly, these don’t involve autonomous driving, though Archetype very much wants its technology to be used for that. One Volkswagen experiment involves a scenario where a car’s sensors analyze movement, perhaps in concert with a sensor on the driver’s person, to figure out when its owner is returning from the store and needs an extra hand. “If we recognize human intention in that scenario, I can automatically open that back gate, and maybe place my stuff into specially heated or cooled locations,” says Brian Lathrop, senior principal scientist at Volkswagen’s Silicon Valley innovation center. That mundane task, Lathrop believes, is just the beginning of what becomes possible when AI can digest reams of sensor data into human-centric insights. Volkswagen’s interests include the safety of people outside vehicles as well as passengers and drivers. “What happens when you network all those cameras from those millions of vehicles on the roadway, sitting in parking lots, on driveways?” he says. “If you have AI looking at all these data feeds, it opens up an incredible amount of possibilities and use cases.”
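As a hypothetical illustration of that tailgate scenario, the sketch below reduces a few sensor cues (distance to the car, direction of travel, and an estimated “carrying a load” score) to a single decision to open the hatch. Neither Volkswagen nor Archetype has published this logic; every name and threshold here is invented.

```python
# A hypothetical sketch of the tailgate scenario: infer "owner approaching
# with a heavy load" from a few sensor cues and trigger a single action.
# The cues, thresholds, and behavior are invented for illustration.

def owner_needs_hand(distance_m, approaching, carrying_load_score):
    """Return True when cues suggest the owner is walking up with full hands.

    distance_m: metres between the key fob/phone and the car.
    approaching: whether that distance has been shrinking over recent samples.
    carrying_load_score: 0..1 estimate from gait/arm-motion sensors.
    """
    return distance_m < 5.0 and approaching and carrying_load_score > 0.7

if owner_needs_hand(distance_m=3.2, approaching=True, carrying_load_score=0.85):
    print("Opening rear hatch and readying the heated/cooled compartment.")
else:
    print("No action taken.")
```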

It’s not hard to imagine the dark side of a trillion-sensor monitoring system providing instant answers to questions about what’s happening at any location in its dense network. When I mention to Poupyrev and Barbello that this seems a trifle dystopian, they assure me they’ve thought of this. As opposed to cameras, they say, radar and other sensor data is more benign. (Camera data, however, is one of the sensor inputs that Archetype can process.) “The customers we are working with are focusing on solving their specific problems with a broad variety of sensors without affecting privacy,” says Poupyrev. Volkswagen’s Lathrop agrees. “When we’re using Archetype software, I’m detecting behavior, not identity. If someone walks up to my wife and tries to grab her purse, that’s a behavior you can detect without identifying the person.” On the other hand, there’s evidence that the way people walk—something high-quality radar might well detect—is as distinctive as a fingerprint. Just sayin’.
