11/21/2024 | Press release | Archived content
Planet joined The Mapscaping Podcast to discuss our mission, what our data enables organizations to do, and how AI can be leveraged to do it better.
The Mapscaping Podcast, hosted by Daniel O'Donohue, is a great resource for anyone interested in the important work being done across the geospatial industry. Recently, Justin Davis, Senior Product Manager of Analytics at Planet, joined the show to talk about our current capabilities, specifically how we are leveraging artificial intelligence (AI). Read on for the highlights, or check out the full episode for all the details.
Throughout Justin's time at Planet, he has focused on computer vision, machine learning, and AI services, with the goal of "making it easier for our users to get value out of this massive amount of data that we're generating."
"Regardless of your industry at this point," he shared, "generative AI and … the revolution that has been happening in artificial intelligence is going to impact what you do." Here are some examples of how Planet is leveraging AI today.
Synthetic data is just what it sounds like: artificially generated data. It's created using a number of physical process models and is used by many major technology companies for purposes ranging from building guidance systems for autonomous vehicles to developing insights for the financial sector.
Synthetic data is a huge part of the AI conversation right now as it can be used to quickly bootstrap training datasets for computer vision models. It's a quick way to understand how well a model performs or to generate a lot of data to create a new model. As it relates to Planet, you still need actual observations (satellite imagery in this case) to power and validate these models.
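The bootstrapping idea above can be shown with a minimal, self-contained sketch. This is purely illustrative and is not Planet's or Rendered.ai's actual workflow: the "reflectance" simulator, the surface classes, and all numbers are hypothetical. The point is the pattern: train entirely on simulated samples, then check the model against a handful of real observations.

```python
import random

random.seed(0)

# Hypothetical simulator: a 1-D "reflectance" reading for two surface types.
# The spectral means (0.2, 0.7) and noise level are assumed for illustration.
def synth_sample(kind):
    base = 0.2 if kind == "vegetation" else 0.7
    return base + random.gauss(0, 0.05)

# Bootstrap a training set entirely from the simulator -- no field data yet.
train = [(synth_sample(k), k) for k in ("vegetation", "bare_soil") for _ in range(100)]

# "Train" a trivial threshold classifier on the synthetic samples:
# split halfway between the two class means observed in training.
veg = [x for x, k in train if k == "vegetation"]
soil = [x for x, k in train if k == "bare_soil"]
threshold = (sum(veg) / len(veg) + sum(soil) / len(soil)) / 2

def predict(x):
    return "vegetation" if x < threshold else "bare_soil"

# As the text notes, actual observations are still needed to validate:
# here, a (made-up) handful of real readings with known ground truth.
real_observations = [(0.18, "vegetation"), (0.72, "bare_soil"), (0.25, "vegetation")]
accuracy = sum(predict(x) == k for x, k in real_observations) / len(real_observations)
```

Even in this toy form, the two roles are visible: synthetic data gets a working model quickly, while real imagery remains the final arbiter of whether it performs.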
As part of Planet's groundbreaking hyperspectral offering, the Planet Hyperspectral team leveraged synthetic data to facilitate product development in advance of launching the Tanager satellites into orbit. To do this, Planet partnered with Rendered.ai to develop and build out synthetic scenes. The simulated hyperspectral data provided by Rendered.ai allowed Planet customers to provide early feedback on their experiences with the hyperspectral offering prior to launch.
Supervised and unsupervised machine learning are common training models used in AI. Essentially, supervised machine learning uses labeled data, whereas unsupervised machine learning does not. In the podcast, Justin explores both approaches.
In supervised machine learning, most of the data needs to be labeled from the start so the model can understand what it's trying to find. Broadly defining the objects the model is looking for leaves room for deliberate improvements later, and helps you avoid being pigeonholed into a narrow definition that misses valuable use cases after months spent training a model. While this is a tried-and-true method, Justin shared that "it kind of locks you into a definition of what you're trying to do."
For example, with Road and Building Change Detection, Planet initially labeled roads as anything a truck could safely and reasonably drive on, and buildings as anything a human could stand underneath. From there, Justin shared that the team was able to iterate based on customer feedback.
With unsupervised machine learning, you have a bit more freedom. Rather than being told exactly what to look for up front, the model analyzes the data as a whole and groups items with similar characteristics. For example, instead of looking for roads as you defined them, the model would naturally group imagery containing roads based on shared attributes. With this approach, you can work backwards fairly easily, deciding what each group represents after the fact.
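The contrast between the two approaches can be sketched in a few lines. This is a deliberately tiny illustration, not anything resembling Planet's models: the 2-D "pixel features", class names, and both algorithms (a nearest-centroid classifier for the supervised case, a few iterations of k-means for the unsupervised case) are stand-ins chosen for brevity.

```python
def centroid(points):
    xs, ys = zip(*points)
    return (sum(xs) / len(xs), sum(ys) / len(ys))

def dist2(a, b):
    return (a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2

# --- Supervised: labels are provided up front ---------------------------
labeled = {
    "road":     [(0.1, 0.2), (0.2, 0.1), (0.15, 0.15)],
    "building": [(0.9, 0.8), (0.8, 0.9), (0.85, 0.85)],
}
centroids = {label: centroid(pts) for label, pts in labeled.items()}

def classify(point):
    """Assign the nearest labeled class (nearest-centroid classifier)."""
    return min(centroids, key=lambda label: dist2(point, centroids[label]))

# --- Unsupervised: the model groups similar points itself ---------------
def kmeans(points, k=2, iters=10):
    """A few iterations of k-means: group by similarity, no labels needed."""
    centers = points[:k]  # naive initialization for illustration
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            idx = min(range(k), key=lambda i: dist2(p, centers[i]))
            clusters[idx].append(p)
        centers = [centroid(c) if c else centers[i] for i, c in enumerate(clusters)]
    return clusters

# The unlabeled data still separates into two groups; naming those groups
# ("these are roads") happens afterwards, as described above.
groups = kmeans([(0.1, 0.2), (0.9, 0.8), (0.2, 0.1), (0.8, 0.9)])
```

Note where the human effort sits in each case: the supervised version front-loads it into labeling, while the unsupervised version defers it to interpreting the clusters after the fact.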
Justin also shared examples of how Planet satellite data is being used to train AI models. Planet partner RAIC Labs (formerly known as Synthetaic) leveraged Planet's daily satellite imagery and archive data to build and train an AI model that identified the Chinese balloon that captured national attention as it flew across North America, and traced its flight path back to its origin.
New developments throughout the industry are advancing what's possible even beyond similarity searches and image classification. There are signs that, via novel AI approaches, we may be able to derive what's actually happening in any given area through time.
Justin envisions the possibility of a foundational model of satellite imagery that organizations and individuals could adapt, similar to how large language models are being used now. Although he notes it would take a lot of data to train and create this model initially, it could be a possibility one day based on what's already developing in the market.
When asked about the most exciting (and most challenging) things on the horizon, Justin shared that, at its core, Planet is still trying to do what it set out to do over a decade ago: keep up with how the world is changing. And that is both an exciting and evolving challenge.
At Planet, we like to say, "You can't measure what you can't see." That saying is very fitting as we look to the future. We continue to image the Earth (almost) every day, but the tools and techniques we utilize to do so continue to transform. And Justin's work is a continual process of deciding where to stop, what to dive deeper into, and what to productize for broader use.
"We have this treasure trove of data that is sitting there begging for someone… to find all the gold in it," Justin said.
Catch the full 45-minute episode with Justin Davis and Daniel O'Donohue to learn more about Planet products, use cases, and how it all comes together on Planet Insights Platform.