Today, more data is being generated than at any previous point in history. In the short time it will take you to read this post (provided you read at a fairly average pace of 250 words per minute), nearly 10 petabytes* of data will be created worldwide.
*At the estimated rate of 2.5 exabytes of new data created every day, approximately 1,736 terabytes of data are produced every minute. If you read at approximately 250 words per minute (WPM), this blog of ~1,400 words should take about 5.6 minutes to read. Therefore, 5.6 minutes × 1,736 terabytes per minute ≈ 9,722 terabytes, or ~10 petabytes.
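For readers who like to check the math, the footnote's back-of-envelope calculation can be reproduced in a few lines. All figures come from the post itself (2.5 exabytes per day, 250 WPM, ~1,400 words); nothing else is assumed.

```python
# Back-of-envelope check of the footnote's numbers.
TB_PER_EXABYTE = 1_000_000            # terabytes in one exabyte

daily_tb = 2.5 * TB_PER_EXABYTE       # 2.5 exabytes/day, in TB
per_minute_tb = daily_tb / (24 * 60)  # TB created per minute

reading_minutes = 1400 / 250          # ~1,400 words at 250 WPM

total_tb = per_minute_tb * reading_minutes

print(round(per_minute_tb))           # 1736 TB per minute
print(round(total_tb))                # 9722 TB, i.e. ~10 petabytes
```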
This explosion of data sources is a new, unique problem for the geospatial intelligence community, a challenge Lt. Gen. Vincent R. Stewart described during his GEOINT 2017 keynote.
The remote sensing industry is a prime example of how technological innovation creates more information. Over the past few years, lower costs to launch satellites combined with advances in component miniaturization in the electronics industries have led to the small satellite revolution.
Established players like DigitalGlobe, with our upcoming Scout and WorldView Legion constellations, and new players like Planet, AstroDigital (which successfully launched its first two satellites in July) and Urthecast, are sending lower-resolution, high-revisit small satellites into space. The common goal of all these players is to generate the most current information possible.
Small satellites, high-resolution constellations, synthetic aperture radar (SAR) satellite systems (we announced the addition of MDA’s RADARSAT-2 data on DigitalGlobe’s Geospatial Big Data platform during GEOINT 2017) and government satellite systems are generating an unprecedented amount of information about the Earth each day.
In addition to those sources, more location-based information is generated every second: by end users contributing to OpenStreetMap, and by mobile devices, tracking systems and the internet of things (IoT). With all the geospatial and non-geospatial data out there, the possibilities for how these datasets can inform each other are endless.
How do we wrangle all that data to make it actionable for the GEOINT community?
For almost three years, DigitalGlobe has been investing in GBDX, the platform we created that dramatically simplifies discovery, interaction and analysis across vast amounts of diverse remote-sensing and location-based information.
A key enabler of GBDX is simplified access to disparate data types through a single mechanism. Thus far we've added the entire 100-petabyte DigitalGlobe archive of high-resolution imagery, covering the surface of the planet over the past 17 years, as well as the Landsat-8 archive, Sentinel-2 data and MDA RADARSAT-2 data. Our goal is to add more sources of remote-sensing content over time.
But we didn’t stop there. Vector Services, a component of GBDX, also contains many vector datasets, including Gazetteer data, OpenStreetMap data (updated daily), social media data, GDELT, Anthrometer, ACLED—and other unique vector-based datasets highly useful when performing analysis. We’ve also made it effortless to add data, so that organizations can bring their own data types directly into the environment.
Making it easy
Lots and lots of satellite imagery, radar data and vector data all in one place—that’s fantastic. But dealing with overwhelming amounts of data is hard, right? How is it any easier with it in one place? Do I have to learn an entirely new system?
A critical challenge for organizations faced with an explosion of information sources is access. Interacting with multiple disparate systems to reach all the appropriate data creates technical hurdles that reduce the value of the new sources. Some organizations store copies of the information locally in their own environments, increasing both technical complexity and storage costs.
By centralizing and standardizing access to the largest (and most rapidly growing) set of geospatial content available, our GBDX platform provides organizations with an elegant, simple way to access rich content now and richer content going forward, eliminating the complexities described above.
Additionally, organizations don’t need to worry about licensing, entitlements or where the data lives. They can focus on getting the value out of the data without having to make huge investments to integrate with multiple data providers and individual delivery mechanisms or worry about how and where to store this firehose of data.
Simplifying access is only the first benefit. DigitalGlobe recognizes there are many different user personas who currently work with, or want to work with, these types of data. At GEOINT 2017, I gave a talk, "Location Intelligence—Answers at Your Fingertips: Accessing the DigitalGlobe Platform," in which I discussed the methods we make available to users to take advantage of this data.
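To make the "developers" path concrete, here is a hypothetical sketch of how programmatic imagery discovery against the platform might look, modeled loosely on the open-source gbdxtools Python client. The `bbox_to_wkt` helper, the coordinates, the dates and the exact `catalog.search` call shown in the comment are all illustrative assumptions, not documented API behavior, and a real run would require GBDX credentials; only the query-building step executes here.

```python
# Hypothetical sketch: build the inputs for a GBDX-style catalog
# search. The helper and all values below are illustrative only.

def bbox_to_wkt(min_lon, min_lat, max_lon, max_lat):
    """Build a WKT polygon for a lon/lat bounding box (closed ring)."""
    ring = [
        (min_lon, min_lat), (max_lon, min_lat),
        (max_lon, max_lat), (min_lon, max_lat),
        (min_lon, min_lat),  # repeat first point to close the ring
    ]
    coords = ", ".join(f"{lon} {lat}" for lon, lat in ring)
    return f"POLYGON (({coords}))"

# Example area of interest: a small box around Denver, CO (assumed)
aoi_wkt = bbox_to_wkt(-105.1, 39.6, -104.8, 39.9)
print(aoi_wkt)

# With credentials configured, a gbdxtools-style search might look like:
# from gbdxtools import Interface
# gbdx = Interface()
# results = gbdx.catalog.search(searchAreaWkt=aoi_wkt,
#                               startDate="2017-01-01T00:00:00.000Z",
#                               endDate="2017-06-30T00:00:00.000Z")
```

The point of the sketch is the shape of the workflow: describe an area of interest once, in a standard format like WKT, and let the platform resolve which sensors and archives hold matching scenes.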
The right tools for the users
Several types of users work with remote sensing today. We’ve found that all these users want to be able to discover, interact with, and analyze remote sensing data—but each within their own preferred workflows and tools.
- Developers: Within software development efforts via APIs, pre-built development libraries and within deep learning frameworks.
- GIS analysts: Within modern and leading-edge GIS software tools.
- Imagery analysts: Within remote sensing and imagery analysis software tools.
- Data scientists: Within modern data science software tools and methods.
- Consumers: In a simple, intuitive, easy-to-use, web-based interface that enables them to discover and interact with the data and results of analysis performed by the above users.
What are we doing to make this a reality?
For GIS users, we announced an incredibly exciting partnership with Esri at the 2017 Esri User Conference. GIS users on Esri's new ArcGIS Enterprise 10.5 product can now have a direct connection to DigitalGlobe's platform via our Imagery+Analytics subscription. This enables organizations to directly access our entire imagery archive and run advanced analytics from Harris Geospatial, CrowdAI and other cutting-edge analytic providers directly on our data, all from within the world-class Esri tools these users rely on daily.
For imagery analysts, we partnered with Harris Geospatial to bring many capabilities of the Harris ENVI analytic suite to GBDX for advanced multispectral imagery analysis. We also enhanced the platform so that serving these results back to users, within the desktop tools they already use, is simple.
For data scientists, we’re incredibly excited about our acquisition of Timbr.io in 2016. We can’t wait to unveil a new and exciting way to analyze and interact with data for developers and data scientists later this year, which will make working with DigitalGlobe’s GBDX even easier.
For consumers who want to easily interact with our content, we offer a new Discover interface. It makes discovering and previewing our imagery simple and straightforward. This is just one of many initiatives DigitalGlobe is pursuing to make interacting with the best remote-sensing content, curated data, information layers and analysis results easy to find and use.
Our founder Dr. Walter Scott compares what DigitalGlobe is becoming to Amazon. When you shop on Amazon, you can explore and purchase myriad products, from toothpaste to auto parts and everything in between. You don't need to think about where the products are warehoused, stocked, packaged or prepared for shipment. DigitalGlobe is working to create a similar experience with geospatial information from multiple sources in one trusted, convenient location. We make it easy for customers to access an array of products and data from disparate sources, and we set it up so the customer doesn't have to worry about licensing the information or download times. In the end, our goal is to become the ultimate wrangler of data so customers can unlock actionable insight for their business.