Extracting Big Data insights should be simple

Researchers and data scientists working on paradigm-changing problems such as mapping the human genome should be able to focus on what they do best—the science—and not have to concern themselves with the logistics of setting up a cluster to solve such problems.

They now can.

Leverage a powerful platform

Pivotal Analytics Workbench is the largest publicly available Hadoop cluster. It has the equivalent memory of more than 49,000 iPhone 5s strung together and enough storage for 10 years' worth of Tweets. In other words, enough horsepower to handle any data set you can throw at it.

Bring it.

Analyze large datasets

Researchers and data scientists at universities or companies working with Big Data and MapReduce can analyze both structured and unstructured data in the Analytics Workbench's mixed-mode environment and validate at scale. Intel, NASA, and Booz Allen Hamilton have used it to scale-test new products, automate video analysis, and study weather patterns.
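As a rough illustration, here is a minimal sketch of the kind of MapReduce job a researcher might run and validate at scale on a Hadoop cluster: a Java word count over unstructured text in HDFS. It is a generic Hadoop example, not code from Pivotal; the class name and input/output paths are hypothetical.

// Hypothetical example: counts word occurrences across text files in HDFS.
import java.io.IOException;
import java.util.StringTokenizer;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class WordCount {

  // Map phase: emit (word, 1) for every token in the input split.
  public static class TokenizerMapper
      extends Mapper<Object, Text, Text, IntWritable> {
    private final static IntWritable one = new IntWritable(1);
    private final Text word = new Text();

    public void map(Object key, Text value, Context context)
        throws IOException, InterruptedException {
      StringTokenizer itr = new StringTokenizer(value.toString());
      while (itr.hasMoreTokens()) {
        word.set(itr.nextToken());
        context.write(word, one);
      }
    }
  }

  // Reduce phase: sum the counts emitted for each word.
  public static class IntSumReducer
      extends Reducer<Text, IntWritable, Text, IntWritable> {
    private final IntWritable result = new IntWritable();

    public void reduce(Text key, Iterable<IntWritable> values, Context context)
        throws IOException, InterruptedException {
      int sum = 0;
      for (IntWritable val : values) {
        sum += val.get();
      }
      result.set(sum);
      context.write(key, result);
    }
  }

  public static void main(String[] args) throws Exception {
    Configuration conf = new Configuration();
    Job job = Job.getInstance(conf, "word count");
    job.setJarByClass(WordCount.class);
    job.setMapperClass(TokenizerMapper.class);
    job.setCombinerClass(IntSumReducer.class);
    job.setReducerClass(IntSumReducer.class);
    job.setOutputKeyClass(Text.class);
    job.setOutputValueClass(IntWritable.class);
    FileInputFormat.addInputPath(job, new Path(args[0]));   // HDFS input directory
    FileOutputFormat.setOutputPath(job, new Path(args[1])); // HDFS output directory
    System.exit(job.waitForCompletion(true) ? 0 : 1);
  }
}

The same job can be packaged as a jar and submitted unchanged to a small test cluster or to a large environment, which is what makes validating at scale a packaging-and-submission exercise rather than a rewrite.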

It's free to use

Pivotal Analytics Workbench is available for use in 90-day segments, at no cost. It’s free. Really.

How it works »

No maintenance required

Pivotal provides the infrastructure and makes sure it keeps running. The Hadoop cluster is continually updated to support the latest software releases, so you'll always have access to cutting-edge technology. If you need specific software on the cluster for your project, we'll even help with that.

Learn more »

Projects on AWB

“Alpine Data Labs has been running performance tests against the cluster for several months now, and we've been able to establish benchmarks that prove the scalability of our solution and the flexibility of the GPHD platform. What's really cool is that we were able to deploy the Alpine application onto the cluster in just a few minutes, and then easily build analytics workflows that performed things like regressions, decision trees, clustering algorithms and visualizations on very large datasets. And so we proved once and for all that it's not only possible to run sophisticated machine-learning algorithms on billions of rows and terabytes of data, but also that you can do it without writing a line of code.”

—Alpine Data Labs

“The technology set that the workbench provides has helped us immensely in our scalability testing. It has allowed us to ensure our Identity Resolution product is high-performing and scales to meet the needs of the ever-growing volume of identity data.”

—Informatica

“We were highly impressed by the level of expertise of the EMC/Pivotal Analytics Workbench team that facilitated the execution of our 1000-node benchmarks of Apache Accumulo. It was apparent from working with them that they have significant experience managing large-scale cluster environments. The willingness of the team to work with us closely to ensure our project's success was outstanding.”

—Accumulo

See projects in Pivotal AWB Gallery »

Access the raw power of Analytics Workbench.