When Progressive Insurance decided to move its 15 billion miles (80 round trips to the Sun) of Snapshot driver data into a Big Data solution, one of the key questions was “How do we ensure it all copies over accurately?” This was no “copy this table over to that table” kind of project. It involved adopting a brand-new architecture, compacting 20 tables into 7, recomputing and adding several fields, ensuring the accuracy of all the data for the analysts, and building tests that could be run quickly and repeatedly. Manual testing would be impossible at such a scale, so an automated testing solution was developed. In this session, attendees will learn how this daunting task was approached, why certain testing decisions were made, the pitfalls and successes of the testing effort, and how to mitigate testing risks in their own future Big Data projects.
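The kind of automated reconciliation check the abstract describes can be sketched in miniature: compare a legacy (source) table against a migrated (target) table on row counts, key coverage, and a recomputed field. The table shapes, field names (`trip_id`, `miles`, `hours`, `avg_speed`), and the recomputation rule below are illustrative assumptions, not Progressive's actual schema or test suite.

```python
# Hypothetical sketch of a migration reconciliation test.
# All names and the derived-field rule are illustrative, not the actual schema.

def reconcile(source_rows, target_rows, key, recompute, field):
    """Return a list of discrepancy messages; empty means the tables reconcile."""
    failures = []
    target_by_key = {r[key]: r for r in target_rows}
    # Row-count parity: nothing dropped or duplicated during the copy.
    if len(source_rows) != len(target_rows):
        failures.append(f"row count mismatch: {len(source_rows)} vs {len(target_rows)}")
    for src in source_rows:
        tgt = target_by_key.get(src[key])
        if tgt is None:
            failures.append(f"missing key {src[key]} in target")
            continue
        # Field-level check: the value recomputed during migration must match
        # the same recomputation applied to the source record.
        expected = recompute(src)
        if tgt[field] != expected:
            failures.append(f"key {src[key]}: expected {expected}, got {tgt[field]}")
    return failures

# Example: a derived average-speed field added during migration.
source = [{"trip_id": 1, "miles": 30.0, "hours": 0.5},
          {"trip_id": 2, "miles": 12.0, "hours": 0.4}]
target = [{"trip_id": 1, "avg_speed": 60.0},
          {"trip_id": 2, "avg_speed": 30.0}]

errors = reconcile(source, target, key="trip_id",
                   recompute=lambda r: r["miles"] / r["hours"],
                   field="avg_speed")
print(errors)  # → [] when the migration reconciles cleanly
```

Because the check is a pure function over row sets, it can be rerun quickly and repeatedly against each table after every migration pass, which is the property the abstract emphasizes.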