Organizations today are often said to generate as much digital information, or "big data," in a single day as the entire internet held in the year 2000. Big data is commonly defined as "extremely large data sets that may be analyzed computationally to reveal patterns, trends, and associations, especially relating to human behavior and interactions." Want to learn more about this impressive, real-life information overload? Here are five specific examples of big data and its use today.
Guatemalan Human Rights Analysis
Between the 1960s and 1990s, many government and human rights groups believed that genocide and other crimes against humanity were taking place in Guatemala. After decades of strife ended, the country's new, peace-seeking government discovered a treasure trove of millions of hidden documents and records relating to activities that took place during those years.
Benetech, a non-profit technology organization, was then called in to draw conclusions from this massive trove of millions of inter-related and unrelated puzzle pieces. The group successfully combed the massive data sets and determined that genocide had in fact occurred in the country. Today, Benetech continues a number of advocacy programs, such as its secure documentation tool Martus, global education initiatives, and a few others.
Bill James and Sabermetrics
In the 1970s, a gentleman by the name of Bill James began researching and experimenting with a new system of metrics and mathematical data analysis that could answer specific statistical questions about baseball players' records. James was an avid baseball fan but, more importantly, a brilliant statistician. He refined his system, sabermetrics, which is still used in statistical analysis today. As a result of his successes, James was hired by the Boston Red Sox and will always be respected for his contributions to big-data analysis.
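To make "a system of metrics" concrete, here is a minimal sketch of one widely cited sabermetric, on-base percentage (OBP). The formula is the standard OBP definition; the function name and sample numbers are hypothetical illustrations, not taken from the article or from James's own work.

```python
def on_base_percentage(hits, walks, hit_by_pitch, at_bats, sacrifice_flies):
    """Standard OBP formula: (H + BB + HBP) / (AB + BB + HBP + SF)."""
    times_on_base = hits + walks + hit_by_pitch
    opportunities = at_bats + walks + hit_by_pitch + sacrifice_flies
    return times_on_base / opportunities

# Hypothetical season line: 180 hits, 60 walks, 5 hit-by-pitch,
# 550 at-bats, 5 sacrifice flies.
obp = on_base_percentage(180, 60, 5, 550, 5)
print(round(obp, 3))  # 0.395
```

Metrics like this matter to sabermetrics because they weigh every way a batter reaches base, rather than counting hits alone as traditional batting average does.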
Deepwater Horizon Analysis
The Deepwater Horizon oil spill was an ecological disaster for the ages. While the underwater leak was active, oil poured into the sea at never-before-seen rates. As such, part of the remedy after the leak was sealed was to accurately gauge the actual amount of oil leaked as well as the expected ecological effects. To do this, astoundingly massive amounts of data from many different industries had to be compiled and analyzed. In the end, the effort succeeded, and the National Institute of Standards and Technology, the group responsible for the mass calculations, was credited with an impressive, historic use of bulk data.
Development of CAD Testing
Coronary artery disease, or CAD, is a condition in which the arteries of the heart become degraded, dramatically increasing the risk of heart attack. In times past, there was no non-invasive way to test an individual's CAD risk. This changed, though, with the analysis of massive amounts of data by the company CardioDx. Using vast data sets drawn from across the scientific and medical worlds, the company was eventually able to create a life-saving CAD test that is regularly used today.
Drew Conway’s Prediction System
Several years ago, now-famed NYU Ph.D. student Drew Conway sought to predict patterns in hostilities and troop movements in wartime Afghanistan. He was ultimately successful, but not without many terabytes' worth of data, much of it gleaned from WikiLeaks. Using information such as previous activities, geographic data, weather, and more, his work was able to predict many important activities within the war-torn country.
Massive data sets can be cumbersome to store and sort through. However, those who wield such information, if they can translate its meaning, can answer nearly any related question they may be searching for. These are the basics of big data today, along with five great examples of its use.