The worldwide digital revolution has ushered in the era of big data, and there's no turning back.
Big data is an umbrella term that covers the collection of vast quantities of complex information as well as the analysis of this data in an attempt to find relationships between different data sets.
A May 2014 White House report on both the promises and challenges of big data said that estimates of the amount of data generated and replicated in 2013 reached a staggering 4 zettabytes, up by 2.2 zettabytes from the 2011 estimate of 1.8 zettabytes.
What's a Zettabyte?
To help you understand the size of a zettabyte, the White House report offered the following examples: "War and Peace," the 1,250-page novel by Leo Tolstoy, would fit into a zettabyte 323 trillion times. Or, a zettabyte could accommodate the photo files that would be created if every American took a digital photo every second of every day for more than a month.
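A quick back-of-the-envelope calculation shows how the report's book figure holds up. The per-copy size below is an assumption (the report doesn't state one); a plain-text copy of "War and Peace" runs to roughly 3 MB:

```python
# Sanity check on the White House report's zettabyte illustration.
# Assumption (not from the report): ~3 MB per plain-text copy of the novel.
ZETTABYTE = 10**21        # bytes
BOOK_BYTES = 3 * 10**6    # ~3 MB, assumed

copies = ZETTABYTE // BOOK_BYTES
print(f"{copies:.2e} copies fit in one zettabyte")
```

The result lands in the low hundreds of trillions, consistent with the report's "323 trillion times" figure.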
The sheer magnitude of big data is difficult to comprehend.
Moreover, so vast are these collections of data that they exceed the ability of traditional analytical methods to effectively process them.
As a result, "big data uses various algorithms and techniques to infer general trends over the entire set," according to the Electronic Privacy Information Center.
New Way to Analyze Data
This represents a new, more generalized approach to data analysis, as opposed to the traditional methods that sought out exact matches between individual pieces of data, according to EPIC.
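The contrast EPIC draws can be sketched in a few lines. The records and the least-squares fit below are purely illustrative: an exact-match query returns nothing for a value not in the data, while a trend fitted over the entire set still yields an estimate:

```python
# Illustrative only: exact-match lookup vs. inferring a trend over the whole set.
records = [(25, 120), (31, 150), (40, 210), (52, 260), (60, 300)]  # (age, spend)

# Traditional exact match: no record for age 45, so the query comes back empty.
exact = [spend for age, spend in records if age == 45]

# Trend inference: ordinary least-squares line spend ~ slope*age + intercept.
n = len(records)
sx = sum(a for a, _ in records)
sy = sum(s for _, s in records)
sxx = sum(a * a for a, _ in records)
sxy = sum(a * s for a, s in records)
slope = (n * sxy - sx * sy) / (n * sxx - sx * sx)
intercept = (sy - slope * sx) / n

print(exact)                          # [] -- exact match finds nothing
print(round(slope * 45 + intercept))  # trend-based estimate for age 45
```

The point is not the regression itself but the shift in question: from "is this exact record present?" to "what does the whole collection suggest?"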
The White House report cited multiple examples of benefits that already have been realized as a result of big data:
- Sensor-equipped delivery trucks and jet engines monitor hundreds of data points and can send alerts to operators to indicate when maintenance is needed.
- The Centers for Medicare and Medicaid Services has begun using predictive analytics software that can spot likely instances of reimbursement fraud before the claims are paid.
- Analysis of millions of samples from monitors in neonatal intensive care units helped develop an early warning system to identify infants at high risk of developing life-threatening infections.
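The core idea behind pre-payment fraud screening can be sketched very simply. The actual CMS system is far more sophisticated and its methods are not public; the snippet below is a minimal, hypothetical illustration that flags claims whose billed amount is a statistical outlier before payment goes out:

```python
import statistics

# Minimal sketch of pre-payment anomaly screening (illustrative only;
# not the actual CMS methodology): flag claims whose billed amount sits
# far from the mean of the provider's claim history.
def flag_outliers(amounts, threshold=3.0):
    """Return indices of claims more than `threshold` std devs from the mean."""
    mean = statistics.mean(amounts)
    stdev = statistics.pstdev(amounts)
    if stdev == 0:
        return []
    return [i for i, a in enumerate(amounts) if abs(a - mean) / stdev > threshold]

claims = [110, 95, 120, 105, 100, 980, 115]   # one anomalous claim
print(flag_outliers(claims, threshold=2.0))    # [5] -- the suspicious claim
```

Because the check runs over the full claim history rather than matching individual records against known fraud cases, it reflects the trend-based style of analysis described above.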
However, as data analysis shifts to new systems capable of dealing with big data's immense volume, many of the traditional methods of protecting individual privacy may no longer be adequate to these new challenges.
And as the author of "Privacy Risks in Big Data: What Are the Legal Aspects?" points out, the U.S. Constitution offers no blanket protection for individual privacy, contrary to popular belief.
The White House big data report notes that concerns over the privacy risks associated with the collection and processing of "small data" have been "effectively addressed . . . through the Fair Information Practice Principles, sector-specific laws, robust enforcement, and global privacy assurance mechanisms."
How to Manage Big Data?
So quickly has the transition to big data taken place that technologists, privacy scholars, and policymakers have yet to agree on how -- or whether -- the risks of big data can be effectively managed under existing privacy frameworks.
The White House report says the search for a revamped or new privacy framework equal to the challenges posed by big data should take into consideration four key areas of concern:
- "How government can harness big data for the public good while guarding against unacceptable uses against citizens;
- "The extent to which big data alters the consumer landscape in ways that implicate core values;
- "How to protect citizens from new forms of discrimination that may be enabled by big data technologies; and
- "How big data affects the core tenet of modern privacy protection, the notice and consent framework that has been in wide use since the 1970s."
Don Amerman is a freelance author who writes extensively about a wide array of business and website topics.