What circumstances led to the rise of the big data ecosystem?
There are many factors that contributed to the emergence of today's big data ecosystem, but there's a general consensus that big data came about because of a range of hardware and software designs that simply allowed big data to exist.
A conventional definition of big data is as follows: data sets that are sufficiently large and complex that they defy management by hand or by easy iterative methods. Big data sets are often identified as data sets that cannot fit on a single database server, because their analysis requires too much work from the systems handling the data.
With that in mind, a major part of what created big data is the principle we know as Moore's Law: the doubling of transistors on an integrated circuit roughly every two years, yielding ever smaller hardware and data storage devices, as well as more powerful microprocessors. In conjunction with Moore's Law, and largely because of it, the capacity of accessible software systems kept increasing, to the point where even personal computers could handle much larger amounts of data, and business and vanguard systems began to handle data sizes inconceivable only a few years before. Personal systems moved from kilobytes to megabytes, and then to gigabytes, in a process transparent to consumers. Vanguard systems moved from gigabytes to terabytes and petabytes, and on to orders of magnitude like zettabytes, in ways that were much less visible to the average citizen.
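The compounding effect of that doubling can be illustrated with a little arithmetic. The sketch below is a hypothetical illustration, not a precise model; the 1971 starting figure of 2,300 transistors (the Intel 4004) is a commonly cited reference point.

```python
# Illustrative sketch of Moore's Law: transistor counts doubling
# roughly every two years. The starting point (Intel 4004, 1971,
# ~2,300 transistors) is a commonly cited historical figure.
def projected_transistors(start_count, start_year, end_year, period=2):
    """Project a transistor count assuming one doubling every `period` years."""
    doublings = (end_year - start_year) // period
    return start_count * 2 ** doublings

# Two decades of doubling turns thousands into millions.
print(projected_transistors(2_300, 1971, 1991))  # 2,300 * 2**10 = 2,355,200
```

The same exponential curve is why storage jumped whole unit prefixes (kilobytes to megabytes to gigabytes) within a single consumer generation.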
Another advance accommodating big data was a change in how practitioners processed data sets. Rather than processing records linearly through a conventional relational database design, they began using tools like Apache Hadoop and related distributed processing frameworks to spread work across clusters of machines and eliminate bottlenecks in data pipelines.
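The core idea behind Hadoop-style processing is the map/shuffle/reduce pattern: split the data into chunks, process each chunk independently, then aggregate. The following is a minimal in-process sketch of that pattern (real Hadoop distributes these phases across many machines; the function names here are illustrative, not Hadoop APIs):

```python
# A minimal, single-process sketch of the map/shuffle/reduce pattern
# that Hadoop popularized. In a real cluster, each phase runs in
# parallel across many nodes, removing the single-server bottleneck
# of linear processing.
from collections import defaultdict

def map_phase(chunk):
    # Emit (word, 1) pairs for every word in a chunk of lines.
    return [(word, 1) for line in chunk for word in line.split()]

def shuffle(mapped_pairs):
    # Group values by key, as the framework does between map and reduce.
    groups = defaultdict(list)
    for key, value in mapped_pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups):
    # Aggregate each key's values independently -- each key could be
    # handled by a different reducer node.
    return {key: sum(values) for key, values in groups.items()}

# Two "chunks" standing in for data blocks on separate machines.
chunks = [["big data is big"], ["data is everywhere"]]
mapped = [pair for chunk in chunks for pair in map_phase(chunk)]
counts = reduce_phase(shuffle(mapped))
print(counts)  # {'big': 2, 'data': 2, 'is': 2, 'everywhere': 1}
```

Because each chunk is mapped independently and each key is reduced independently, adding machines scales the work almost linearly, which is what lets such frameworks handle data sets no single server could.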
The result is the big data world that we live in, where massive data sets are stored and maintained in data centers, and increasingly accessed by a wide range of technologies for a wide range of uses. From commerce to ecology, from public planning to medicine, big data is becoming more and more accessible. Meanwhile, government agencies and other larger organizations are still pushing the boundaries of big data sizes and implementing even more advanced solutions.
Written by Techopedia Staff
