In today’s world, more data is being generated than ever before, thanks to advances in technology that enable faster processing and transmission of data. Big data is simply data that is too large to fit in traditional data processing and storage systems. It arrives in large volumes, at high speed, and in both simple and complex structures (text, numbers, images, video and sound). As the famous saying goes, “data is the new oil”: many organizations are focusing on leveraging big data to extract valuable insights, create new products and counter risks in their businesses. And just as extracting oil requires specific tools and processes, so does processing data. Big data tools such as Hadoop and Spark enable skilled users to extract value from big data. In this post we will look at what big data is, its characteristics, its challenges and various use-cases.


Characteristics of Big Data

  1. Volume. Big data is too large to be stored or processed by traditional systems.
  2. Variety. Big data contains varied types of data, both structured and unstructured.
  3. Velocity. Big data is transmitted at high speed, and often needs to be processed in real time for quicker decision making.
  4. Veracity. The data is only useful if it is accurate and the insights drawn from it can be trusted.
  5. Value. Big data is only worth collecting if the value it yields outweighs the cost of storing and processing it.

Challenges of Big Data

  1. Limited Skilled Professionals. Demand for data engineers and data scientists outpaces supply, making it hard to hire and retain people who can work with big data.
  2. Securing Big Data. Large volumes of often sensitive data make an attractive target, and securing distributed storage and processing systems is harder than securing a single database.
  3. Big Data Technologies and Tools Selection. There are many big data tools with different capabilities. This makes it difficult to select the right tool, especially when the business case is not well defined.
  4. Continuous Growth of Data. The more data there is, the harder it becomes to store, process and manage. This demands ever more scalable resources, which increases operational cost.
  5. Data Integration. Integrating big data from various systems is challenging because the data comes in varied, complex and incompatible formats.

Big Data Use-Cases

  1. Healthcare. The healthcare industry generates large amounts of data due to the digitization of healthcare services. The data is used for tasks such as predicting patients’ diseases and improving patient services.
  2. Government. Most governments have digitized their services and are generating big data across many sectors. They leverage this data in areas such as planning, budgeting and predicting national growth, among other use-cases.
  3. Insurance. Insurance firms generate and store large datasets about their clients. The data is used to process claims, detect fraud and predict churn, among other use-cases.
  4. Telecommunication. The telecom industry is one of the biggest generators of big data. Data about users, calls, texts and other telco activities is stored and processed with big data technologies, and is used to study user behaviour, detect fraud and improve service delivery.
  5. Information Technology. Companies like Facebook, Google and Amazon generate tons of user data through their online services and products. This data is processed and used to improve user experience, recommend products and create new data products.
  6. … and many other use-cases.

Big Data Technologies

Technologies for handling and processing big data vary depending on the vendor and the organization. Below are a few tools and technologies used for storing and processing big data.

  1. Big Data Storage. Scalable and distributed storage systems, such as the Hadoop Distributed File System (HDFS), among others.
  2. Machine Learning. Machine learning is a branch of computer science that enables computer programs and systems to learn from data without being explicitly programmed. It is at the core of processing and extracting value from data, both big and small.
  3. Business Intelligence. Big data business intelligence tools deliver actionable insights from data to business users.
  4. Data Visualization. The easiest way to understand insights from data is through visualizations. Used together with business intelligence tools, they let businesses interpret big data quickly for faster decision making.
  5. Cloud Computing. The most scalable way of storing and processing big data is in the cloud, which provides a distributed and flexible environment for working with big data.
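To make the storage and processing item above more concrete, the model that underpins Hadoop is MapReduce: a "map" step runs independently on each chunk of data, and a "reduce" step merges the partial results. Below is a rough single-machine sketch of that idea in plain Python (the `map_phase`/`reduce_phase` helpers and the sample text are invented for illustration; a real Hadoop job would run across a cluster over data in HDFS).

```python
from collections import Counter
from functools import reduce

def map_phase(chunk):
    """Map step: count words in one chunk (runs independently per chunk)."""
    return Counter(chunk.lower().split())

def reduce_phase(counts_a, counts_b):
    """Reduce step: merge two partial word counts into one."""
    return counts_a + counts_b

# Two "chunks" standing in for blocks of a large file split across nodes.
chunks = [
    "big data needs scalable storage",
    "big data needs scalable processing",
]

partial_counts = [map_phase(c) for c in chunks]   # would run in parallel
total = reduce(reduce_phase, partial_counts)      # merge partial results

print(total["big"])        # -> 2 (each chunk contributes one occurrence)
print(total["storage"])    # -> 1
```

Because each map call only touches its own chunk, the work parallelizes naturally; this is the same reason the model scales to data too large for one machine.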
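The machine-learning item above can also be made concrete with a toy example of "learning from data without being explicitly programmed": fitting a straight line to past observations by least squares, then predicting a new value. This is a minimal standard-library sketch (the `fit_line` helper and the sample numbers are invented for illustration); real big-data pipelines would use distributed ML libraries instead.

```python
def fit_line(xs, ys):
    """Ordinary least-squares fit of y = slope * x + intercept."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
             / sum((x - mean_x) ** 2 for x in xs))
    intercept = mean_y - slope * mean_x
    return slope, intercept

# Past observations (e.g. ad spend -> sales); the "rule" y = 2x is
# learned from the data, not written into the program.
xs = [1, 2, 3, 4]
ys = [2, 4, 6, 8]
slope, intercept = fit_line(xs, ys)

prediction = slope * 5 + intercept   # predict for an unseen input x = 5
print(round(prediction, 2))          # -> 10.0
```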


Big data has huge potential to give an organization a competitive advantage. Many organizations leverage it to extract valuable insights, create new data products and prevent business risks, and when used carefully the value from big data far outweighs its cost. Challenges remain, including the limited pool of skilled professionals and the difficulty of integrating data from many sources. In the next post we will look at Apache Spark, a scalable big data processing engine. To learn about data augmentation, check our post here.

Introduction to Big Data
