Big data refers to massive volumes of data that companies accumulate in and beyond their relational databases. Big data are usually far larger than gigabytes of data and may be of the magnitude of petabytes (1,024 terabytes) or exabytes (1,024 petabytes). Data of this magnitude may consist of billions to trillions of records of people on social media sites and the Web, or of millions of customers in the databases of financial and credit card companies. Because of this massive volume, and because the data are dynamic and constantly changing, it is difficult to process them using traditional methods and software. A great deal of research effort is now devoted to developing tools to process big data; these tools and analysis methods are designed to handle very large amounts of data. Analysis of big data is becoming critical for improving operations and making intelligent and timely decisions.

With conventional methods, it is usually difficult to handle, manipulate, and manage big data; standard data analysis tools and methods lack the capability to process data at this scale. Current research shows explosive growth in tools, technology, and software capable of handling and analyzing big data, and these tools are becoming an essential part of business analytics.

Big data consist of massive volumes of both structured and unstructured data. Structured data are records with fixed fields, such as those found in relational databases and spreadsheets. Storing these data requires defining the fields to be stored and the type of each field, such as numeric, alphanumeric, alphabetic, currency, or name. Examples of unstructured data are graphic images, pictures, videos, web pages, PDF files, PowerPoint slides, e-mails, blogs, Word documents, and so on.
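As a minimal sketch of what defining the fields looks like, the table definition below declares a fixed name and a data type for each field. The table and column names (a hypothetical customers table) are purely illustrative and not drawn from any particular system.

    -- Hypothetical table definition: each field has a fixed name and a
    -- declared type, which is what makes the data structured.
    CREATE TABLE customers (
        customer_id  INTEGER PRIMARY KEY,  -- numeric identifier
        full_name    VARCHAR(100),         -- name stored as text
        email        VARCHAR(255),         -- alphanumeric text
        balance      DECIMAL(12, 2),       -- currency amount
        signup_date  DATE                  -- calendar date
    );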

Structured data are usually managed and manipulated using SQL (Structured Query Language), a language for defining, managing, and querying data in relational databases.
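For example, a simple query against the hypothetical customers table sketched above could retrieve and sort a subset of records; the column names and the balance threshold are illustrative only.

    -- Retrieve high-balance customers who signed up in 2023,
    -- listed with the highest balance first.
    SELECT full_name, balance
    FROM customers
    WHERE balance > 10000
      AND signup_date >= '2023-01-01'
    ORDER BY balance DESC;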
