According to Janssen, van der Voort, and Wahyudi (2017), three factors influence the utility of big data analytics in decision-making: the velocity, variety, and veracity of business data. When organizations collect big data from diverse sources, variation in the quality of that data, together with the fact that multiple entities within an organization process it, produces a long data chain. The velocity of business data means that an organization must constantly adjust how it collects data, reflecting the dynamic environments in which modern businesses operate. Business organizations grapple with constant change in their operating environment, and to operate effectively they must measure the impact of these changes. Measuring how environmental factors affect a business's ability to attain its goals requires frequent data collection at multiple points in time, which in turn keeps the nature of business data in flux.

The variety of big data reflects the way business data measures multiple aspects of an organization's performance indicators. For instance, sales data for various periods can yield insights into both macro-environmental and micro-environmental factors. If sales have declined over a given period, external environmental factors may not have been favorable to the organization. Declining sales can suggest, among other things, that an economic recession has lowered the purchasing power of an organization's customers or that cultural changes have transformed the tastes and preferences of a firm's customers. For organizations to leverage data effectively when making decisions, they must understand its variety.

The other factor influencing how organizations benefit from big data is its veracity, which concerns how the manipulation of data affects its validity in measuring various aspects of organizational performance. The entities that supply an organization's data have interests and motives that may bias how they measure the underlying variables. Therefore, before employing data for analytical decision-making, organizations must verify it and ensure its validity.
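The verification step described above can be made concrete with a small sketch. The following Python example checks supplier-provided records for internal consistency before they enter an analytical pipeline; the field names (`supplier`, `units_sold`, `unit_price`, `revenue`) are hypothetical, chosen only for illustration, and a real veracity check would cover many more rules.

```python
# A minimal veracity check run before data is used for analysis.
# All field names below are hypothetical illustrations.

def verify_record(record):
    """Return a list of validity problems found in one data record."""
    problems = []
    if record.get("units_sold", -1) < 0:
        problems.append("negative or missing units_sold")
    if record.get("revenue", -1.0) < 0:
        problems.append("negative or missing revenue")
    # Cross-check: reported revenue should agree with units * price,
    # guarding against figures manipulated by the supplying entity.
    units = record.get("units_sold")
    price = record.get("unit_price")
    revenue = record.get("revenue")
    if None not in (units, price, revenue) and abs(units * price - revenue) > 0.01:
        problems.append("revenue inconsistent with units_sold * unit_price")
    return problems

records = [
    {"supplier": "A", "units_sold": 10, "unit_price": 2.0, "revenue": 20.0},
    {"supplier": "B", "units_sold": 5, "unit_price": 3.0, "revenue": 99.0},
]
# Only internally consistent records pass through to decision-making.
valid = [r for r in records if not verify_record(r)]
```

Here supplier B's record is rejected because its reported revenue does not match its own unit figures, illustrating how cross-checks can surface biased or manipulated data.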

The argument of Janssen, van der Voort, and Wahyudi (2017) reflects Eckerson's (2012) account of how the new analytical ecosystem seeks to manage the potential impediments to the use of data analytics in decision-making. According to Eckerson (2012), the new business intelligence framework has two mutually reinforcing elements, and the coordination of their functions helps minimize the adverse effects of data velocity, variety, and veracity on the effectiveness of decision-making. The first element is top-down business intelligence, which allows casual users of business data to monitor and report various business trends. Organizational objectives and operating strategy guide the users on which trends to monitor, while a data-warehousing architecture facilitates the filing of reports. To keep the data from becoming volatile, predefined metrics serve as the benchmarks against which data users evaluate business trends. The outcome of the monitoring effort in the top-down element serves as the input into the second element – bottom-up analytical intelligence.
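The top-down monitoring step can be sketched as a simple comparison of observed values against predefined benchmarks. The metric names and thresholds below are hypothetical assumptions for illustration; Eckerson (2012) does not prescribe any particular metrics.

```python
# A minimal sketch of top-down monitoring: casual users compare observed
# values against predefined benchmark metrics. Names and thresholds here
# are hypothetical.

BENCHMARKS = {"monthly_sales": 100_000, "customer_retention": 0.85}

def monitor(observations):
    """Flag every predefined metric that falls below its benchmark.

    The flagged metrics form the report that the top-down element
    hands to the bottom-up analytical element for deeper analysis.
    """
    return {
        metric: {"observed": value, "benchmark": BENCHMARKS[metric]}
        for metric, value in observations.items()
        if metric in BENCHMARKS and value < BENCHMARKS[metric]
    }

report = monitor({"monthly_sales": 87_500, "customer_retention": 0.91})
```

Because the benchmarks are fixed in advance, different users monitoring the same trend evaluate it against the same yardstick, which is how predefined metrics keep the reported data from becoming volatile.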

In the second element, an organization's analytical staff uses advanced tools to analyze the reports from the casual users and predict important trends. The staff uses specific business processes and projects as the guiding framework for big data analytics. To ensure that data analysis yields objective results, ad hoc queries, rather than predefined metrics, help in the evaluation of important trends. The outcome of the bottom-up process for gathering analytical intelligence serves as the input for the top-down process for developing business intelligence. This simultaneous collection and analysis of business intelligence ensures that an organization has quality data that is neither volatile nor excessively varied, which enhances the utility of data in decision-making.
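The bottom-up step can likewise be sketched. In the example below, an analyst responds to a flagged sales decline with an ad hoc SQL query composed on the fly rather than drawn from a predefined metric; the table and column names, as well as the sample figures, are hypothetical.

```python
# A minimal sketch of the bottom-up element: an ad hoc query written to
# investigate a trend flagged by top-down monitoring. Table, column
# names, and figures are hypothetical.

import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, month TEXT, revenue REAL)")
conn.executemany(
    "INSERT INTO sales VALUES (?, ?, ?)",
    [("north", "2024-01", 40_000.0), ("north", "2024-02", 35_000.0),
     ("south", "2024-01", 60_000.0), ("south", "2024-02", 62_000.0)],
)

# Ad hoc question prompted by a flagged overall decline:
# which regions actually shrank month over month?
query = """
    SELECT region,
           SUM(CASE WHEN month = '2024-02' THEN revenue ELSE -revenue END)
             AS delta
    FROM sales
    GROUP BY region
"""
declining = [region for region, delta in conn.execute(query) if delta < 0]
```

The finding (here, that only the north region is shrinking) would then feed back into the top-down element, closing the loop the passage describes.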