According to the case study by Tallon et al., three factors underlie the rise of big data. The first, as the authors suggest, is easy access to fast, reliable, and cost-effective storage, which allows organizations and individuals to keep very large amounts of data. This, as argued in the case, has led people to retain large volumes of data on their devices, including information they do not need. Such wasteful use of storage space has been encouraged primarily by the belief that storage is free. In the long run, hoarding unimportant data harms organizations, because storage is always limited.
The second reason for the rise of big data is the hype that people and organizations attach to data analytics. This hype has led workers to retain large amounts of data for long periods, a consequence their companies never intended. The third and final reason presented in the case is regulation requiring that data be kept for a specified period to support decision making. This requirement is reinforced by case law affirming that electronic data must be retained for a particular period. These regulations therefore lead companies to store more information than they would otherwise expect to keep.
When global organizations store large amounts of data, they face several risks. One is privacy, as storing large amounts of data for long periods makes it vulnerable to leaks (Raguseo 187). Security is another key risk that comes with storing large volumes of data; global organizations such as eBay and JPMorgan Chase have lost millions of dollars after their databases were hacked. Global organizations also face the risk of storing erroneous data, which leads to flawed analytics and poor decision making. Storing sizeable amounts of unnecessary data likewise degrades analytics, since irrelevant information invites misinterpretation. Collecting, storing, retrieving, and analyzing data all cost money, so storing large volumes of data exposes organizations to cost risks. According to the article, global organizations also face risks around e-discovery, business continuity, compliance, and keeping intellectual property safe.
Initially, Intel, like many other organizations, focused on locking down access to data for fear of compromise. This period of data governance is referred to as the protect era of information governance; although Intel's attention to information governance can be traced back to 1992, the case under study places the protect era between 2003 and 2009 (Tallon et al. 191). The period from 2009 onwards has been named the protect-to-enable era of information governance. Intel's first data governance model focused mainly on preventing risk by confining access to data. With time, this strategy evolved to balance the need to protect data against the need to make it readily available for decision making.
Intel's strict protection policies were first prompted by the Sarbanes-Oxley legislation, which compelled organizations to protect transaction data. They were further catalyzed by the SQL Slammer worm, which affected Intel's remote network; the need to build a system that would guard against future attacks ushered in the protect era of information governance. Over time, however, this first approach came to be viewed as expensive, risky, and bad for the company's long-term growth (Tallon et al. 193). One key driver of Intel's transition to the protect-to-enable strategy was a change in how people viewed data management. By 2009, personal technology devices and data analytics had become commonplace, and this shift also acted as a critical driver for the change.
The emergence of data analytics motivated Intel to change its data governance policies in various ways. Since analytics required access to data held in other parts of the company, Intel was forced to abandon its strict data access policies and adopt a more flexible system. The rise of analytics towards the end of the decade pushed Intel to loosen its restrictions on data so it could benefit from the emerging field. Policies for managing data thus shifted from asking who could best protect the data to asking who could best use it and derive the most value from it. According to the case, analytics has also helped Intel cut 25 percent off the time needed for certain tasks, allowing it to function more efficiently.
Intel raised awareness in its user community about the costs and value of its data in the following ways. First, the company closely monitored the costs, risks, and value of its data, and users knew this monitoring was happening, which led the community to appreciate the data's value and to want to use it cost-effectively. System audits were conducted regularly, and the results were analyzed and presented to users, making them more aware of the data's costs and value. The protect-to-enable philosophy developed by Malcolm Harkins was promoted to users, encouraging greater use of data but within set, quantifiable, and acceptable risk limits. The company has also worked to help users distinguish between useful and useless data.
Works Cited
Raguseo, Elisabetta. “Big Data Technologies: An Empirical Investigation on Their Adoption, Benefits and Risks for Companies.” International Journal of Information Management, vol. 38, no. 1, 2018, pp. 187-195.
Tallon, Paul P., James E. Short, and Malcolm W. Harkins. “The Evolution of Information Governance at Intel.” MIS Quarterly Executive, vol. 12, no. 4, 2013.