This job is the first time I have worked with a large amount of data and predictive methods to estimate the effectiveness of future marketing efforts. Below are a few challenges that I think data analysts may face:
Documentation and Universal Truth
The data and analyses are only helpful when they are accepted by everyone, or at least nearly everyone. Different departments need to be on the same page about definitions and calculation methods. Finance can’t have a different understanding of profit and revenue than Marketing does.
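One lightweight way to keep departments on the same page is to compute key metrics in a single shared module that every team uses. A minimal sketch in Python (the function names and fields here are hypothetical, for illustration only):

```python
# A single source of truth for metric definitions, so Finance and
# Marketing compute "revenue" and "profit" the same way.
# (Field names are hypothetical, for illustration only.)

def revenue(orders):
    """Gross revenue: sum of unit price times quantity, before refunds."""
    return sum(o["unit_price"] * o["quantity"] for o in orders)

def profit(orders, costs):
    """Profit: revenue minus all costs (e.g. COGS plus marketing spend)."""
    return revenue(orders) - sum(costs)

orders = [
    {"unit_price": 20.0, "quantity": 3},
    {"unit_price": 5.0, "quantity": 10},
]
print(revenue(orders))                # 110.0
print(profit(orders, [40.0, 25.0]))   # 45.0
```

If both teams import these functions instead of re-deriving the numbers in their own spreadsheets, disagreements become code reviews rather than meetings.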
Also, the more data there is, the more important extensive and meticulous documentation becomes. Everybody in an organization needs to know what each schema means, what each table means, what each column means, what each value in a column means, and how the tables connect to one another. Without careful and detailed documentation and a shared understanding, an organization will run into a lot of waste and operational inefficiency.
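Documentation like this can even be kept machine-readable, so the definitions live next to the code that uses them and undocumented columns get flagged automatically. A minimal sketch (the schema, table, and column names are made up):

```python
# A tiny machine-readable data dictionary: every column gets a
# definition, and a check flags undocumented columns.
# (Schema, table, and column names are hypothetical.)

DATA_DICTIONARY = {
    "marketing.campaigns": {
        "campaign_id": "Unique identifier for a campaign.",
        "spend_usd": "Total spend in US dollars, including fees.",
        "start_date": "First day ads were live (UTC).",
    },
}

def undocumented_columns(table, actual_columns):
    """Return columns present in the table but missing a definition."""
    documented = DATA_DICTIONARY.get(table, {})
    return [c for c in actual_columns if c not in documented]

missing = undocumented_columns(
    "marketing.campaigns",
    ["campaign_id", "spend_usd", "start_date", "clicks"],
)
print(missing)  # ['clicks']
```

A check like this can run whenever a table changes, so documentation gaps surface immediately instead of months later.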
Equipment and Tools
A large amount of data requires a powerful machine to retrieve it from a data warehouse. Underpowered equipment, such as a dated computer with 4GB of RAM, means hours of lost time and inefficiency. The data warehouse itself also has to be reliable: an offline data warehouse renders data analysts almost useless.
After data is retrieved, the next step is to clean, process, and present it. Tools such as Tableau are awesome, but security concerns from compliance or IT can hinder adoption. Plus, applications such as Tableau are expensive. If only a few individuals in an organization have access, the usefulness will be limited and the investment will not be as fruitful as it could be.
Hence, Excel is still a popular choice among organizations for data analysis. However, when it comes to processing a large dataset or using pivot tables with a lot of slices and filters, Excel is notorious for crashing more often than you can stomach. Furthermore, Excel isn’t a great visualization tool in my opinion. Presenting data to management or leadership usually demands sleek visuals to aid understanding, and easy preparation of those visuals to save time. Unfortunately, Excel isn’t strong at either.
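For what it’s worth, the kind of pivot that makes Excel groan is straightforward in code, which is one reason many analysts eventually script these summaries. A toy sketch in Python using only the standard library (the dataset and groupings are made up):

```python
# A tiny "pivot table" built with the standard library: total spend
# grouped by region (rows) and channel (columns).
# (The records below are made up for illustration.)
from collections import defaultdict

records = [
    {"region": "East", "channel": "email", "spend": 100},
    {"region": "East", "channel": "search", "spend": 250},
    {"region": "West", "channel": "email", "spend": 80},
    {"region": "East", "channel": "email", "spend": 50},
]

pivot = defaultdict(lambda: defaultdict(int))
for r in records:
    pivot[r["region"]][r["channel"]] += r["spend"]

print(dict(pivot["East"]))  # {'email': 150, 'search': 250}
```

Unlike a spreadsheet, a script like this handles millions of rows without the application falling over, and the grouping logic is visible and reviewable.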
Connecting data to an outsourced application
Not every application that is useful to your job is internal. Sometimes an external application is necessary. For instance, you may want a predictive analytics tool that can’t be built in-house. For the tool to work, you need to feed it real data as regularly as possible, since predictions often stem from historical data. However, getting data outside an organization’s walls can be a daunting task because of compliance and security concerns. Plus, ensuring that the data sent to the external application is correct and regularly updated is a time-consuming challenge. Data needs to be verified carefully, and files have to be sent on time and in the format the vendor requests. Automating the data feed is ideal, but it would involve some programming and collaboration with IT and compliance.
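The manual steps described above — verify the data, format it the vendor’s way, send it on schedule — are exactly what a small script can automate. A minimal sketch, assuming a CSV hand-off; the column names, required fields, and file naming are hypothetical examples:

```python
# Sketch of an automated data feed: validate rows, then write them
# in a vendor-requested CSV layout. (Column names, required fields,
# and the file path are hypothetical examples.)
import csv
import datetime

REQUIRED_FIELDS = ["customer_id", "purchase_date", "amount"]

def validate(rows):
    """Split rows into clean and rejected: a clean row has every
    required field filled in and a positive amount."""
    clean, rejected = [], []
    for row in rows:
        if all(row.get(f) not in (None, "") for f in REQUIRED_FIELDS) \
                and float(row["amount"]) > 0:
            clean.append(row)
        else:
            rejected.append(row)
    return clean, rejected

def write_feed(rows, path):
    """Write validated rows in the vendor's expected column order."""
    with open(path, "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=REQUIRED_FIELDS)
        writer.writeheader()
        writer.writerows(rows)

rows = [
    {"customer_id": "C1", "purchase_date": "2024-01-05", "amount": "42.50"},
    {"customer_id": "C2", "purchase_date": "", "amount": "10.00"},
]
clean, rejected = validate(rows)
write_feed(clean, "feed_%s.csv" % datetime.date.today().isoformat())
print(len(clean), len(rejected))  # 1 1
```

Rejected rows can be logged for review instead of silently shipped, and a scheduler (cron, for instance) can run the script daily — though, as noted, wiring this up in practice still takes IT and compliance sign-off.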
Working at my job so far has been an eye-opener, especially in terms of working with huge datasets. A high schooler shadowed me on the job a few days ago, and I explained to her what my job entails and what we do every day. I hope this post shed some light on the challenges data analysts face. There are likely more, but I think these three are among the most common and the biggest.