A+E Global Media saved hundreds of hours of manual processing when predicting viewership by using Domo’s automated dataflow engine.
Starting a new TV show may not be the riskiest investment you make this year. Sure, you might fall in love with a production that gets canceled after just one season. But for the networks investing millions in production, the stakes are much higher.
Decisions about renewing shows have traditionally relied on viewer numbers, measured through Nielsen ratings. But the rise of streaming services and social media has complicated things. Now, there are a variety of data sources measuring viewership, each with its own approach.
“Because of streaming, there are several new data sources that have come into the fold,” says John Worth, a senior analyst for advanced data solutions with A+E Global Media. “And not only are they different data sets, but the way they are measuring data is vastly different from one another.”
Strategies for data preparation efficiency in a changing landscape
A+E, like many companies affected by digital disruption, found itself with a mountain of new data to collect and clean up for analysis, diverting resources from delivering strategic insights to data maintenance. Rather than accept chaotic data preparation as a cost of doing business, A+E turned to tools that offer a mix of automation, reusability, and governance.
If you’re struggling to keep your data clean, these strategies for data preparation efficiency can help, no matter what industry you’re in. With the right approach, you can reclaim valuable time.
1. Get out of Excel
Fifteen minutes may not seem like a lot of time. Just a fraction of an hour was all it took to manually put together A+E’s daily competitive ranker, which listed the top 50 telecasts in the 25-to-64 demographic, pulled from Nielsen’s ratings. In a highlighted Excel chart, staff across the company could easily see where programming from the History Channel or Lifetime landed for the day.
Of course, 15 minutes a day amounts to more than seven hours a month. And as you jump from a single data source to more than 40 different sources, those 15 minutes a day in Excel quickly compound into an unsustainable time commitment.
It’s a conundrum that many companies like A+E face and one that John Le, founder of Dashboard Dudes, says he has a clear answer for: Stop using Excel for everything.
“There are much better analytical tools out there to clean up data or convert something from one format to another,” Le says. “But people don't know how to use those tools and they're not willing to invest the time to learn how to use them.”
While Excel gurus may be able to quickly join a couple of data sets and launch into analysis on a piecemeal basis, Le cautions that they could be sacrificing scalability for speed. Tools that let you automate repetitive data processes may take some initial work to configure, but they ultimately save employees time when preparing and distributing regular reports.
Companies should fight the urge to stick with the familiar and start shopping around for solutions that meet the moment. In the long run, it’s time, money, and a lot of headaches saved.
2. Spot the glaring variations
If you’re like the A+E team staring down a mountain of data from a plethora of new sources, it can be daunting to make sense of data sets that capture key performance indicators using different methods and metrics. This is where data prep work comes in.
Le likens it to cooking prep. “A lot of people just take raw ingredients and try to put them on a plate right away,” he explains. “They don't focus so much on the prep work, or they have to prep in the middle where they're cleaning and cutting, but things aren't consistent. You end up getting a plate that has carrots with the stem on them or onions that are different sizes.”
The same principles apply to working with data. You need to spend time standardizing your data sets by ensuring that all of your naming conventions match or your values are being calculated the same way. For example, slight variations in how the title of an A+E show is captured in data sets can lead to flawed analysis that recognizes the same show as multiple distinct titles. Analysts need to decide from the outset whether the title will be captured as “Oak Island” or “The Oak Island.”
“It’s a simple but very infuriating problem when you’re dealing with millions and millions of rows of data,” Worth says.
Normalizing data, he says, is a combination of creativity, art, and science, which requires teams to provide clear definitions and formulas. Once those definitions are established, you’ll have to think about how to apply them across your data.
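As a minimal sketch of what applying those definitions can look like in practice, the snippet below maps known title variants to one canonical form before analysis. The lookup table, column name, and titles here are illustrative assumptions, not A+E’s actual data model.

```python
import pandas as pd

# Hypothetical lookup of known variants -> agreed canonical title
CANONICAL_TITLES = {
    "oak island": "Oak Island",
    "the oak island": "Oak Island",
}

def normalize_title(raw: str) -> str:
    """Resolve a raw title string to its canonical form, if known."""
    key = raw.strip().lower()
    return CANONICAL_TITLES.get(key, raw.strip())

df = pd.DataFrame({"title": ["Oak Island", "The Oak Island", " oak island "]})
df["title"] = df["title"].map(normalize_title)
print(df["title"].nunique())  # 1 -- all variants now count as one show
```

The key design choice is keeping the mapping in one shared table rather than scattering fixes across individual reports, so a newly discovered variant only has to be added once.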
Le adds that there are some simple things to watch out for (a code sketch automating these checks follows the list):
- Abbreviations: Spell out full names (“ACCT” should become “ACCOUNT”).
- Casing: Make it easy for yourself by using consistent capitalization across the board.
- Formatting of dates: Choose one numeric date format and stick with it.
- Names of locations: Again, stay consistent with your data entries (USA vs US vs United States).
- Extra spaces: Trim down your column titles to ensure there are no extra spaces.
- Extra symbols: Watch out for symbols like dollar signs.
- Words in numeric columns: Inputting “N/A” as a value could break your data.
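Most of these checks can be automated once instead of being fixed by hand in every file. Here is a minimal pandas sketch covering the list above; the replacement map and the “date” and “revenue” column names are assumptions for the example:

```python
import pandas as pd

# Illustrative abbreviation/location map -- extend for your own data
REPLACEMENTS = {"ACCT": "ACCOUNT", "US": "USA", "UNITED STATES": "USA"}

def clean_frame(df: pd.DataFrame) -> pd.DataFrame:
    df = df.copy()
    df.columns = df.columns.str.strip()              # extra spaces in headers
    for col in df.select_dtypes("object"):
        df[col] = (df[col].str.strip()               # extra spaces in values
                          .str.upper()               # consistent casing
                          .replace(REPLACEMENTS))    # abbreviations, locations
    # One numeric date format throughout; bad entries become NaT, not crashes
    df["date"] = pd.to_datetime(df["date"], format="%Y-%m-%d", errors="coerce")
    # Strip symbols like "$" and turn words such as "N/A" into real nulls
    df["revenue"] = pd.to_numeric(
        df["revenue"].astype(str).str.replace(r"[$,]", "", regex=True),
        errors="coerce",
    )
    return df
```

Using `errors="coerce"` deliberately converts problem values into explicit nulls you can count and alert on, instead of letting one stray “N/A” break a numeric column.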
3. Establish good governance
Normalizing a metric like “impressions” was an uphill battle for A+E. Not only did each data source calculate its metrics differently, but frequently, the individual companies would change their calculation methods. These changes often disrupted A+E’s formulas and data.
“We had a major disruption with Snapchat’s API about a year ago,” Worth explains. “They changed a few of the ways that their primary metrics were being named, while limiting or increasing certain metrics. We basically had to do an entire overhaul of the pipeline, restructure the data set—it took probably about three to six months.”
This challenge underscores the importance of good data governance. You need to understand how your data is coming in and who has access to it.
“I had a vendor who changed the word ‘impression’ to ‘impressions’,” Le says. “Why did this change? We discovered they were manually processing the file in Excel. You’ve got to stop doing that because it’s little mistakes like these that can lead to major problems.”
The ability to standardize entry, assign ownership, track edits, and trace the lineage of data is essential for reliable data analysis. You’ll need to make sure that the right users have the appropriate level of access to your data; not everyone needs to be in the weeds. And consider whether tools that streamline data intake might create consistency across your data products and build trust in the accuracy of your data.
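One lightweight governance guardrail is a schema check at ingestion, so a rename like “impression” becoming “impressions” fails loudly instead of silently producing empty columns downstream. A minimal sketch, assuming a fixed column contract (the column names and file are illustrative):

```python
import pandas as pd

# Expected contract for a hypothetical vendor feed; names are illustrative
EXPECTED_COLUMNS = {"impression", "reach", "video_views"}

def check_schema(df: pd.DataFrame, source: str) -> None:
    """Raise immediately if a vendor renamed or dropped a column."""
    incoming = set(df.columns)
    missing = EXPECTED_COLUMNS - incoming
    if missing:
        unexpected = incoming - EXPECTED_COLUMNS
        raise ValueError(
            f"{source}: missing columns {sorted(missing)}; "
            f"possible renames among {sorted(unexpected)}"
        )

# "vendor_feed.csv" is a stand-in for whatever file the pipeline ingests
check_schema(pd.read_csv("vendor_feed.csv"), source="vendor_feed")
```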
4. Automate your adjustments
So how does A+E actually apply its naming conventions to get consistency across the data for its “Oak Island” programming?
“It’s still a bit of a bottleneck in terms of manual labor because there are so many variations,” admits Worth. “And then once you account for all of the observed variations…you learn of a new one and it breaks the data set.”
However, the company sees potential in using AI not only to capture abnormalities but also to help predict new ones. Le agrees that this is the right way to think about integrating AI and automation into your data prep.
“When I get these data files from people, I don’t manually do anything anymore,” he explains. “I set it all up in Domo to do the job for me and make sure everything runs smoothly.”
That includes creating instant alerts if data entries don’t make sense, like a negative number in a sales column. He can also automatically convert variations to the appropriate naming convention to ensure that an input of “United States” reads as “USA.” And, he can use AI to quickly understand more about the data sets he is working with. Using a simple prompt, he can quickly learn how a date is formatted and ask for a reformat in a new CSV file or ask to remove entries before a specific date.
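Here is a sketch of what that kind of automated alert might look like, with hypothetical column names and a file path chosen purely for illustration:

```python
import pandas as pd

def sanity_alerts(df: pd.DataFrame) -> list[str]:
    """Flag entries that don't make sense before they reach analysis."""
    alerts = []
    negative = df.index[df["sales"] < 0].tolist()
    if negative:
        alerts.append(f"Negative values in sales column, rows: {negative}")
    future = df.index[df["air_date"] > pd.Timestamp.today()].tolist()
    if future:
        alerts.append(f"Air dates in the future, rows: {future}")
    return alerts

# Run as part of the automated pipeline; route any hits to email or chat
# ("daily_ranker.csv" is an assumed example file, not a real A+E asset)
for msg in sanity_alerts(pd.read_csv("daily_ranker.csv", parse_dates=["air_date"])):
    print("ALERT:", msg)
```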
“AI is very good at looking at data and doing some of the grunt work and more repetitive tasks,” Le says.
So if you’re looking to speed up your cleaning process, invest in learning to make the most of the automation available in your tools. And learn how to write clear, specific prompts for AI to provide useful results.
Investing in a smarter way to prep data
Now that you’ve recognized the need for better tools that can collect and clean up your data without all of the manual hassle, how do you actually go about deciding on which tools to use?
“When I’m looking at a new tool, I’m thinking: How easy is it for me to do something without having to write any code?” Le says. “I know how to do it, but I know another user is not going to want to. So it really is about the user experience and how easy it is for them to adopt it.”
If you’re searching for the tool with the best user experience, it’s important to ask the right questions before committing to a purchase.
Here are ten questions to ask in any data prep sales demo:
- How easy is it to spot inconsistencies, incorrect values, or missing values in your data sets?
- Are there methods to standardize definitions and naming conventions for your data and to apply those rules across multiple data sets?
- What sources can your tool easily connect to for centralized data intake?
- How much manual cleanup is necessary when ingesting and combining data sets?
- What features are available to automate repetitive data prep tasks?
- How easy is it to visualize the steps in the data preparation process and to make adjustments as needed?
- How can you manage access to data sets to ensure team members have the right information?
- Can users easily filter through data without affecting the underlying data set?
- How does the tool respond when data inputs inadvertently change and a data set breaks?
- What level of training is needed for non-data experts to use the tool?
See how A+E improved its data preparation efficiency
If you’re feeling the pressure of too much data and too little time, you’re not the only one. Data professionals in all industries are stretched thin.
In their breakout session at Domopalooza 2025, A+E shared how they’ve tackled data overload with automation, good governance, and the right tools. You’ll hear firsthand how they reclaimed valuable hours and supported a fast-moving, multiplatform content business without getting buried in spreadsheets.
Watch the full session replay to see what worked and how you can apply the same principles in your own data prep.
Author

Joseph Rendeiro is a freelance writer with an extensive background covering topics related to business administration, entrepreneurship, teamwork, and psychology. He has spent the past eight years creating content highlighting faculty fieldwork and research at accredited higher education institutions.