
Understanding the day-to-day life of a data engineer solving FinCrime - A conversation with Tafida Balarabe

Tafida Balarabe has nearly seven years of fintech experience and works as a Senior Data Analyst (Engineer) at Revolut. Drawing on his journey from customer service to advanced data analysis, he shares insights into building data-driven cultures and navigating the intersection of technology, compliance, and business strategy.

You started in support and analytics, and now you’re a Senior Data Analyst (Engineer). How has moving through these roles changed the way you see data in business?


It may sound crazy, but starting in customer service was the best thing that could have happened to my career. Answering customer questions every day gives you an instinctive sense of what the business actually needs, versus what we on the technical side believe it needs.

My current approach to data strategy has been totally transformed by that foundation. I have seen first-hand the enormous gap between what the data indicates and what is actually happening on the ground, and how a single data point can make or break a customer's experience. I carried that customer-centric mindset with me when I transitioned to analytics and then engineering.

These days, whenever I build data systems or frameworks, I always ask myself: Will this actually help our customers? Will business stakeholders be able to capitalize on this data effectively? Clean, well-crafted data is important, but it isn't sufficient; you also need to make sure the data actually adds value to the business.

In your line of work, you deal with a lot of raw data that’s hard to understand. How do you turn it into something useful? Can you share an example of a project where you did this?


One project particularly stands out: our fincrime operations teams were spending hours manually extracting data from various sources to investigate potential fraud cases. Since different analysts would examine different data points or timeframes, the process was not only time-consuming but also inconsistent.

Using tools like Airflow and Python, I built an automated system that gathered transaction data, user behavior patterns, and external risk signals, processed them through our machine learning models, and presented the results in Looker, a business intelligence and data visualization tool.
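To make that kind of pipeline concrete, here is a minimal sketch of how such a workflow could be wired up, assuming Airflow 2.x. The DAG name, task names, and placeholder task bodies are hypothetical illustrations, not the actual implementation described above.

```python
# A minimal sketch of such a pipeline, assuming Airflow 2.x.
# DAG and task names, and the placeholder task bodies, are hypothetical.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract_signals():
    """Pull transaction data, user behavior patterns, and external
    risk signals from the warehouse (placeholder for real queries)."""
    ...


def score_cases():
    """Run the extracted features through the fraud-scoring models and
    write the results to the table the Looker dashboard reads from."""
    ...


with DAG(
    dag_id="fincrime_case_enrichment",  # hypothetical name
    start_date=datetime(2024, 1, 1),
    schedule="@hourly",
    catchup=False,
) as dag:
    extract = PythonOperator(task_id="extract_signals", python_callable=extract_signals)
    score = PythonOperator(task_id="score_cases", python_callable=score_cases)

    # Looker reads the scored table directly, so there is no explicit
    # "present" task here.
    extract >> score
```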

The key was that I went beyond simple data collection automation. After working closely with the risk analysts to understand their decision-making process, I designed the dashboard to mirror how they actually conducted investigations.

The result was incredible. What previously took two to three hours per case now required only 15 minutes, but more importantly, we started catching patterns that had been missed before. Instead of spending time gathering data, analysts could finally focus on actual analysis.

In the first quarter after launch, our fraud detection rates increased by more than 40%. What I'm most proud of is that the solution wasn't just technically sound, it transformed how the team worked. With reliable, accessible data finally in place, they started asking different questions and exploring new approaches to help fight fraudsters.


Financial crime analytics was a significant part of your experience. What were some of the challenges you faced in detecting and preventing fraud, and how did you manage them effectively?

FinCrime is probably one of the most challenging domains in data analytics because you're operating in this constant tension between multiple priorities. You must be accurate enough to avoid false positives that annoy real customers, quick enough to stop fraud in real time, and compliant enough to meet the demands of regulators who are understandably stringent about these matters.

The adversarial nature of the problem is the biggest obstacle; fraudsters are always changing their methods, so your models and rules must change too. What's effective today may not be effective tomorrow. I recall cases where we would identify a new fraud scheme, put countermeasures in place, and then watch as fraudsters adapted to the changes and tried something entirely new a few weeks later.

Technically speaking, multiple layers of protection and good monitoring systems are the keys to striking that balance. For instantaneous decisions we would have real-time models; for in-depth analysis, batch processes; and for edge cases, human review. But in all honesty, the human aspect is the most important; no algorithm can replace the intuition of a seasoned analyst who understands the bigger picture.
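As a rough illustration of that layered approach, the sketch below scores a transaction in real time, resolves the clear cases instantly, and escalates ambiguous ones to batch analysis or human review. The thresholds, field names, and routing rules are hypothetical, not the actual production logic.

```python
# Hedged illustration of a layered fraud-decision flow: a real-time score
# handles clear cases immediately; ambiguous cases go to batch analysis or
# human review. All thresholds and names are hypothetical.
from dataclasses import dataclass
from enum import Enum


class Decision(Enum):
    APPROVE = "approve"
    BLOCK = "block"
    BATCH_REVIEW = "batch_review"    # deeper offline analysis
    HUMAN_REVIEW = "human_review"    # edge cases for analysts


@dataclass
class Transaction:
    amount: float
    risk_score: float  # output of the real-time model, 0.0 to 1.0


def route(txn: Transaction) -> Decision:
    # Clear-cut cases are resolved instantly by the real-time layer.
    if txn.risk_score < 0.2:
        return Decision.APPROVE
    if txn.risk_score > 0.95:
        return Decision.BLOCK
    # High-value, mid-risk transactions go to an analyst; the rest are
    # re-examined by the slower, more thorough batch models.
    if txn.amount > 10_000:
        return Decision.HUMAN_REVIEW
    return Decision.BATCH_REVIEW
```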

Compliance adds another layer of complexity because you must be able to explain your decisions. It's never enough to say "the model flagged this transaction"; you have to show your work, explain why, and make everything auditable. Having to build systems that are efficient, transparent, and understandable actually made me a better engineer.
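One common way to make such decisions auditable is to persist a structured record for every flag. The sketch below is an illustration of what such a record could look like; the field names, reason codes, and model identifiers are invented for the example, not an actual schema.

```python
# A minimal sketch of an audit record that makes a flag explainable.
# Field names and reason codes are illustrative only.
from dataclasses import dataclass, field
from datetime import datetime, timezone


@dataclass(frozen=True)
class FlagAuditRecord:
    transaction_id: str
    model_version: str           # which model produced the score
    risk_score: float
    reason_codes: list[str]      # e.g. ["VELOCITY_SPIKE", "NEW_DEVICE"]
    decided_by: str              # "realtime_model", "batch_model", or an analyst id
    decided_at: datetime = field(default_factory=lambda: datetime.now(timezone.utc))


# Persisting a record like this for every decision means each flag can
# later be reconstructed and explained to auditors or regulators.
record = FlagAuditRecord(
    transaction_id="txn_0001",        # hypothetical
    model_version="fraud-model-v7",   # hypothetical
    risk_score=0.87,
    reason_codes=["VELOCITY_SPIKE", "NEW_DEVICE"],
    decided_by="realtime_model",
)
```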


Nowadays, everything is moving towards AI. In fact, I'm sure you've made use of it in your own work as well. How do you see AI changing the future of data governance and discovery in large organisations?

At large scale, the old way of handling data breaks down. You can't manually keep track of hundreds of sources and thousands of tables spread across teams and time zones; the result is stale documentation and cluttered data. AI changes all of this: it can automatically pick up new data sources, understand their structure, and even suggest how they connect to key business metrics.

Most exciting, though, is that AI helps with enforcement of governance as much as discovery. Consider having systems that are capable of automatically alerting users when sensitive data is being handled inappropriately or when a key dataset has not been updated within its service level agreement. Instead of focusing on reactive cleanup, we're shifting to proactive governance.
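As a small illustration of the freshness-SLA idea, the sketch below checks a set of key datasets against assumed SLAs and returns the ones that are stale. The dataset names, SLA values, and alerting mechanism are hypothetical.

```python
# Rough sketch of a dataset-freshness SLA check. Dataset names, SLAs,
# and the alert channel are hypothetical.
from datetime import datetime, timedelta, timezone

# Assumed mapping of key datasets to their freshness SLAs.
DATASET_SLAS = {
    "transactions_daily": timedelta(hours=24),
    "kyc_documents": timedelta(hours=6),
}


def check_freshness(last_updated: dict[str, datetime]) -> list[str]:
    """Return the datasets whose last update breaches their SLA."""
    now = datetime.now(timezone.utc)
    stale = []
    for dataset, sla in DATASET_SLAS.items():
        updated = last_updated.get(dataset)
        if updated is None or now - updated > sla:
            stale.append(dataset)
    return stale


# In a real setup this would run on a schedule and alert the owning team
# rather than just printing.
if __name__ == "__main__":
    example = {"transactions_daily": datetime.now(timezone.utc) - timedelta(hours=30)}
    for name in check_freshness(example):
        print(f"ALERT: {name} is past its freshness SLA")
```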

The trick is to use AI to enhance human judgment rather than replace it. The most effective systems I've seen offer suggestions and surface insights, but still need human context and validation. We're looking to make data governance intelligent and scalable, not to automate it away.

Several African companies are pushing into fintech and digital services. Based on your international experience, what must African startups do to build stronger data cultures and avoid pitfalls?

This is a topic I'm rather passionate about. I've had the chance to work with organisations across the globe, so I've seen how different organisations build their data cultures, and there are definite patterns among the successful ones.

Begin with governance right away. It may seem unglamorous compared to building shiny machine learning models, but believe me, the organisations that invest early in data quality, documentation, and well-defined ownership structures are the ones that scale. Prevent data quality problems from the very start rather than waiting for them to arise.

Second, democratise data access while putting controls in place. A mistake I often see is organisations either giving everyone unrestricted access to data or locking it away inside IT teams. The sweet spot is getting data into the hands of business teams while applying appropriate controls and training people to use it well.

Cultural diversity is an actual advantage here. Data engineering benefits from African startups' incredible resourcefulness and ability to innovate with less funding. Teams that had to find ways of achieving more with less have created some of the most innovative solutions I've ever seen.

My advice would be: don't try to copy Silicon Valley approaches. Create solutions that are suitable for your environment, your regulatory context, and your customers. Spend money on building local capability – the people who know your market are going to build better data products than expensive consultants who fly in for a few weeks.

And finally, think global but act local. Use international best practices as a point of departure, but operationalise them in your environment. The companies that are going to succeed are those who can bridge that gap effectively.


#FeaturedPost
