Data Science Studio

Scaling onboarding at MadKudu
Product Management

Ensuring growth after product-market fit

With my team, we built a data science studio that helped MadKudu grow its customer base without needing to hire more people.

Quantitative results:
Reduced customer onboarding time by 70%
Qualitative results:
Allowed the MadKudu team to scale at 30% quarter over quarter
Work brief

When I first joined MadKudu, each customer required 140 man-hours to onboard. This was not sustainable given the company's ambitious growth goals and limited headcount. With my team, we released a data science studio that cut the time to build a predictive model to 40 hours.


MadKudu is a B2B marketing operations platform which offers an enterprise-grade, real-time API that tells you the value of your prospects wherever they are. Operational marketing leaders from the fastest growing SaaS companies such as Segment, Amplitude and InVision leverage this engine to simplify their workflows and unlock revenue across the entire buyer journey.

I joined MadKudu in 2018, when they were still an 8-person team. It was early days and we had 2.5 people in Customer Success, doing everything from technical onboarding to business success. Given that MadKudu had just hit product-market fit at the time, neither the technical architecture of the platform nor the customer onboarding process had scaled yet. This meant that the Customer Success team had a lot of flexibility to onboard customers as fast as they could and/or add features to meet their customers' needs.


At the time, it took 140 man-hours to onboard a customer end to end. Onboarding involved many different steps, and most of them were repeated for every customer since they had not yet been automated into the platform.

Tracking hours required for each onboarding step


Adopting a top-down approach to ensure that product goals supported company goals, the company had three long-term goals:

  1. Solidify the process of delivering success to customers
  2. Build a revenue engine that scales with more resources
  3. Build a marketing ops community to drive stronger brand presence

Since the three goals form almost a "domino effect", where each one affects the success of the next, the most important one to get right is solidifying the process of delivering success to customers. This involves two things: (1) productizing technical onboarding steps as part of the automated platform and (2) providing a UI layer for the steps that require human input.


Given the stage of the company, the tradeoffs were abundant:

  • Build and launch integrations with high-leverage partners in the SaaS industry
  • Build and test customer-facing onboarding flow
  • Build and launch freemium product to increase top-funnel metrics
  • Test platform capability in another part of the customer's funnel

The decision to prioritize and focus did not come easy. Initially, I made the mistake of overpromising the number of metrics we could move in a quarter, for example: the number of Sales Qualified Leads (SQLs) from a product launch and the number of hours it takes to onboard a customer. However, after one month, it was obvious that this was detrimental to the team's velocity in shipping features.

Then, I went back to the drawing board and used data and metrics to understand how to make the tradeoff. Looking at the company metrics, I saw three clear patterns:

  1. The company's growth was highly dependent on Customer Success' ability to onboard customers. There was a pattern where one quarter saw great ARR growth and the next quarter was spent onboarding the new ARR gained, which led to very low growth rates.
  2. Top-of-funnel metric growth was steady from organic sources. Without any marketing efforts, Sales was still able to generate enough meetings to meet the organic growth required.
  3. Churn was low. At this stage, customers trusted the expertise of the MadKudu team, so customer adoption was not much of a challenge.

Thus, looking at the above patterns, it was clear that customer onboarding time was the most important metric to move in order to grow the company at a scalable rate.



The process of prototyping the required features was very much iterative. I adopted a methodology where:

  1. I sketch out the feature (either on paper or in Balsamiq Mockups).
  2. I present the low-fidelity sketches to the founding team. At this point, we debate the user requirements and platform functionality to catch any "obvious no's" early.
  3. I build high-fidelity mock-ups (in Sketch) after 1-2 iterations.
  4. I share the high-fidelity mock-ups with my Customer Success team to drive excitement and gather feedback, ensuring there are no obvious blind spots around customer needs.
  5. Depending on feedback, there may be minor iterations required here.
  6. Once approved at both levels, I write requirements to accompany the mock-ups and send them to Engineering.

Some of the key features required to build out the data science studio were:

Univariate Analysis: A feature that points out the best-performing traits to drive conversions
Trees: A feature to assemble different traits that would weed out and select the best-converting leads
Overrides: A feature that allows implementation specialists to manually override the predictive model using rules
Computations: A feature that allows implementation specialists to
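To make the univariate-analysis idea concrete, here is a minimal sketch of how a trait's predictive value can be measured: compare the conversion rate of leads that have the trait against the overall base rate. Everything below (function name, trait names, data shape) is an illustrative assumption, not MadKudu's actual schema or algorithm.

```python
def trait_lift(leads, trait):
    """Conversion rate of leads having `trait`, relative to the overall rate."""
    overall = sum(l["converted"] for l in leads) / len(leads)
    with_trait = [l for l in leads if trait in l["traits"]]
    if not with_trait or overall == 0:
        return None  # no data for this trait, or nothing converts at all
    rate = sum(l["converted"] for l in with_trait) / len(with_trait)
    return rate / overall  # lift > 1 means the trait over-indexes on conversion

# Toy data: each lead has a set of traits and a conversion outcome.
leads = [
    {"traits": {"uses_saas_stack"}, "converted": True},
    {"traits": {"uses_saas_stack", "small_team"}, "converted": True},
    {"traits": {"small_team"}, "converted": False},
    {"traits": set(), "converted": False},
]
print(trait_lift(leads, "uses_saas_stack"))  # 2.0: converts at 2x the base rate
```

Traits with the highest lift would be the "best-performing" ones the feature surfaces, and the Trees and Overrides features could then be seen as layering human-assembled rules on top of signals like these.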

Development Process

At the time, I was working with two engineers (back-end and full-stack) based remotely in Paris.

We had a weekly sprint process, following the "agile" methodology, starting on Wednesdays:

  1. The epics and their priorities would be set at the beginning of the quarter.
  2. Product Requirement Documents (PRDs) for each epic are written in Notion with the ability to add comments, and tracked in Asana for Engineering lead to manage.
  3. The statuses of the PRD review would be updated on Asana by the Engineering lead. If accepted, it moves on to Engineering sprint. If not accepted, it may require discussion between Product and Engineering, or refinements from Product team.
  4. At the end of the sprint, Engineering team would share a progress report on each PRD.

Testing and Iteration

Phase 1

Since the features were very data-intensive and required multiple rounds of testing before they were user-ready, we did alpha releases first: the internal team would migrate one of our customers to run the end-to-end steps for that one feature.

At this point, there was a chance the feature would go back into the development phase if we found that iterations were required to make it work. This followed the same development process, where a PRD would be written, prioritized, then developed and tested.

Phase 2

Once the feature was working successfully for one customer end to end, we moved to the beta release: a guided user release for at least 5 customers. This was a great point to collect feedback and see if there were any gaps in the feature. If there were, they would be logged as feature improvements and added to the PRD backlog.

Phase 3

Once the feature had been successfully tested by users across 5 customers, it got a general user release, and users could officially start using it with accompanying training and documentation. If the feature was big enough, we might also do an official launch on ProductHunt.


After the release of the data science studio, the number of hours it took to onboard a customer went down from 140 to 40, a roughly 70% reduction. This enabled MadKudu's Customer Success team to successfully onboard $450K worth of ARR in 6 months.


Connect with me here