Burger King challenged EDHEC MSc in Data Analytics & Artificial Intelligence students
How was the Burger King hackathon organised?
In the Business Data Management course, we learned many things relating to data treatment and analysis for business purposes, such as churn prediction for Burger King. Prior to the hackathon, our two professors from Accenture introduced us to the tools we would be using on AWS (Amazon Web Services), namely Athena and SageMaker for notebooks, and Burger King executives presented the context of the analysis.
There were seven people in my group, and we decided to check our tools and datasets before the hackathon so we could start on solid ground: knowing the information provided, successfully importing the data, and being ready for the big day. The hackathon started at 9 am with opening explanations and closed at 5 pm, but we kept working until late in the evening. Every team had a Q&A session with our professors at some point during the day.
What solution did you come up with?
The goal was to deliver a churn-scoring algorithm and a presentation of our insights to Burger King.
We first analysed churn for Burger King in terms of predictions. Through modelling, we found for instance that some types of campaign encourage churn, while others keep a customer loyal. Other factors, such as the sales channel or personal characteristics of a customer like age or parental status, can also work for or against churn. Dashboarding added value by visualising the profile of the customers Burger King may want to target. The results of such an analysis are very valuable for a company like Burger King: marketing campaigns, investments, and distribution strategy decisions can all be grounded in data analysis. However, I’m afraid I cannot disclose the results in more detail!
The jury found our presentation insightful, so my group finished in the top three and went through to the final in front of Burger King, which we won one week later! Hopefully, we may earn some menus one day thanks to this hard work!
How did you get there?
We divided the project into four main tasks: data pre-treatment, churn analysis, visuals through dashboarding, and presentation building. Each of us focused on one of the first three tasks, and we all worked together on the presentation. Taking place in May, the hackathon was completely online, so we were on the phone and screen sharing in small groups all day, and all together at certain points of the day.
Our group was diverse in competencies, which made it easy to allocate tasks accordingly, but also to learn from someone with a different specialty during the hackathon. This is thanks to the organisation of the MSc in Data Analytics & Artificial Intelligence, which allows us to choose different elective courses. For instance, Alice Barbotin, Alexandre Suberbielle, Thaïs Lewko and I chose the same Machine Learning track, so we are comfortable with data analysis in Python, whereas Dan Yekhi, Rose Moulan and Edwin Nguyen are more comfortable with SQL and visuals.
We were provided with confidential datasets containing customer loyalty data, marketing campaigns, and franchise data. We first looked at each of them and worked out how to retain as much information as possible to serve our analysis and add value. After struggling with the AWS interface, and SageMaker Notebooks in particular, we decided to split up: do as much of the data treatment as possible in SQL, then the machine learning in Python, with dashboarding in Tableau in parallel. We all worked on the PowerPoint presentation simultaneously, and for the final, I presented with two other teammates.
Discussing together all day long helped us analyse Burger King’s churn faster, as we shared results and conclusions between the small working groups. We tried different machine learning models, such as a neural network and XGBoost, which gave the best performance. After modelling, we looked for significant variables that discourage or encourage churn. With these results, and the help of the visualisation in Tableau, we reached our solution.
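The modelling step can be sketched roughly as follows. This is a minimal illustration on synthetic data, not the confidential Burger King datasets: the features (campaign type, channel, age, visit frequency) are hypothetical stand-ins inspired by the factors mentioned above, and scikit-learn’s GradientBoostingClassifier stands in for XGBoost (both are gradient-boosted tree ensembles).

```python
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
n = 2000

# Hypothetical features loosely inspired by the article (all synthetic).
campaign = rng.integers(0, 3, n)   # 0, 1, 2 = three campaign types
channel = rng.integers(0, 2, n)    # 0 = in-store, 1 = app
age = rng.integers(18, 70, n)
visits = rng.poisson(4, n)         # visit frequency

# Synthetic churn signal: one campaign type and low visit counts raise churn.
logit = 0.8 * (campaign == 2) - 0.3 * visits + 0.02 * age - 0.5
churn = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(int)

X = np.column_stack([campaign, channel, age, visits])
X_tr, X_te, y_tr, y_te = train_test_split(X, churn, random_state=0)

model = GradientBoostingClassifier(random_state=0).fit(X_tr, y_tr)
scores = model.predict_proba(X_te)[:, 1]   # churn score per customer
auc = roc_auc_score(y_te, scores)

print(f"AUC: {auc:.2f}")
# Feature importances hint at which variables drive churn in the model.
print("feature importances:", model.feature_importances_.round(2))
```

Inspecting the trained model’s feature importances is one simple way to find the significant variables that encourage or discourage churn.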
What are the main takeaways of this experience?
Of course, the hard skills. First of all, I learned how to work on AWS, one of the most widely used platforms in the companies the MSc in DA & AI prepares us for. Getting to grips with the interface and several of the available tools was great, since I could not have accessed them on my own. This also had a downside: we shared access across the class, and I learned that everyone working on AWS at once can cause many computing issues. This kind of problem, which came up in various projects during the MSc, led us to find an alternative: favour several well-adapted tools instead of using a single one (in my case, often Python or R) for all the data treatment and analysis. For this hackathon, I wanted to learn something new, and once we saw that the data pre-treatment would be done more efficiently in Athena with SQL than in SageMaker with Python, I decided to pair up with Dan all morning and learn SQL, a language I had never touched before. I’m glad I did, because learning hands-on was faster than I expected, and together we did the cleaning, the joins, and the feature engineering for our project.
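To give a flavour of that kind of SQL pre-treatment (a join plus a simple engineered feature), here is a minimal, self-contained sketch using Python’s built-in sqlite3 module in place of Athena. The table and column names are entirely hypothetical.

```python
import sqlite3

# Two tiny hypothetical tables standing in for the loyalty and campaign data.
con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE loyalty (customer_id INT, visits INT, last_campaign_id INT);
CREATE TABLE campaigns (campaign_id INT, campaign_type TEXT);
INSERT INTO loyalty VALUES (1, 12, 10), (2, 1, 11), (3, 5, 10);
INSERT INTO campaigns VALUES (10, 'coupon'), (11, 'app_push');
""")

# Join the tables and derive a low-frequency flag as an engineered feature.
rows = con.execute("""
SELECT l.customer_id,
       c.campaign_type,
       CASE WHEN l.visits < 3 THEN 1 ELSE 0 END AS low_frequency
FROM loyalty AS l
JOIN campaigns AS c ON c.campaign_id = l.last_campaign_id
ORDER BY l.customer_id
""").fetchall()

for row in rows:
    print(row)
```

The same join-and-derive pattern scales up directly in Athena, which speaks standard SQL over data stored in S3.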
Then, working remotely is always a challenge, even with digital tools. I learned how to handle it more efficiently: for instance, favour very small groups (one to three people at most) to encourage focused thinking on specific tasks, and come together as a full team for idea sharing, strategy, and the organisational aspects of a project.
All in all, the hackathon was a great opportunity for me to put theory into practice with real data and widely used tools.