How Django is used by Australia’s largest online retailer.
I’m often asked about the choice of Django (and Python) for the base technology stack of kogan.com. People are often surprised that Australia’s largest online retailer is not built in Java or .NET (your typical ‘enterprise-y’ stack), or on an out-of-the-box enterprise commerce product. I thought it would be good to give some context on the technology we use at kogan.com, how it came into being and where it is going.
First there was one
About 6 years ago, when Kogan was still a very young business, I was co-developing the first Django-powered version of kogan.com to replace the initial PHP implementation. To put this period into perspective, AWS and Rackspace weren’t de facto choices back then - they too were in the very early days of their cloud services.
So why Django? I first used Django and Python in a final year software engineering project at University and was impressed with the URL routing, ORM, documentation, testing framework and of course the admin interface, which comes ‘batteries included’! These elements made building dynamic websites much more pleasurable than using PHP or ASP (with which I had previously built sites). Moreover, there were many open-source Django apps that could be leveraged to handle functionality such as a blog or the implementation of a particular payment engine.
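To illustrate that ‘batteries included’ feel, here is a minimal sketch of how little code it takes to get a model, an admin screen and a routed view in Django. The model, view and URL names are purely illustrative (nothing here is from the kogan.com codebase), and it uses current Django syntax rather than what existed at the time:

```python
# models.py - a hypothetical product model
from django.db import models

class Product(models.Model):
    title = models.CharField(max_length=200)
    price = models.DecimalField(max_digits=8, decimal_places=2)

    def __str__(self):
        return self.title


# admin.py - one line gives you a full CRUD admin screen for the model
from django.contrib import admin
admin.site.register(Product)


# urls.py - URL routing maps a pattern straight to a view
from django.urls import path
from . import views  # views.product_detail is assumed to exist

urlpatterns = [
    path("products/<int:pk>/", views.product_detail, name="product-detail"),
]
```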
In about a month of work, we had put together most of the Kogan website. We set up a single instance (app) on a shared cloud hosting environment with Apache sitting in front of Django. Memcache was used to cache the views, and MySQL served as the database. Being less experienced at the time, we thought this setup would do the business well for quite some time!
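For context, a rough sketch of what that kind of setup looks like in Django today. The location and timeout values are illustrative, not what we actually ran, and the exact cache backend name varies with Django version:

```python
# settings.py - point Django's cache framework at a memcached instance
CACHES = {
    "default": {
        "BACKEND": "django.core.cache.backends.memcached.PyMemcacheCache",
        "LOCATION": "127.0.0.1:11211",
    }
}


# views.py - cache a whole rendered view
from django.shortcuts import render
from django.views.decorators.cache import cache_page

@cache_page(60 * 5)  # serve the cached response for five minutes
def home(request):
    return render(request, "home.html")
```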
The Slashdot Effect
Just as we were nearing completion of the website, a current affairs TV show caught wind of the disruption Kogan was causing in the TV industry and filmed a segment on the business. Ruslan called to let me know the segment was going to air the next day. The site we were building was much better than the old one, so we opted to fast-track it into production. Overnight I tied down the ‘last 10%’, and we tested the site in production the next day. A few hours out from air time, the new site received its first order!
When the current affairs show aired and announced the staggeringly low prices of Kogan TVs to Australia, we were fortunate enough to experience the phenomenon known as the ‘Slashdot effect’. Website traffic skyrocketed and the site was being crushed. We quickly worked with the hosting provider to restore availability on that single machine. In a very short period of time, we were able to add an open-source Django app that wrote the server’s HTML responses to file. We set up a list of ‘safe’ URLs (like the homepage and list views) to cache in this way and modified the Apache config to try those files first before hitting the dynamic URLs of the site. This meant that, despite not having the smoothest experience across the entire site, most visitors were now able to discover more about the brand and offering. The setup, architecture and community around Django meant we were able to iterate and deliver with an extremely short turnaround.
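The idea is simple enough that a sketch is worth showing. This middleware is illustrative only - the paths, URL list and class name are hypothetical, and the open-source app we actually used did roughly this - but it captures the technique: write the rendered HTML for whitelisted URLs to disk, and let the front-end web server serve those files before ever reaching Django.

```python
# Hypothetical middleware: persist successful HTML responses for 'safe' URLs
# to disk so Apache can serve them as static files under load.
import os

CACHE_ROOT = "/var/cache/html"        # hypothetical path served by Apache
SAFE_URLS = {"/", "/products/"}       # hypothetical 'safe' URLs (homepage, list views)

class WriteToDiskMiddleware:
    def __init__(self, get_response):
        self.get_response = get_response

    def __call__(self, request):
        response = self.get_response(request)
        if (
            request.method == "GET"
            and response.status_code == 200
            and request.path in SAFE_URLS
            and "text/html" in response.get("Content-Type", "")
        ):
            rel = request.path.strip("/")  # "" for the homepage
            path = os.path.join(CACHE_ROOT, rel, "index.html")
            os.makedirs(os.path.dirname(path), exist_ok=True)
            with open(path, "wb") as fh:
                fh.write(response.content)  # snapshot the rendered page
        return response
```

Apache was then configured to look for a matching file under that cache directory first and only fall through to the Django app when no file existed.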
Scaling out of the box
With the rapid growth of the business, it quickly became apparent that a single shared hosting instance would not cut it! Scaling is typically a difficult problem and has the potential for complicated rewrites. Fortunately, Django was designed in a way that made it possible to horizontally scale the application servers with little effort. The ‘state’ of our Django web app was only relevant between the HTTP request and response cycle - all persistent data (cart, sessions, orders, etc.) was stored in the database. This meant that web requests could be safely distributed to many app instances, with the load spread by DNS round robin. We acquired more shared hosting accounts on various web servers with our provider and configured the DNS so that we could add or remove app servers as needed.
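The Django configuration behind that statelessness is tiny. As a sketch (standard settings, not our exact configuration), database-backed sessions - Django’s default session engine - mean the session, and the cart tied to it, live in the shared database rather than in the memory of any one app server:

```python
# settings.py - keep per-user state in the database so app servers can be cloned freely
INSTALLED_APPS = [
    # ...
    "django.contrib.sessions",
]

MIDDLEWARE = [
    # ...
    "django.contrib.sessions.middleware.SessionMiddleware",
]

# Database-backed sessions (Django's default session engine): any app server
# behind the DNS round robin can handle any request.
SESSION_ENGINE = "django.contrib.sessions.backends.db"
```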
This solution held fast for quite some time with very little maintenance effort, and we even got on top of the ‘Digg effect’ spikes that would occasionally hit us from TV exposure. However, the business and the Internet were growing so quickly that we needed new features, complex integrations with external systems, and new visual styles and UX, so it came time to redesign and rebuild into what is now the current kogan.com platform.
Coming up in Part 2, we’ll go over the current kogan.com architecture and where we see it heading.