This blog post is based on Arnaud Giuliani's talk at KotlinConf 2024, where he shared our incredible journey of building a high-performance Kotlin app, Cloud-Inject, in just two months.
At Kotzilla, we're all about helping developers enhance their architecture through dependency injection, especially using our open-source framework, Koin. Our journey has always been driven by the goal of providing more value through open-source solutions and professional tools, addressing common challenges like technical debt, architectures that don't scale, and applications collapsing under load.
As you may already know, our adventure started with Koin, our Kotlin dependency injection framework launched in 2017.
Koin has become widely adopted across various ecosystems, from mobile to multiplatform and backend development.
As a company, we provide official enterprise support and help developers enhance their architecture with Koin.
And that's where this part of the story begins.
When we embarked on this project, our goal was to develop a high-performance Kotlin app capable of handling billions of events. Speed was essential, as startups often need to deliver an MVP quickly. Our architecture had to be simple, efficient, and scalable. We chose a stack comprising Ktor for web application development, Koin for dependency injection, Exposed for data storage, and PostgreSQL for the database.
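To make the stack concrete, here is a minimal sketch of how Ktor and Koin fit together in an app like this. The service and route names are illustrative, not our actual code:

```kotlin
import io.ktor.server.application.*
import io.ktor.server.engine.*
import io.ktor.server.netty.*
import io.ktor.server.response.*
import io.ktor.server.routing.*
import org.koin.dsl.module
import org.koin.ktor.plugin.Koin

// Hypothetical service that records incoming events.
class EventService {
    fun record(payload: String): String = "recorded: $payload"
}

// Koin module declaring the app's dependencies.
val appModule = module {
    single { EventService() }
}

fun main() {
    embeddedServer(Netty, port = 8080) {
        // Wire Koin into the Ktor application lifecycle.
        install(Koin) { modules(appModule) }
        routing {
            get("/health") { call.respondText("OK") }
        }
    }.start(wait = true)
}
```

Keeping all dependency wiring in one small Koin module is part of what made the architecture easy to iterate on.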
We opted for Google Cloud Platform (GCP), leveraging Cloud Run for container deployment and Google Cloud Storage for massive file handling. This choice allowed us to benefit from auto-scaling and easy deployment.
Our architecture was designed to be straightforward: capturing mobile architecture data and sending it to our servers for processing.
We aimed for a minimalistic stack that allowed us to iterate quickly. Using Ktor and Koin enabled us to move fast and keep our architecture clean and manageable.
GCP's Cloud Run provided us with seamless container deployment and auto-scaling capabilities, crucial for handling the massive influx of events we anticipated.
We used Exposed for data storage and PostgreSQL for our database. This combination allowed us to manage relational data efficiently.
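As a sketch of what that looks like, here is an Exposed table definition and a query using the DSL. The table and column names are illustrative, not our real schema:

```kotlin
import org.jetbrains.exposed.sql.*
import org.jetbrains.exposed.sql.transactions.transaction

// Illustrative events table; the real schema differs.
object Events : Table("events") {
    val id = long("id").autoIncrement()
    val sessionId = varchar("session_id", 64)
    val payload = text("payload")
    override val primaryKey = PrimaryKey(id)
}

// Count events for a given session inside an Exposed transaction.
fun countEventsForSession(session: String): Long = transaction {
    Events.select { Events.sessionId eq session }.count()
}
```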
Our initial tests aimed for 100K sessions, but we quickly scaled to handle 1.7 billion events in just two days. This was a testament to our architecture's resilience and performance.
Throughout the development process, we faced several challenges:
We had to implement our own pagination for handling large data sets, a feature typically available in frameworks like Spring Data.
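The core of hand-rolled pagination is just mapping a page request onto SQL's LIMIT/OFFSET (which Exposed exposes on its queries). A minimal sketch of that mapping, with hypothetical names:

```kotlin
// Page descriptor for a paginated query (1-indexed pages).
data class Page(val number: Int, val size: Int) {
    init {
        require(number >= 1 && size >= 1) { "pages are 1-indexed and non-empty" }
    }

    // Offset to pass to SQL LIMIT/OFFSET (e.g. via Exposed's query limit/offset).
    val offset: Long
        get() = (number - 1).toLong() * size
}

fun main() {
    println(Page(number = 3, size = 50).offset) // prints 100
}
```

Frameworks like Spring Data hand you this (plus total counts and sort handling) for free; with Ktor and Exposed we wrote it ourselves.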
Generating documentation was initially cumbersome. We explored various solutions, including integrating a compendium for better Swagger documentation.
Configuring our database connections for optimal performance on Cloud Run required careful tuning, especially for handling large-scale data processing.
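The subtlety on Cloud Run is that every auto-scaled instance opens its own connection pool, so per-instance pools must stay small to avoid exhausting PostgreSQL's connection limit. A hedged sketch using HikariCP with Exposed (the URL and numbers are placeholders to tune, not our production values):

```kotlin
import com.zaxxer.hikari.HikariConfig
import com.zaxxer.hikari.HikariDataSource
import org.jetbrains.exposed.sql.Database

fun connectDatabase(): Database {
    val config = HikariConfig().apply {
        jdbcUrl = "jdbc:postgresql://localhost:5432/app" // placeholder URL
        username = System.getenv("DB_USER")
        password = System.getenv("DB_PASSWORD")
        // Keep the pool small: Cloud Run multiplies this by the instance count.
        maximumPoolSize = 10
        minimumIdle = 2
        connectionTimeout = 10_000 // fail fast instead of queueing forever
    }
    return Database.connect(HikariDataSource(config))
}
```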
Managing the load and rate limiting was crucial to prevent server overload. We used Ktor's built-in features to control traffic efficiently.
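Ktor ships a RateLimit plugin (since 2.2) for exactly this. A minimal sketch of protecting an ingestion endpoint, with illustrative limits:

```kotlin
import io.ktor.http.*
import io.ktor.server.application.*
import io.ktor.server.plugins.ratelimit.*
import io.ktor.server.response.*
import io.ktor.server.routing.*
import kotlin.time.Duration.Companion.seconds

fun Application.configureRateLimiting() {
    install(RateLimit) {
        register(RateLimitName("events")) {
            // Allow 100 requests per refill period; numbers are illustrative.
            rateLimiter(limit = 100, refillPeriod = 60.seconds)
        }
    }
    routing {
        // Only routes inside this block are rate limited.
        rateLimit(RateLimitName("events")) {
            post("/events") {
                call.respond(HttpStatusCode.Accepted)
            }
        }
    }
}
```

Requests over the limit get a 429 Too Many Requests response automatically, which keeps a burst of clients from overwhelming the ingestion path.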
We employed test containers for service integration testing, ensuring our services worked seamlessly together. Dynamic configuration allowed us to switch between local and production environments effortlessly. This setup enabled us to simulate different scenarios and ensure our app's robustness.
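With Testcontainers, an integration test can spin up a throwaway PostgreSQL and point the app's dynamic configuration at it. A sketch of that pattern (the image tag and helper are illustrative):

```kotlin
import org.testcontainers.containers.PostgreSQLContainer

// Run a block against a disposable PostgreSQL instance (requires Docker).
fun withTestDatabase(block: (jdbcUrl: String, user: String, password: String) -> Unit) {
    PostgreSQLContainer<Nothing>("postgres:15-alpine").use { pg ->
        pg.start()
        // Feed these values into the app's configuration instead of the
        // production connection settings.
        block(pg.jdbcUrl, pg.username, pg.password)
    } // container stops and is removed when the block exits
}
```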
As we continue to optimize our production environment, we are exploring new backend solutions and considering native paths for further improvements. Our journey has been a testament to the power of open-source tools and the importance of a well-architected backend.
In just two months, we built a high-performance Kotlin app capable of handling billions of events. This journey, shared by Arnaud at KotlinConf 2024, has been a remarkable learning experience, and we are excited to continue pushing the boundaries of what's possible with Kotlin and open-source development.
A huge thank you to the Kotzilla team for their hard work and to our clients for their trust. If you're interested in joining our adventure, we're always looking for Koin users to test our platform and share their feedback.