How Technology Doubled Orders for The Chefz (Food Delivery App) in 2 Months

Ramzi Alqrainy
Jun 2, 2021 · 6 min read

The Chefz is a food delivery company that serves as an intermediary between customers and restaurants. Using the app, customers place (and pay for) an order, which is then conveyed to the participating restaurant.

When the food is prepared, The Chefz collects it and delivers it to the customer in the comfort of their own home, office, or wherever else they’ve chosen. The Chefz takes a cut of the revenue and pays for the delivery.

According to GlobalWebIndex, fast and free delivery is what gets customers through the door. The Chefz currently charges variable delivery fees (based on distance and delivery time). Aside from that, users like to see a good variety of restaurants, with roughly a quarter valuing the chance to order from local businesses rather than faceless chains.

It’s worth getting these customers in through the proverbial door because once they’re using your app, they tend to keep using it. A study by professional services firm McKinsey found that, in the KSA, 84% of food delivery customers never or rarely switched platforms.

Search And Find Food Easily

This is one of the most important features of the food delivery app.

After logging in and customizing their profile, the user is going to look for something to eat.

They might already have an idea, or they might just be browsing to see what looks good. In either case, an easy search-and-find feature is going to be super useful.

The best way to make it work is to have a smart list of all the restaurants and cuisines, organized by location, type of food (fast food or more gourmet dishes), food preferences (vegetarian, meat lovers, special diets such as gluten-free), different national cuisines, and so on.

With this smart list feature, you make it much easier for the user to find what they want to eat. All they have to do is look through these lists to find recommendations for all the restaurants catering to that category.

The most important and coveted functionality the system has to provide is the capability to search across menu items, cuisines, restaurants, and more. In the food-ordering journey, this functionality is the point of entry for all customers, unless they already have a favorite restaurant saved in their preferences from which they can select dishes. Thus, a personalized discovery and search experience based on a customer’s past searches and order history has to be provided. As is apparent, this particular part of the system will be read-heavy.
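The article does not spell out the search stack, so here is a minimal sketch of such a query, assuming an Elasticsearch-style endpoint and a restaurants index. The field names, filters, and the idea of boosting a customer’s past favorite cuisines are illustrative assumptions, not the actual implementation:

```python
import requests

SEARCH_URL = "http://search-cluster:9200/restaurants/_search"  # assumed Elasticsearch-style endpoint


def search_restaurants(text, city, dietary=None, favorite_cuisines=None):
    """Sketch of a filtered, personalized restaurant/menu search.

    Field names (name, menu_items, cuisine, dietary_tags, city) and the boost
    on the customer's past favorite cuisines are illustrative assumptions.
    """
    query = {
        "query": {
            "bool": {
                "must": [
                    {"multi_match": {"query": text, "fields": ["name", "menu_items", "cuisine"]}}
                ],
                "filter": [
                    {"term": {"city": city}}
                ],
                # Personalization: prefer cuisines the customer ordered before.
                "should": [
                    {"terms": {"cuisine": favorite_cuisines or [], "boost": 2.0}}
                ],
            }
        },
        "size": 20,
    }
    if dietary:
        query["query"]["bool"]["filter"].append({"term": {"dietary_tags": dietary}})
    return requests.post(SEARCH_URL, json=query, timeout=2).json()


# Example: a vegetarian customer in Riyadh who usually orders Lebanese food.
results = search_restaurants("shawarma", city="riyadh",
                             dietary="vegetarian",
                             favorite_cuisines=["lebanese"])
```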

The Food Delivery Infrastructure (Performance Is Key)

It helps to identify the main features of any delivery app and what distinguishes it from other systems.

  1. Highly Variable Workload: Delivery apps are among the most challenging to design as they face a constant barrage of requests. Depending on the vertical, loads can peak during specific time windows in a day (food delivery apps) or during specific days in the year (holiday-season gifts), while in some sectors requests shut off once night falls. See the scaling sketch after this list for one way to handle such predictable peaks.
  2. No margin for error: Missing requests is not an option in the competitive world of logistics. Also, it is difficult to evaluate how critical a specific parcel is for the client and losing customers early on does not bode well.
  3. Regular State Updates: Maintaining logs of package states is the norm and requires both large database storage capacity along with high throughput.
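Predictable peaks like the lunch and dinner windows can be handled with scheduled scaling on top of reactive autoscaling. Here is a minimal sketch using boto3; the group name, times, and capacities are made-up examples, not The Chefz’s actual configuration:

```python
import boto3

autoscaling = boto3.client("autoscaling", region_name="me-south-1")  # region is an assumption

# Scale the (hypothetical) order-service group up ahead of the dinner rush...
autoscaling.put_scheduled_update_group_action(
    AutoScalingGroupName="order-service-asg",
    ScheduledActionName="dinner-rush-scale-up",
    Recurrence="0 17 * * *",   # every day at 17:00 UTC
    MinSize=6,
    MaxSize=20,
    DesiredCapacity=10,
)

# ...and back down late at night when requests taper off.
autoscaling.put_scheduled_update_group_action(
    AutoScalingGroupName="order-service-asg",
    ScheduledActionName="late-night-scale-down",
    Recurrence="0 1 * * *",
    MinSize=2,
    MaxSize=20,
    DesiredCapacity=2,
)
```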

AWS Auto Scaling plays a critical role for The Chefz. When the company adopted Amazon EC2 Spot Instances, it reviewed its applications and found recurring availability and termination issues that were fairly easy to solve. To ensure high availability, the company added redundancy by running multiple instances, so that if one instance was terminated, another started immediately.
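As a rough sketch of that setup, an Auto Scaling group spanning multiple Availability Zones with a mix of On-Demand and Spot capacity provides both the redundancy and the automatic replacement described above. Everything below (names, subnets, instance types, numbers) is a placeholder, not the actual configuration:

```python
import boto3

autoscaling = boto3.client("autoscaling", region_name="me-south-1")  # region is an assumption

# Hypothetical group: several instances spread across AZs, mixing On-Demand and Spot.
autoscaling.create_auto_scaling_group(
    AutoScalingGroupName="api-service-asg",
    MinSize=3,
    MaxSize=12,
    DesiredCapacity=4,
    VPCZoneIdentifier="subnet-aaa,subnet-bbb,subnet-ccc",  # placeholder subnets in different AZs
    HealthCheckType="ELB",
    HealthCheckGracePeriod=120,
    MixedInstancesPolicy={
        "LaunchTemplate": {
            "LaunchTemplateSpecification": {
                "LaunchTemplateName": "api-service-template",  # placeholder
                "Version": "$Latest",
            },
            "Overrides": [{"InstanceType": "m5.large"}, {"InstanceType": "m5a.large"}],
        },
        "InstancesDistribution": {
            # Keep a baseline of On-Demand capacity; fill the rest with cheaper Spot.
            "OnDemandBaseCapacity": 2,
            "OnDemandPercentageAboveBaseCapacity": 25,
            "SpotAllocationStrategy": "capacity-optimized",
        },
    },
)
```

If a Spot instance is reclaimed or a node fails its health check, the group replaces it automatically, which is exactly the redundancy behavior described above.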

The Chefz has a queue in place to process asynchronous updates to the search cluster. When the Restaurant Profile Service creates or updates restaurant or menu data by performing CRUD operations on the database, it also posts an event to the queue. This event can correspond to any of the CRUD operations. A data indexer listens to the queue for such events, picks up each event, runs a query against the database to build a document in the correct format, and posts the document to the search cluster.
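A minimal sketch of that indexer loop, assuming an SQS-style queue and an Elasticsearch-style document endpoint; the queue URL, index name, and the load_restaurant helper are hypothetical:

```python
import json

import boto3
import requests

sqs = boto3.client("sqs", region_name="me-south-1")  # region is an assumption
QUEUE_URL = "https://sqs.me-south-1.amazonaws.com/123456789012/restaurant-events"  # placeholder
DOC_URL = "http://search-cluster:9200/restaurants/_doc"  # assumed Elasticsearch-style endpoint


def load_restaurant(restaurant_id):
    """Hypothetical helper: query the primary database and shape the search document."""
    return {"id": restaurant_id, "name": "...", "cuisine": "...", "menu_items": []}


def run_indexer():
    while True:
        resp = sqs.receive_message(QueueUrl=QUEUE_URL, MaxNumberOfMessages=10, WaitTimeSeconds=20)
        for msg in resp.get("Messages", []):
            event = json.loads(msg["Body"])  # e.g. {"op": "UPDATE", "restaurant_id": "r-42"}
            if event["op"] == "DELETE":
                requests.delete(f"{DOC_URL}/{event['restaurant_id']}", timeout=2)
            else:
                # CREATE/UPDATE: rebuild the document from the database and upsert it.
                doc = load_restaurant(event["restaurant_id"])
                requests.put(f"{DOC_URL}/{event['restaurant_id']}", json=doc, timeout=2)
            sqs.delete_message(QueueUrl=QUEUE_URL, ReceiptHandle=msg["ReceiptHandle"])
```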

Replication & Fault Tolerance

It is imperative that we identify the various points of failure in the system and have a replica/backup for each, so that the system survives if any component dies. Each service should be horizontally scalable. The NoSQL infrastructure will also have multiple nodes, the search system can run on multiple nodes, and the queues can have partitions and replication as well. Each and every service/component can be individually scaled if needed, and autoscaling can be enabled to handle the load on individual components, for example by spinning up more instances in the face of a peak load. Furthermore, if any node goes down, or any partition in the queueing infrastructure goes down, another instance can take over the job. The crashed node can then do a cleanup and restart, a process known as self-healing.
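For the search cluster in particular, replication is usually just an index setting. A tiny sketch assuming an Elasticsearch-style API; the index name and shard/replica counts are illustrative:

```python
import requests

# Create the (hypothetical) restaurants index with 3 primary shards, each replicated
# twice, so losing a single node costs neither data nor availability.
requests.put(
    "http://search-cluster:9200/restaurants",
    json={"settings": {"number_of_shards": 3, "number_of_replicas": 2}},
    timeout=5,
)
```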

Caching

Based on the recent orders in an area, or the most ordered or most searched items, data can be cached so that the Restaurant Search Service looks up such information from a distributed cache instead of hitting the search infrastructure, and immediately returns a few recommendations. Images of restaurants and dishes can be cached as well, instead of hitting the object storage all the time. The cache holds the popular or most-ordered menu items/restaurants in a particular area, and the search screen shows those options by default. The use of a cache definitely speeds up the viewing of search results.
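A minimal cache-aside sketch with Redis; the key format, TTL, and the search_cluster_lookup fallback are assumptions for illustration:

```python
import json

import redis

cache = redis.Redis(host="cache-host", port=6379)  # placeholder host


def popular_restaurants(area_id, search_cluster_lookup, ttl_seconds=300):
    """Return popular restaurants for an area, preferring the cache over the search cluster."""
    key = f"popular:{area_id}"          # assumed key format
    cached = cache.get(key)
    if cached is not None:
        return json.loads(cached)       # cache hit: skip the search infrastructure

    results = search_cluster_lookup(area_id)        # cache miss: hit the search cluster
    cache.setex(key, ttl_seconds, json.dumps(results))  # keep it warm for the next user
    return results
```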

Some of the popular cache technologies available in the market are Redis, Hazelcast, and Memcached. A Least Recently Used (LRU) or Least Frequently Used (LFU) algorithm, or a combination of both, can serve as the cache eviction strategy for our use case.
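With Redis, for example, the eviction policy is a one-line configuration; the memory ceiling below is an arbitrary example:

```python
import redis

cache = redis.Redis(host="cache-host", port=6379)     # placeholder host
cache.config_set("maxmemory", "2gb")                  # example memory ceiling
cache.config_set("maxmemory-policy", "allkeys-lfu")   # evict the least frequently used keys
```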

Load Balancing

Every service will have multiple instances running. In scenarios where the communication happens over HTTPS (POST/GET requests, etc.), load balancers should be placed in front of the individual services. Load balancing helps ensure that no single instance is inundated with a deluge of requests and that overall response time stays low. Several load-balancing techniques can be used, such as the Least Connection method and the Round Robin method.
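For intuition, Round Robin simply cycles through the healthy instances. A toy sketch with placeholder addresses:

```python
from itertools import cycle

# Placeholder instance addresses behind one logical service.
instances = cycle(["10.0.1.10:8080", "10.0.2.10:8080", "10.0.3.10:8080"])


def next_backend():
    """Round Robin: hand out instances in turn so no single one absorbs all requests."""
    return next(instances)


print([next_backend() for _ in range(5)])
# ['10.0.1.10:8080', '10.0.2.10:8080', '10.0.3.10:8080', '10.0.1.10:8080', '10.0.2.10:8080']
```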

When a service simply subscribes to a particular channel/topic in the queue, the responsibility for load balancing falls on the queue itself.
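With a Kafka-style queue, for instance, putting all indexer instances in the same consumer group lets the broker spread the topic’s partitions across them; the topic, group, and broker address below are assumptions:

```python
from kafka import KafkaConsumer  # kafka-python


def handle(payload: bytes) -> None:
    """Hypothetical handler; the real consumer would update the search cluster."""
    print(payload)


# Every indexer instance joins the same group, so the broker assigns each one a
# subset of the topic's partitions -- the queue balances the load by itself.
consumer = KafkaConsumer(
    "restaurant-events",                    # assumed topic name
    group_id="search-indexer",              # shared consumer group
    bootstrap_servers=["queue-host:9092"],  # placeholder broker
)

for message in consumer:
    handle(message.value)
```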

Conclusion

Observing a real-life scenario such as food delivery at a restaurant allows us to learn the intricacies behind the operations and draw parallels with the software world.

Delivering food is particularly interesting in this context since speed alone is not the name of the game. It is a delicate balance between the speed of delivery & the taste of the food.


Ramzi Alqrainy

Apache Solr Contributor | Slack Contributor | Speaker | Chief Technology Officer at The Chefz | Technical Reviewer for Big Data Books