June 21, 2023

Smart Digital Receipts @ Kmart

Overview:  

Kmart Australia has introduced digital ‘smart’ receipts for in-store and online purchases, where customers receive a digital copy of the transaction receipt on their mobile through SMS or through the OnePass digital wallet application. By switching to digital smart receipts, we can give our customers a more seamless shopping experience by taking away the stress of needing to keep hold of a physical printed receipt, which is important to us as we continue to work through ways we can reduce our environmental impact. With digital smart receipts we aim to deliver a more enjoyable and convenient shopping experience to customers who prefer to use technology to manage their receipts; a paper receipt will remain available for those who prefer a printed copy.

Benefits of introducing Digital Receipts 

  1. Saves cost - Switching from paper-based receipts to digital receipts saves millions of dollars. 
  2. Improves marketing - Because the digital footprint of each transaction is stored in our data warehouse, it can be used to study OnePass customer behavior, send personalized messages along with the receipt, and get to know our customers better. Digital receipts also carry links to Kmart websites, which can contribute to more customer conversions. 
  3. Easier storage - Paper receipts take up physical space, and customers can lose them or find them faded over time. Digital receipts solve these problems by storing all the information in the cloud, where both customers and retailers can access it at any time. 
  4. Free delivery - Customers can subscribe to OnePass to enjoy free delivery on thousands of eligible items at Kmart with no minimum spend. 
  5. Instant customer feedback - E-receipts make it easy to ask customers for feedback, which can improve customer retention. 

In this blog, we cover the high-level technical architecture and how it evolved.

The Problem:  

To implement these smart digital receipts for Kmart and provide our customers with a seamless shopping experience, we had to build a real-time data integration solution between Kmart and OnePass, and between Kmart and Slyp (the banking system that sends the digital receipt SMS to customers). And no data program is complete without data analytics, so we also had to make sure the data lands in our cloud data warehouse, where data analysts can crunch the numbers and provide meaningful insights related to the program.

Solution:  

Like any data integration project, the initial requirement was simple - send the transaction data from one system to another using synchronous communication (yes, APIs) - and so, start with the monolith. To put it simply, this is how it was supposed to look.

[Diagram: direct point-to-point API integration between source systems and consumers]

Sounds simple, right? Well, it turns out it wasn't that simple.

Each source system, store or online, generates data fields that are very specific to its channel. So, we had to transform this data in flight and convert the payload into the canonical model set up by OnePass and Slyp. Yes, there are two consumers. Perhaps you can already guess what comes next - wait for it 😉 (not mimicking Barney).

Also, a lot of descriptive information about the products being sold is not available in the source systems, which means that while transforming the data in flight we also had to enrich the payload.

So, it turns out we need a box in between to perform the tasks above. To put it in a diagram, this is how it would look - 

[Diagram: a transformation and enrichment layer sitting between the source systems and the consumers]

Seems sorted! Wait a minute... 

  • How do we enrich the data in flight?
  • What about the data warehouse we talked about earlier? So, another consumer?  

Hmm... Let’s revise this.    

We now introduce our data warehouse, which continuously receives reference data feeds from other systems and needs to be available in flight to enrich the payload. At the same time, each transaction needs to be recorded in the data warehouse for analytics. So, it is both a producer and a consumer.

[Diagram: the data warehouse joins the flow as both a producer and a consumer]

Ok. Let’s count the producers and consumers now.   

3 Producers:  

  • Store System 
  • Website or Online system
  • Snowflake  

3 Consumers:  

  • Slyp
  • OnePass 
  • Snowflake  

Wondering what could help here? Yes, you guessed it correctly earlier in this blog (if you read it!). Introducing Confluent Kafka. Confluent Kafka lets us do all of the above using several components of the Kafka stack.

In our solution, we used Confluent Kafka because it is a managed service. We didn't have to worry about cluster maintenance and uptime, and could focus more on creating solutions than on maintaining infrastructure.

Coming back to the solution, we had to   

  • Enrich the payload
  • Transform the payload 
  • Standardize the payload based on consumer  

To achieve all of this, we used a Kafka Streams application.

[Diagram: Kafka Streams application enriching, transforming and standardizing the payload]
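The production service is a Kafka Streams (JVM) application; as a language-agnostic illustration of the same consume-transform-produce idea, here is a minimal Python sketch using confluent-kafka. The topic names, field names and canonical mapping are hypothetical placeholders, not our actual schema.

```python
import json
from confluent_kafka import Consumer, Producer

consumer = Consumer({
    "bootstrap.servers": "broker:9092",       # placeholder cluster address
    "group.id": "receipt-enricher",
    "auto.offset.reset": "earliest",
})
producer = Producer({"bootstrap.servers": "broker:9092"})
consumer.subscribe(["raw-transactions"])      # channel-specific payloads land here

def to_canonical(txn: dict) -> dict:
    """Map channel-specific fields into an illustrative canonical receipt model."""
    return {
        "receiptId": txn.get("transaction_id") or txn.get("order_id"),
        "channel": txn.get("channel", "store"),
        "lines": txn.get("items", []),
    }

while True:
    msg = consumer.poll(1.0)
    if msg is None or msg.error():
        continue
    canonical = to_canonical(json.loads(msg.value()))
    producer.produce("canonical-receipts", json.dumps(canonical).encode("utf-8"))
    producer.flush()
```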

Cool, seems sorted. Well, not yet! There are a few things we still need to handle - 

  1. How would the Store and Website send data to Kafka securely using APIs?  
  2. How do we send and receive data from Snowflake? 
  3. How will data be sent to external systems with fault tolerance?   

Let’s tackle each of these one at a time! 

How would the Store and Website send data to Kafka securely using APIs?  

Here we used 2 components:

  • Kafka REST Proxy - to receive requests from the API gateway and relay them to Kafka.  
  • AWS API Gateway - to manage our APIs securely.   

Kafka REST Proxy 

The Kafka REST Proxy allows us to use a RESTful interface to publish messages to the Kafka cluster.

Instead of writing custom producer applications, the REST Proxy allows clients to connect seamlessly to the Kafka cluster to publish or even consume messages.

This Kafka REST Proxy is a Confluent component and is hosted in a Docker container on AWS EKS.
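As an illustration only, here is roughly what publishing a transaction through the REST Proxy's v2 API looks like; the host, topic name and payload fields are placeholders, and in our setup the call actually travels through the private API Gateway described next.

```python
import requests

REST_PROXY_URL = "https://kafka-rest.internal.example.com"   # placeholder host

# One or more records wrapped in the REST Proxy's JSON embedded format.
payload = {
    "records": [
        {"value": {"transactionId": "12345", "channel": "store", "total": 42.50}}
    ]
}

resp = requests.post(
    f"{REST_PROXY_URL}/topics/raw-transactions",              # placeholder topic
    json=payload,
    headers={"Content-Type": "application/vnd.kafka.json.v2+json"},
    timeout=5,
)
resp.raise_for_status()
print(resp.json())   # includes the partition and offset for each record written
```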

However, the REST Proxy does not offer an advanced security posture on its own, so we use API Gateway to manage our APIs securely.

AWS API Gateway 

  • We set up a private API Gateway (a service offered by AWS) and make it available to our producers, which also happen to be co-located in AWS, to send their payloads.  
  • This is coupled with AWS WAF (Web Application Firewall), which allows us to secure our infrastructure by whitelisting only known sources, i.e., the store systems and websites. No other source system or application in the organization can send data to Kafka unless it is whitelisted.  
  • Further, we make use of token-based Lambda authorizers, which verify the caller's identity through the bearer token sent with the request (see the sketch after this list).  
  • This is connected to Azure AD, where tokens are generated; the tokens are then stored in AWS Secrets Manager and expire every hour. The client applications fetch the tokens from Secrets Manager and cache them for the hour.  
  • These tokens are automatically rotated every hour to keep the communication fully secure.  
  • Needless to mention, all API calls use SSL encryption, so data in transit is encrypted and secure.  
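A minimal sketch of what the token-based Lambda authorizer could look like, assuming the issuer check shown here; the tenant, principal handling and (crucially) signature verification are simplified placeholders rather than our production logic.

```python
import jwt  # PyJWT

AZURE_AD_ISSUER = "https://login.microsoftonline.com/<tenant-id>/v2.0"   # placeholder

def handler(event, context):
    """Token-based authorizer: inspect the bearer token and return an IAM policy."""
    token = event.get("authorizationToken", "").removeprefix("Bearer ")
    claims = {}
    try:
        # Sketch only: a real authorizer verifies the signature against Azure AD's keys.
        claims = jwt.decode(token, options={"verify_signature": False})
        effect = "Allow" if claims.get("iss") == AZURE_AD_ISSUER else "Deny"
    except jwt.PyJWTError:
        effect = "Deny"
    return {
        "principalId": claims.get("sub", "unknown"),
        "policyDocument": {
            "Version": "2012-10-17",
            "Statement": [{
                "Action": "execute-api:Invoke",
                "Effect": effect,
                "Resource": event["methodArn"],
            }],
        },
    }
```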

A diagram of this step -

[Diagram: producers sending payloads through AWS API Gateway, WAF and the Kafka REST Proxy into Kafka]

How do we send and receive data from Snowflake?  

To handle data exchange to and from Snowflake, we make use of the following components.

Send data to Snowflake -

  • We make use of the Kafka Connect framework here. Confluent offers fully managed Snowflake sink connectors, which allow Kafka messages to be sent to Snowflake. These connectors are configured with password rotation policies to ensure no other process gets access to write data to Snowflake.  
  • Once data is written to a VARIANT column in Snowflake, a series of Airflow tasks in a DAG are executed to convert this data from VARIANT to a tabular structure and load it into a data mart for analysis and reporting purposes (see the sketch after this list). 
  • For analysis, we write both the raw payload and the final payload to Snowflake.  
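For illustration, here is a sketch of what one of those Airflow tasks could look like: a SnowflakeOperator that flattens the VARIANT column into tabular rows. The DAG id, schedule, table, column and connection names are all hypothetical.

```python
from datetime import datetime

from airflow import DAG
from airflow.providers.snowflake.operators.snowflake import SnowflakeOperator

with DAG(
    dag_id="receipts_variant_to_tabular",        # placeholder DAG id
    start_date=datetime(2023, 1, 1),
    schedule_interval="*/15 * * * *",            # illustrative cadence
    catchup=False,
) as dag:
    flatten_receipts = SnowflakeOperator(
        task_id="flatten_receipts",
        snowflake_conn_id="snowflake_default",   # placeholder connection
        sql="""
            -- Explode the VARIANT payload into one row per receipt line (placeholder tables).
            INSERT INTO analytics.receipt_lines (receipt_id, sku, price)
            SELECT r.raw:receiptId::string,
                   l.value:sku::string,
                   l.value:price::number(10, 2)
            FROM raw.receipts r,
                 LATERAL FLATTEN(input => r.raw:lines) l;
        """,
    )
```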

Receive data from Snowflake -

[Diagram: the enrichment service reading reference data from a Snowflake view]

This is handled by the enriching microservice, a Kafka Streams application that queries a Snowflake view to enrich the payload and writes it back to a Kafka topic.
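A rough sketch of that enrichment lookup, assuming a hypothetical product reference view and the snowflake-connector-python client; the view, column and credential names are placeholders, and in production this logic lives inside the streaming application.

```python
import snowflake.connector

def fetch_product_details(skus: list[str]) -> dict[str, dict]:
    """Look up descriptive product attributes for the SKUs on a receipt (placeholder view)."""
    if not skus:
        return {}
    conn = snowflake.connector.connect(
        account="xy12345.ap-southeast-2",   # placeholder credentials
        user="enrichment_svc",
        password="********",
        warehouse="ENRICH_WH",
    )
    try:
        placeholders = ", ".join(["%s"] * len(skus))
        cur = conn.cursor()
        cur.execute(
            f"SELECT sku, description, department "
            f"FROM reference.v_product_details WHERE sku IN ({placeholders})",
            skus,
        )
        return {sku: {"description": desc, "department": dept} for sku, desc, dept in cur}
    finally:
        conn.close()

def enrich(receipt: dict) -> dict:
    """Merge the reference attributes into each receipt line before it is re-published."""
    details = fetch_product_details([line["sku"] for line in receipt.get("lines", [])])
    for line in receipt.get("lines", []):
        line.update(details.get(line["sku"], {}))
    return receipt
```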

 

Time to tackle the third question.

How will data be sent to external systems with fault tolerance? 

So far, we had been playing on our own ground; now it was time to send the prepared payload outside the perimeter.

  • To send the data to external services, we developed a microservice per external consumer endpoint (Slyp and OnePass).  
  • Each is a Kafka consumer application that reads data from a specific topic and sends it to the external endpoint. 
  • The service uses OAuth, attaching a JWT token to each request for authentication. 
  • The tokens are generated and shared back with us hourly through another API at our end. Without going too deep into this, the token gets written to AWS Secrets Manager and our microservice picks it up from there to attach to the request, similar to what we explained earlier. A sketch of this consumer follows below.   
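A simplified sketch of one of these outbound consumers, assuming the confluent-kafka client, boto3 for Secrets Manager, and hypothetical topic, secret and endpoint names; error handling and batching are pared back to the essentials.

```python
import boto3
import requests
from confluent_kafka import Consumer, Producer

secrets = boto3.client("secretsmanager")
consumer = Consumer({
    "bootstrap.servers": "broker:9092",       # placeholder cluster address
    "group.id": "receipt-dispatcher",
    "enable.auto.commit": False,
})
retry_producer = Producer({"bootstrap.servers": "broker:9092"})
consumer.subscribe(["canonical-receipts"])    # placeholder topic

def bearer_token() -> str:
    # The token is refreshed hourly by a separate process; we only read it here.
    return secrets.get_secret_value(SecretId="partner/api-token")["SecretString"]

while True:
    msg = consumer.poll(1.0)
    if msg is None or msg.error():
        continue
    try:
        resp = requests.post(
            "https://api.partner.example.com/receipts",   # placeholder endpoint
            data=msg.value(),
            headers={
                "Authorization": f"Bearer {bearer_token()}",
                "Content-Type": "application/json",
            },
            timeout=10,
        )
        resp.raise_for_status()
    except requests.RequestException:
        # Failed deliveries go to a retry topic (see the circuit breaker below).
        retry_producer.produce("canonical-receipts-retry", msg.value())
        retry_producer.flush()
    consumer.commit(message=msg)
```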

It makes sense and seems good. But a few pertinent questions remain - 

Our feed is real-time; what if the external service goes down (for whatever reason)? What happens to our feed? Should we stall the process or keep sending messages?

Here is what we did...  

We introduced a circuit breaker pattern.   

  • We developed a microservice that keeps polling the external endpoints for availability (a sketch follows after this list).  
  • This service keeps checking the heartbeat of the external services.  
  • If the microservice receives a heartbeat, it does nothing. Hunky dory, yeah!
  • But if a heartbeat is not received, even after three exponential backoff retries, this microservice suspends the other microservice, which is reading data from Kafka and sending it to the external endpoints. 
[Diagram: circuit breaker suspending the outbound consumer when the external service is unavailable]
  • If the external service fails while we are sending data and the circuit breaker has not yet suspended the microservice, the failed message is written back to Kafka on a retry topic.  
  • Since these are consumer applications, they retain the last processed offset and hold it until the service resumes.  
  • The heartbeat service also alerts the external systems and internal teams that the services are down and raises incidents automatically.  
  • Once the external service is back online, the heartbeat service detects it and re-activates the suspended microservice, which starts with any messages in the retry topic and then the main topic.  
  • Once the external services receive the payload, they generate the formatted receipt, which appears in the OnePass application, or send an SMS to the customer with a link to the HTML receipt.  
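A minimal sketch of the heartbeat poller behind the circuit breaker, assuming a hypothetical health endpoint and simple suspend/resume hooks on the dispatcher; the retry counts, intervals and alerting are illustrative, not our production values.

```python
import time
import requests

HEALTH_URL = "https://api.partner.example.com/health"   # placeholder endpoint
MAX_RETRIES = 3

def raise_incident() -> None:
    """Placeholder for automatic alerting / incident creation."""
    print("external service down: raising incident")

def healthy() -> bool:
    """Probe the external heartbeat with exponential backoff before giving up."""
    for attempt in range(MAX_RETRIES):
        try:
            if requests.get(HEALTH_URL, timeout=5).status_code == 200:
                return True
        except requests.RequestException:
            pass
        time.sleep(2 ** attempt)        # exponential backoff: 1s, 2s, 4s
    return False

def run(dispatcher) -> None:
    """Open the circuit (suspend the dispatcher) on failure, close it on recovery."""
    circuit_open = False
    while True:
        if healthy():
            if circuit_open:
                dispatcher.resume()     # close the circuit; retry topic is drained first
                circuit_open = False
        elif not circuit_open:
            dispatcher.suspend()        # open the circuit; consumer offsets are retained
            raise_incident()
            circuit_open = True
        time.sleep(30)                  # poll interval (illustrative)
```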

Putting it all together -

[Diagram: end-to-end architecture for smart digital receipts]

Some Stats:  

  • Overall latency - 10 seconds. The customer receives the digital receipt in less than 10 seconds. 
  • Throughput - 1,000 messages/sec. This architecture can process up to 1,000 messages per second; however, transit, network bandwidth and token exchanges take up to 10 seconds to deliver the message to the customer.  

This is how we achieved secure, real-time data integration to generate smart digital receipts for our customers. It wasn't easy to develop such a complicated architecture and have it running in production. Kudos to the Anko team who worked on it day and night to achieve such a masterpiece. This is what we call Legend (wait for it) Archy - LegendArchy 😎

Authors:  

Vivek Sinha - Manager Data Engineering, Enterprise Technology

Jimmy Lamba - Manager Data Engineering, Enterprise Technology

