Categories
Digital Analytics

Mobile Applications’ Analytics

The increasing penetration of smartphones, along with better internet connectivity and reach, has led to more businesses looking at mobile apps to engage with their customers. Mobile apps enable marketers to reach potential customers at the right time, at the right place and – if done correctly – with the right offer.

As with every opportunity comes a host of challenges – creating an engaging mobile application, driving app installs, ensuring engagement and regular app usage, and estimating whether the focus and expenses incurred are worth the investment. Collecting user-app interaction data is vital to the success of a mobile strategy. Analytics can be used effectively to measure the performance of a mobile strategy and take the pulse of the customers.

How to track mobile app analytics?

There are a host of tools out there; we prefer Mixpanel, Firebase and iOS App Analytics.

Basic analytics for mobile applications

a. Download and Install statistics: Allows businesses to measure the growth of user base, types of devices, location and application ratings

b. Usage statistics: Measures the engagement levels, usefulness and the ability of the app to keep users engaged

c. Purchase Statistics: A direct measure of how many purchases, add-to-carts, abandonments etc. are being driven through the app

d. Crash and Application not responding (ANR) statistics: Indicates the performance of the application across devices and mobile operating systems

Mobile app data can provide businesses with a lot of information about users and app usage. These data, when converted into insights, can help businesses design:

1. Ads to drive awareness and app downloads

2. Push notifications on searched/related products

3. Recommendation of new or unseen products

4. Personalized promotions

We present a few KPIs that should be tracked at a minimum to allow businesses to generate actionable insights for better business outcomes:

1. Conversion* rate: Allows businesses to measure the fraction of users who are getting converted through the mobile app

*Conversion is defined as an intended outcome and not necessarily a signup or purchase

2. Traffic Heatmaps: Measures the traffic on each screen and the time spent by each user on a screen

3. Crash & ANR: Measures the crash and application not responding (ANR) statistics

4. Users by App version: Provides statistics on the number of users by app version, indicating user lifecycle, engagement levels and a few other proxies

5. Acquisition source: Helps a business monitor the source of downloads and the app download conversion rates

6. Retention: Allows businesses to measure retention rates week on week, and can help distinguish seasonality, holiday behavior etc. (a minimal retention calculation is sketched after this list)

7. Demographics: Data on location, device, demographics and interests of the app user

8. Revenue: Revenue being generated by a mobile application

9. Event tracking: Data on automatically collected and custom events, the number of times each event has occurred and by how many users
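Retention and most of the usage KPIs above are ultimately derived from the raw event log. As a rough illustration, here is a minimal Python (pandas) sketch of a week-on-week retention calculation; the event log, column names and figures are illustrative assumptions rather than any specific analytics tool's schema.

import pandas as pd

# Illustrative event log: one row per app event (column names are assumptions)
events = pd.DataFrame({
    "user_id": [1, 1, 2, 2, 3, 1, 3],
    "event_time": pd.to_datetime([
        "2024-01-01", "2024-01-08", "2024-01-02",
        "2024-01-15", "2024-01-03", "2024-01-22", "2024-01-10",
    ]),
})

# Each user's first-seen date defines their weekly cohort
first_seen = events.groupby("user_id")["event_time"].transform("min")
events["cohort"] = first_seen.dt.to_period("W").astype(str)
events["weeks_since"] = (events["event_time"] - first_seen).dt.days // 7

# Retention: share of each cohort active again N weeks after first use
cohort_size = events.groupby("cohort")["user_id"].nunique()
active = events.groupby(["cohort", "weeks_since"])["user_id"].nunique()
retention = active.div(cohort_size, level="cohort").unstack(fill_value=0)
print(retention.round(2))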

Businesses should make use of data and mobile app analytics to generate insights about the mobile app usage. This can help them come up with new strategies to optimize the funnel for conversion and business growth.

Categories
IoT Analytics

Industry 4.0 – Quo vadis?

Industry 4.0 is the newest phase of the Industrial Revolution, brought about by faster wireless communications, real-time data streams, machine learning and AI-driven automated decision systems, and inter-connected devices and machines. It is also referred to as Industrial IoT or smart manufacturing. Industry 4.0 is revolutionizing decision making through real-time insights across production, demand forecasting, supply chain management and inventory management.

Evolution of Industry from 1.0 to 4.0

Manufacturing has evolved over the last three centuries, and there have been four distinct phases of industrial transformation:

1. Industry 1.0: The invention of steam engines, and the evolution of manufacturing from manual labour to optimized machines driven by steam-powered engines. Happened in the late 1700s and early 1800s

2. Industry 2.0: Brought about by the introduction of steel and the use of electricity in manufacturing. Mass production concepts like the assembly line were formulated. Happened in the early 1900s

3. Industry 3.0: Began with the introduction of computers into the manufacturing process. Focus shifted from analog to digital technologies. Started in the late 1950s

4. Industry 4.0: Started with the use of inter-connected machines (the Internet of Things, or IoT). Has allowed connecting physical and digital machines. Started in the 2010s

What is changing OR going to change in Industry 4.0?

Industry 4.0 is changing the manufacturing processes. We present a few use cases here:

1. Supply Chain Optimization: Manufacturers have the ability to integrate their manufacturing process with the supply chain platform. This provides them with real-time insights into market trends and demand, helping them deliver products faster and cheaper

2. Asset Monitoring: With enhanced tracking and monitoring capabilities using wireless connectivity, video based intelligent feeds and location based real-time data, manufacturers can identify supply chain issues and asset quality risks to optimize asset transfer, disposal and adjustments in real-time

3. Predictive Maintenance: Every manufacturer relies on machines, and the machines require regular maintenance to work efficiently. The different maintenance strategies are:

a. Reactive Maintenance: Repairs and maintenance are carried out after a machine fails

b. Preventive Maintenance: Since failures are costly, maintenance is done at regular intervals to prevent them

c. Predictive Maintenance: Condition-based monitoring of operational parameters to predict failures, with maintenance carried out just before a failure is predicted to happen (a minimal condition-monitoring sketch follows this list)
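As a rough illustration of the condition-based idea, here is a minimal Python (pandas) sketch that flags a machine for maintenance when a smoothed sensor reading crosses a control limit; the readings, column names and threshold are illustrative assumptions, and production systems would typically learn failure patterns from many signals.

import pandas as pd

# Illustrative daily vibration readings for one machine (values are assumptions)
readings = pd.DataFrame({
    "timestamp": pd.date_range("2024-01-01", periods=12, freq="D"),
    "vibration_mm_s": [2.1, 2.0, 2.2, 2.1, 2.3, 2.6, 2.9, 3.4, 3.8, 4.1, 4.6, 5.0],
})

# Condition-based rule: smooth the signal, then compare against a control limit
readings["rolling_vibration"] = readings["vibration_mm_s"].rolling(window=3).mean()
CONTROL_LIMIT = 3.5  # assumed threshold derived from historical failure data

alerts = readings[readings["rolling_vibration"] > CONTROL_LIMIT]
if not alerts.empty:
    print("Schedule maintenance before", alerts["timestamp"].iloc[0].date())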

Benefits of Industry 4.0

We list some (not all) benefits of adopting Industry 4.0 capabilities to your business:

1. Gives you the competitive edge

2. Increases collaboration and promotes shared decision making

3. Predictive analytics using real-time data reduces inefficient preventive maintenance costs and downtime

There are many other benefits, most of which fall outside our skillset.

To outperform in today’s business environment, manufacturers need to embrace Industry 4.0 or risk falling behind.

Categories
Video Analytics

The promise of AI for Retail

Inertia is difficult to overcome, and today's retailers are being challenged to change and accommodate the evolving expectations of shoppers. Shoppers are being offered the convenience of online shopping, personalized experiences and an unlimited choice of merchandise through digital channels. It is therefore an existential need for retailers to enhance the shopper experience.

The ecommerce share of retail sales is about 15% today (2020), though the share has been increasing steadily over the years. It is, therefore, still possible for retailers to reverse the trend and recover some lost ground. The rapidly advancing landscape of AI and machine learning has the potential to help retailers improve the shopper experience.

Finding a way to beat digital convenience

It is obvious that purchasing an item online is far more convenient than driving to a retail store and making a purchase. Retailers need to understand that they do not need to beat e-commerce; they need to find ways to make the shopper experience enriching enough that the shopper is willing to drive to their store. For retailers, AI and machine learning driven enhancement of the shopper experience promises to bridge the gap between digital and physical stores.

Smart Mirrors: Smart mirror technologies enable retailers to collect data on the products being tried on and requests for different sizes or colors. They can also help retailers identify product pairings for promotions and bundling. The shopper's browsing details can be merged with purchase data to build a better sense of the shopper's preferences.

Personalization: Retail spaces could use the biometric recognition capabilities of AI to build shopper profiles and deliver customized rewards and promotions – creating a personalized experience for the shopper. At Xtage Labs, we have created a basic proof of concept to show how AI could be utilized to identify the shopper and combine CRM, purchase database and loyalty card details to create custom promotions for each shopper.

An important consideration, though, is retailers' ability to secure sensitive biometric data; privacy concerns may prove counterproductive and discourage shoppers.

Extended Reality (XR): Extended reality (AR, VR and MR) promises to offer retailers the capability to provide unique shopping experiences that could change the face of retail. From browsing different variations of a product, such as color or size, to virtually “trying them on” or comparing two items side by side – XR promises to substantially improve the shopping experience.

Real-time Stock Monitoring: Retailers could use camera feeds to monitor the sales velocity of their merchandise. Images of different sections of stocking units can be processed to generate counts in real time. This will help retailers understand product demand and anticipate shopper needs better.

Store layout as a Recommendation Engine: It is a well-publicized fact that 1 in 3 purchases on Amazon are driven by recommendation systems built on deep learning algorithms. Retailers could use machine learning to identify products that are frequently bought together and optimize the store layout so that such items sit next to each other, as sketched below.
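As a rough illustration of the 'bought together' idea (a simple co-occurrence count, not Amazon's deep learning recommender), here is a minimal Python sketch over transaction baskets; the baskets are illustrative assumptions.

from collections import Counter
from itertools import combinations

# Illustrative transaction baskets (assumed data)
baskets = [
    {"jeans", "belt", "t-shirt"},
    {"jeans", "belt"},
    {"t-shirt", "socks"},
    {"jeans", "t-shirt", "socks"},
    {"belt", "jeans", "socks"},
]

# Count how often each pair of products appears in the same basket
pair_counts = Counter()
for basket in baskets:
    for pair in combinations(sorted(basket), 2):
        pair_counts[pair] += 1

# The most frequent pairs are candidates for adjacent placement in the store
for pair, count in pair_counts.most_common(3):
    print(f"{pair[0]} + {pair[1]}: bought together in {count} baskets")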

The future of retail is going to be more personalized. It is also important for retailers to understand that ecommerce is here to stay, and they need to find a way to enhance the shopper experience. Retail stores should provide experiences, enabled through AI and machine learning, that excite customers and make them want to drive to the store.

Categories
Video Analytics

Profiling of shoppers using video feeds in an apparel retail store

The retail store chain client is facing challenges from e-commerce and other competitors. They are looking to take on the challenge and realise the importance of customer and shopper data. They know that e-commerce players have access to browsing and other data that helps them engage with shoppers better. They want to set up a system that can help them capture similar (if not the same) data from an offline retail perspective, enabling better decisions.

Dual use of video feeds to generate shopper insights

The client already had high-end CCTV cameras installed in their outlet for security purposes. They wanted to utilize this infrastructure to generate shopper insights.

The majority of shoppers are in the age range of 30-40 years

27% of the visitors are accompanied by companions or family

82% of shoppers interact with some product but do not make a purchase

The study revealed valuable insights on shopper behaviour inside the store and some additional insights on the shopper profiles that were previously unknown.

High cost of training. Data security risks

The retailer knows the importance of shopper data and the insights it can provide. They also know that in order to counter the personalization capabilities of e-commerce platforms, they need to acquire additional data to build a similar solution. And given the offline selling model, they had leverage that e-commerce platforms don't have – the opportunity to interact with shoppers and understand their motivations, preferences and more.

The conceptualised solution had its share of challenges though. The most pressing was data security, owing to privacy guidelines from regulatory agencies prohibiting the storage of shopper biometrics. The second issue was with shoppers themselves, who have a negative perception of CCTV-based intelligence. And lastly, there was the risk of the solution being misused in the event of a security breach.

Useful, but resource intensive

The piloted solution provided some valuable insights that are otherwise not available to the retailer.

For the first time, they knew the demographics of their shoppers through their own data sources. The other benefit was knowing the products/items that shoppers interact with most often.

However, there are some valid concerns on the commercial deployment:

a. The high cost (data infrastructure) of processing high quality video feeds as a data stream

b. Genuine concerns around regulatory compliance, data security protocols and biometric data usage

The client is currently assessing the risks vs. benefits and the legal implications of commercial deployment.

Categories
Text Analytics

The right chatbot for your business

To achieve success with chatbots, businesses need to know what they are looking for and be aware of the options available to them. Given the AI hype, the ideal chatbot should be AI powered, able to converse with the users in free-flowing text format. But do you even need a complex AI bot for your business?

It is much better to introduce a simpler bot first, and evolve the bot from there. The bots can be classified based on the underlying conversation engine:

1. FAQ chatbots: The simplest form of chatbot. The backend is fed with standard questions that a user can ask, and the responses to those questions. These bots do not store conversation history and work on the principle of answer-and-forget, i.e. the bot will not have the context of the conversation if the user asks a second, related question. The bot is hardcoded, uses no AI and can be deployed very quickly. This type of bot is ideal for answering frequent inquiries (a minimal sketch of such a bot appears after this list)

2. Menu based chatbots: This is another simple form of chatbot, where you define a scenario tree and build responses into the bot, much like an automated IVR. The bot does not use any form of AI and is simple to build. This type of bot is ideal for tasks that are straightforward but involve multiple steps

3. Conversational Chatbots: These are the most advanced bots, and are also referred to as virtual assistants. They are designed to be interactive and to handle open-ended, free-flowing inputs from users. If built right, these bots can learn the context and intent behind a query, and have the potential to minimize the need for human customer support staff. These bots use machine learning and AI algorithms at the backend, learn from conversation data and improve their query resolution success rate over time.

4. Voice based Chatbots: Voice based bots are an extension of conversational bots, utilizing the most natural form of conversation, i.e. voice. Their progress is driven by advances in speech recognition and linguistic technologies within AI, which remain active research areas. Success also depends on the languages of the geographic area a business operates in
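To make the simplest option concrete, here is a minimal Python sketch of an FAQ-style bot that matches a user's question to hard-coded answers using TF-IDF similarity from scikit-learn; the FAQ pairs, threshold and fallback message are illustrative assumptions.

from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

# Hard-coded FAQ pairs (illustrative content)
faqs = {
    "What are your business hours?": "We are open 9am-6pm, Monday to Saturday.",
    "How do I track my order?": "Use the tracking link sent to your email after dispatch.",
    "What is your return policy?": "Items can be returned within 30 days of delivery.",
}

questions = list(faqs.keys())
vectorizer = TfidfVectorizer().fit(questions)
question_vectors = vectorizer.transform(questions)

def answer(user_query, threshold=0.3):
    """Return the canned answer whose FAQ question best matches the query."""
    scores = cosine_similarity(vectorizer.transform([user_query]), question_vectors)[0]
    best = scores.argmax()
    if scores[best] < threshold:
        return "Sorry, I don't know that one yet. Let me connect you to support."
    return faqs[questions[best]]

print(answer("how can I track my order status?"))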

No matter the industry you operate in, you could deploy a chatbot that automates some part of your business. It is however important to know your requirements and assess the return on investment (RoI) that you could generate from the solution. Advanced AI based bots take more time to develop and require significant investment.

Benefits of deploying chatbots:

1. Reduce customer wait time

2. 24×7 availability

3. Standardized, quality controlled engagement with customers

4. No limitations on query volumes and highly scalable

5. Improved return on investment through reduced spend on hiring, training and retaining support staff

At Xtage Labs, we have developed a set of guidelines to help you choose the right chatbot solution:

1. Identify your business requirements

2. Build the bot across your simplest use cases

3. Measure the key metrics like customer satisfaction, resolution rate etc.

4. Measure the bot return on investment (RoI) and assess if an advanced conversational chatbot or voice based chatbot makes sense

Building the first chatbot for your business and testing how your customers respond does not require a huge investment or time.

Chatbots are here to stay, and an early start can provide you with the competitive advantage. It is however important to understand that a chatbot is like your website or a mobile app – and you will need to keep enhancing the customer experience through the chatbot for continued success.

Categories
Data Science

Challenges in Data Science success

Modern businesses are collecting data at an unprecedented scale, and they are realizing the benefits of processing this data and generating insights. With the current hype surrounding data science and the bountiful returns being promised, advanced analytics and machine learning also run the risk of being misunderstood. Businesses should know a few basic things about data science that determine the success or failure of any data science initiative.

Data Science workflow

The flowchart below shows the typical steps in implementing a data science project:

One of the most overlooked facts is that typically 80% of the effort is focused on getting the right data and pre-processing it. The pie-chart below breaks down the time taken to build a data science solution:

Some key focus areas to achieve success in a data science initiative are:

1. Data Collection: Data collection is usually done by the IT team. Their usual focus is on standard KPIs, privacy, security and minimizing cost. In most cases, the IT team is not the end user of the data and may miss the bigger picture – why the data is being collected and what data needs to be captured. They may not be capturing the right or complete data, leading to challenges in utilizing it later for insights and business decisions.

2. Identifying the business problem: “Well begun is half done” is true for data science initiatives too. A data science initiative needs to solve a single, well-defined problem. Once the problem has been defined, the business needs to define the frequency and granularity of the output

3. Right data in the right format: Businesses generate a lot of data, and it is important to know the right data to use for a business problem. Collect too much data, and the cost of collection and storage may strain your finances. Collect too little data, and the business may not be able to generate the right insights. It is therefore important to know exactly what data needs to be collected

4. Integrating data science output into the business process: One of the major obstacles to the successful implementation of a data science initiative is getting the output in the correct form. Often, data science outputs are used by business teams with little or no knowledge of the underlying algorithms and assumptions. Business teams cannot be expected to reach out to the data scientists every time a decision needs to be made. It is up to data scientists to provide the model output in a format, and through an application, that enables business teams to use it in their decision-making processes

5. Measuring return on investment (RoI): Businesses want to quantify the benefits of data science in dollar terms. It is, however, difficult to assign a dollar figure to a model's decisions. Measuring the impact of a data science initiative is complex and requires a well-planned approach to measuring the benefits – both tangible and intangible

6. Selling Data Science: The data science team has a supporting role, providing decision makers with the right insights when they need them. It is therefore important to highlight how the team is driving success for the organization. Note that data science is a difficult concept to grasp, and decision makers generally do not care about which model is being used. It is therefore important to quantify the benefits in a language that business leaders understand – usually dollars saved or additional revenue generated. The data science team needs to plan how it will communicate these benefits.

It is important to note that we have covered the major challenges commonly observed across organizations; this does not mean a business will face only these challenges. It is the responsibility of the data science team to communicate the challenges, manage the expectations of decision makers and keep them realistic.

Categories
Digital Analytics

How to perform A/B Tests for Digital Assets?

In our last blog, we presented the scope of A/B Testing, what it is and why digital platforms should focus on A/B Testing. In this blog, we will talk about how to perform A/B Tests.

How to perform an A/B Test?

A/B Testing provides a systematic way to find out what is working on a digital platform and what isn't. Driving traffic to a digital platform is hard enough, so providing the best digital experience to maximize the chances of conversion is of paramount importance. A/B Testing allows digital platforms to maximize conversions and identify the issues hindering them. It is important for digital platforms to create a structured and continuous A/B Testing plan, and not look at A/B Tests as a one-time activity. The steps in A/B Testing are:

1. Research: Before initiating an A/B Test, it is important to create benchmarks, i.e. measure how the platform is performing currently – data points such as user visits, most visited pages and the conversion goals of each page. Clicks and browsing behavior captured using standard heatmap tools can provide insights into the time spent on different sections of a page

2. Hypothesis Design: The next step is to define a hypothesis aimed at increasing conversion. A sample hypothesis could be: a title containing the product USP (unique selling point) leads to increased clicks on the ‘View Details’ button

3. A/B Test Cases: The two versions of the webpage – based on the hypothesis in the previous step – need to be created. Continuing with the previous example, the page with the existing product title is the ‘control’ and the page whose title carries the product USP is the ‘variation’

4. Run Tests: Launch the test and let visitors generate sufficient data to help you arrive at statistically significant results (a minimal significance check is sketched after this list).

There are primarily four types of testing: A/B Testing, Split Testing, Multivariate Testing and Multipage Testing. You need to identify the right test based on your experiment goals.

5. Analysis & Conclusion: Once sufficient data has been generated, analysing the results allows you to arrive at a data-driven conclusion, i.e. which variation is better. The test may also turn out inconclusive, in which case you should learn from it and implement changes so that subsequent test(s) can provide clear winners
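To make steps 4 and 5 concrete, here is a minimal Python sketch of a two-proportion z-test comparing the conversion rates of the control and the variation; the visitor and conversion counts are illustrative assumptions.

from math import sqrt
from scipy.stats import norm

# Illustrative results: visitors and conversions per version (assumed numbers)
control_visitors, control_conversions = 5000, 400      # 8.0% conversion
variation_visitors, variation_conversions = 5000, 455  # 9.1% conversion

p1 = control_conversions / control_visitors
p2 = variation_conversions / variation_visitors
pooled = (control_conversions + variation_conversions) / (control_visitors + variation_visitors)

# Two-proportion z-test (two-sided)
se = sqrt(pooled * (1 - pooled) * (1 / control_visitors + 1 / variation_visitors))
z = (p2 - p1) / se
p_value = 2 * (1 - norm.cdf(abs(z)))

print(f"uplift = {p2 - p1:.3%}, z = {z:.2f}, p-value = {p_value:.4f}")
print("Significant at the 5% level" if p_value < 0.05 else "Inconclusive - keep the test running")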

Mistakes to avoid while A/B Testing

1. Invalid or poor formulation of the hypothesis: An A/B Test case can only be created against a hypothesis you want to test. A poorly or wrongly formulated hypothesis will take you nowhere

2. Testing multiple elements in one A/B Test: An A/B Test is best used to test one variation at a time – so that the difference can be measured cleanly. Too many variations may lead to ambiguous results

3. Not measuring statistical significance: The difference between two versions should be statistically significant to make the conclusions actionable

4. Unbalanced Traffic: Traffic should be balanced and not biased towards a certain type of visitors

5. Running a test for insufficient duration: Running an A/B Test for too short a time may result in statistically insignificant conclusions

6. Ignoring external factors: A/B Tests should be avoided on days with unusually high traffic, holidays etc. to avoid skewed results

A/B Tests, if used well, can significantly improve the return on investment (RoI) of marketing campaigns. They help digital platform owners identify existing problems, address them and move towards the desired conversion goals.

Categories
Non-Profits

Quality assessment of ~120,000 public schools

The client is a UK-based non-profit agency working with a state education department in central India. They had designed a program to assess the quality of ~120,000 public schools based on specific parameters. They were looking for a technology-driven solution that would allow them to manage the flow of information from distant, rural areas to the state headquarters on a seamless, near real-time basis. They also wanted the generation of insights and reports to be automated.

Technology to the rescue

The non-profit was grappling with a lot of challenges owing to the scale and the landscape of the program.

Offline data entry for usage of the app at locations with no internet connectivity

Bi-lingual platform for a higher uptake by school staff in rural areas

Shift from paper based entry, collection, reporting enabling real-time insights

There were a couple of other challenges – designing a web application intuitive enough for first-time users, and a password-less (but secure) login system

Large program with challenging infrastructure

The non-profit had signed an MoU with the state education department in central India to assess all public schools on quality, and to provide recommendations from the cluster level up to the state level to improve school quality. They had developed a survey focused on eight different parameters. They had planned a paper-based survey, but soon realised the logistical challenges and cost implications of the entire exercise.

There were multiple challenges in executing the program even through a tech-enabled solution. Internet connectivity in remote and rural areas was a challenge. Then there was the issue of English language competency. The biggest challenge of all was that most users were not familiar with using a laptop or a smartphone. As a consequence, a standard password-based login was not effective either.

Program Intelligence at scale

The school quality assessment program was a program intelligence requirement that pushed the limits of the available technology.

Once functional, numerous benefits were driven through the application:

a. Allowed results to be reflected in real-time

b. This data was then collated and insights were generated automatically through the system at the desired administrative level

c. Allowed for fast, few-clicks comparison of one admin level against another

d. Served as a repository for all surveys completed across regions – without the need to preserve paper based records.

Categories
Customer Analytics

Unit Economics, CAC & LTV estimations for profitability

The used car marketplace startup wanted to understand the direct revenues and costs associated with the basic units of their business model, i.e. a buyer and a seller of used cars. As a startup, they needed to understand the sustainability of their business model and whether tweaks were required. They also wanted to understand the most (as well as the least) profitable segments, and use this analysis to focus on the right customers to move towards profitability.

To the point cash flow analysis

The customer insights study was focused on identifying the key units – in this case the buyers and sellers – and measuring the business performance metrics associated with these units.

Calculated metrics of retention, transactions and fees at unit level

Measured the customer acquisition cost (CAC) for retained vs new customers

Performed latent class segmentation & identified customer groups

The study helped the startup identify the value of offers they could make to their customers – both buyers and sellers – and the segments that would turn profitable over their lifetime.

Customer insights for profitability & cash management

The startup had raised significant capital in their first round of funding, which was focused on onboarding as many buyers and sellers as possible to their marketplace. Having achieved the desired growth, their focus turned towards sustainability and scaling up the business model. Direct revenues and costs at the unit level were top of mind to drive profitability.

Our client needed more than a unit economics study to get the answers they were looking for, i.e. not just a measurement of retention, new customer acquisition, # of transactions and fees, but an understanding of customer acquisition costs, customer lifetime value, and the segments that were profitable versus the ones that were driving cost. With our integrated customer analytics study, they understood all that. And more!
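As a rough illustration of how unit economics, LTV and CAC fit together, here is a minimal Python sketch using a simplified lifetime value formula (contribution per month multiplied by expected lifetime); all figures are illustrative assumptions, not the client's numbers.

# Illustrative unit economics for one customer segment (all numbers are assumptions)
avg_fee_per_transaction = 120.0   # marketplace fee earned per transaction
transactions_per_month = 1.5      # average transactions per active customer
gross_margin = 0.60               # share of the fee kept after direct costs
monthly_churn = 0.08              # probability an active customer leaves each month
cac = 900.0                       # blended customer acquisition cost

# Simplified LTV: monthly contribution x expected lifetime in months (1 / churn)
contribution_per_month = avg_fee_per_transaction * transactions_per_month * gross_margin
ltv = contribution_per_month * (1 / monthly_churn)

print(f"LTV = {ltv:,.0f}, CAC = {cac:,.0f}, LTV/CAC = {ltv / cac:.1f}")
# A ratio comfortably above 1 suggests the segment can be acquired profitably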

Highlight opportunities. Expose gaps

Unit economics looks at direct revenues and the costs associated with basic units of a business model

Combining unit economics with customer acquisition cost (CAC) and segmentation, we were able to answer some key questions for the startup:

a. How sustainable their customer acquisition strategy was

b. Key buyer and seller segments that were profitable and the ones that were money drainers

c. Whether the revenue model being pursued was working, and whether any re-alignment was required

The strategy and leadership team, armed with the insights from the study, realised some key challenges to the business model they were following.

Categories
Marketing Analytics

Pricing strategy deep dive to align with market and competitors

This client was looking for a hierarchical price optimization strategy – one combining the heterogeneity of individual countries with aggregation at the region level. This would help them define and achieve revenue goals at both the country and region level. A simulator would allow them to test out different scenarios. The solution needed to be scoped for country-level elasticities and simulation against aggregated region-level sales goals.

Self service simulation for different pricing tactics

Product elasticity, cross product elasticity, and product lifecycle allowed us to estimate the impact of changing pricing levers with greater certainty

~2% (avg.) margin improvement across Top 5 brands

12%-17% improvement in forecast accuracy across Top 5 brands

6% reduction in inventory over 6-month observation period

The study enabled our client to constantly monitor business targets and review the pricing strategy based on quarterly performance

Pricing strategy & Optimization

A pricing strategy is a method to discover the best price for a product. But finding the right price is not as simple as the definition sounds. The client was using a competitive pricing strategy – focusing on the existing market price of competitors. They were not considering their own COGS, as their assumption was that the market is saturated and it is better to focus on the price of competitor products.

They needed a better approach to pricing – one that would maximize their profits and revenue. The client liked our approach of state space modeling to measure the stochastic components, along with the use of multiplicative models to compute elasticities (sketched below). To top it all, a self-service, web-based pricing simulator would enable them to test out different pricing tactics.
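As a rough illustration of the multiplicative (log-log) approach, here is a minimal Python sketch that estimates own-price elasticity with an OLS regression using statsmodels; the weekly price and sales figures are illustrative assumptions, and the client's actual solution additionally used state space models and cross-product effects.

import numpy as np
import pandas as pd
import statsmodels.api as sm

# Illustrative weekly price and sales history for one product (assumed data)
data = pd.DataFrame({
    "price": [100, 105, 98, 110, 95, 102, 108, 99, 97, 112],
    "units": [520, 480, 560, 430, 610, 505, 450, 545, 570, 410],
})

# Multiplicative model: log(units) = a + b*log(price); b is the price elasticity
X = sm.add_constant(np.log(data["price"]))
y = np.log(data["units"])
model = sm.OLS(y, X).fit()

elasticity = model.params["price"]
print(f"Estimated own-price elasticity: {elasticity:.2f}")

# Simple what-if: approximate % change in units for a 5% price increase
print(f"A 5% price increase implies roughly a {elasticity * 5:.1f}% change in units sold")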

Improved revenue through strategy recommendations

Summary insights were fed into a dashboard that allowed simulation of pricing changes on contracts, recontracts and margin.

Final results were delivered through price elasticity dashboard. User inputs can be:

a. Target revenue
b. Desired margin
c. % changes in # of stores, among other factors

We overlaid business rules, constraints and competitive benchmarks on top of the elasticity-based optimization.

High-level business benefits include optimal markdown scenarios, improved revenue and better liquidity. The client now creates benchmarks every quarter.