

Q-Commerce: 75-85% growth expected as 5 million new users dash to the market

Quick Commerce (Q-Commerce) is rapidly transforming India’s retail sector with its focus on fast delivery. This service, which promises swift access to groceries, gadgets, and more, has become integral for urban consumers. Last year, Q-Commerce grew by 70-75%, significantly outpacing traditional e-commerce. According to Redseer, Q-Commerce is projected to reach a market size of $6 BN by FY 2025, with expected growth rates of 75-85%. This surge is driven by the addition of about 5 MN new Monthly Transacting Users (MTUs) and a projected 20% increase in spending per MTU. The rise in spending is linked to increased confidence in Q-Commerce platforms and their expanding product range, including beauty items and home decor.

User behaviour also shows notable changes. Monthly orders per user are set to increase from 4.4 to nearly 6. Additionally, Average Order Values are expected to rise by 15% due to promotions and a broader product selection. Metro cities dominate the sector, accounting for about 90% of the market share in FY 2024. Historical trends suggest that user growth might stabilize around 20 million MTUs by FY 2026. Expanding into smaller cities poses challenges that may require changes to the current dark-store model. Utilizing India’s Kirana network for less densely populated areas could be a potential solution, though this remains uncertain.
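The projections above lend themselves to a quick back-of-the-envelope check. The short Python sketch below uses only the figures quoted in the article; the FY24 base it prints is implied by the stated growth band rather than reported directly.

```python
# A quick sanity check on the Redseer figures cited above: what FY24 base is
# consistent with a $6 BN FY25 market growing 75-85% year on year?

fy25_gmv_bn = 6.0                      # projected FY25 GMV, $ BN (from the article)
for growth in (0.75, 0.85):
    implied_fy24 = fy25_gmv_bn / (1 + growth)
    print(f"At {growth:.0%} growth, implied FY24 GMV is roughly ${implied_fy24:.1f} BN")

# Per-user trajectory cited in the article: monthly orders per MTU rise from 4.4 to ~6
orders_now, orders_projected = 4.4, 6.0
print(f"Growth in monthly orders per user: {orders_projected / orders_now - 1:.0%}")
```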

E-commerce and consumer internet industry welcomes FM Sitharaman’s Budget

The e-commerce and consumer internet industry has welcomed the tax breaks and incentives in Finance Minister Nirmala Sitharaman’s Budget. Kalyan Krishnamurthy of Flipkart Group said the Budget is a bold move to enhance India’s human and organizational capital, and that it spares no effort in incentivising the youth to upskill and join the formal workforce. He said the provisions of the Budget will remove bottlenecks in the supply chain and lend significant support to industries by encouraging clusters, establishing e-commerce export hubs, and improving credit flow to MSMEs (micro, small, and medium enterprises). “Reducing the TDS (tax deducted at source) rate from 1% to 0.1% will significantly free up working capital for sellers. Development of DPI (digital public infrastructure) applications for urban governance shows the government’s focus on improving and easing daily life,” said Krishnamurthy.

Amazon said that, on behalf of its sellers, it welcomes the Government of India’s decision to reduce the tax deducted at source (TDS) rate from 1% to 0.1% for e-commerce operators. It said this significant policy change promises substantial working capital relief for micro, small, and medium enterprises (MSMEs). By reducing the financial burden on these businesses, Amazon said, the move not only facilitates ease of doing business but also strongly encourages the adoption of digital commerce. Amazon also welcomed the government’s announcement that it would set up e-commerce export hubs across the country through public-private partnerships.
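To illustrate what the TDS change means in cash terms, here is a minimal sketch using a hypothetical seller turnover; only the 1% and 0.1% rates come from the announcement.

```python
# Working-capital effect of cutting TDS on e-commerce sales from 1% to 0.1%.
# The seller's annual marketplace turnover below is a hypothetical figure.

annual_gmv_inr = 1_00_00_000      # INR 1 crore of marketplace sales (assumed)
tds_old, tds_new = 0.01, 0.001    # pre- and post-Budget TDS rates

withheld_old = annual_gmv_inr * tds_old
withheld_new = annual_gmv_inr * tds_new
print(f"TDS withheld at 1%:   INR {withheld_old:,.0f}")
print(f"TDS withheld at 0.1%: INR {withheld_new:,.0f}")
print(f"Working capital freed until refund: INR {withheld_old - withheld_new:,.0f}")
```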

WhatsApp cuts business messaging prices to counter SMS, Google’s RCS

Meta’s WhatsApp cut business messaging prices by 16-97% in several countries, including by 63% in India, in an aggressive attempt to counter growing competition from Google’s Rich Communication Services (RCS) and Apple’s upcoming foray into the segment. In India, at Rs 0.11 per conversation, WhatsApp’s new pricing for utility messaging is even cheaper than traditional SMS services at Rs 0.12-Rs 0.15, and also RCS, which is priced at Rs 0.20-Rs 0.25. “Effective August 1, 2024, we are lowering rates to be competitive with alternative channels and encourage businesses to bring more end-to-end post purchase customer journeys to WhatsApp,” the company said. The reduced rates are, however, only applicable to one-time authentication (OTPs), order/delivery management, account updates, payment reminders, and feedback surveys. For marketing and sales messages, WhatsApp has, in fact, increased the rate by 8% to Rs 0.78 per conversation to prevent the platform from becoming a spam channel. The revised pricing will be applicable effective October 1.

Experts said WhatsApp’s sharp price cut is likely to impact both messaging channels, especially RCS, which is still emerging in India. Some say it could severely affect traditional SMS services, with WhatsApp already capturing a major share of the corporate messaging market. Others believe that banks and government departments will continue to use SMS for universal coverage. “SMS is directly targeted by this pricing change…RCS could see a more direct competitive impact if businesses choose WhatsApp for its broader reach and lower pricing,” said Shradha Thapa, regional head – OTT India, Infobip. “Regions with a high penetration of WhatsApp users and growing internet access, such as Latin America, parts of Asia, and Africa, are likely to see increased volumes,” she said. WhatsApp has nearly 500 MN users in India, its single-largest market globally. Aniketh Jain, founder of customer communications startup Fyno, said, “Unlike SMS, which charges for each 160-character message, WhatsApp’s charges cover a 24-hour conversation. This move could further push SMS closer to obsolescence.” However, Beerud Sheth, CEO of Gupshup, a global leader in conversational platform services, said RCS is a very different ecosystem compared to WhatsApp: “While WhatsApp’s pricing and policies are determined by one company, Meta, RCS pricing and policies are individually decided by hundreds of telecom operators on their respective networks.”
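A rough cost comparison makes the pricing gap concrete. In the sketch below, the per-unit rates come from the article; the four-message journey and the assumption that SMS and RCS are billed per message are illustrative.

```python
# Cost of a post-purchase journey (order confirmation, dispatch update,
# delivery OTP, feedback request) sent within one 24-hour window.
# WhatsApp bills the whole conversation once; SMS/RCS are assumed per message.

messages_in_journey = 4
whatsapp_per_conversation = 0.11          # INR, utility conversation rate cited above
sms_per_message = 0.13                    # INR, midpoint of the quoted 0.12-0.15 range
rcs_per_message = 0.22                    # INR, midpoint of the quoted 0.20-0.25 range

print(f"WhatsApp: INR {whatsapp_per_conversation:.2f} for the journey")
print(f"SMS:      INR {messages_in_journey * sms_per_message:.2f} for {messages_in_journey} messages")
print(f"RCS:      INR {messages_in_journey * rcs_per_message:.2f} for {messages_in_journey} messages")
```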

Meta unveils biggest Llama 3 AI model, touting language and math gains

Meta Platforms released the biggest version of its mostly free Llama 3 artificial intelligence models, boasting multilingual skills and general performance metrics that nip at the heels of paid models from rivals like OpenAI. The new Llama 3 model can converse in eight languages, write higher-quality computer code and solve more complex math problems than previous versions, the Facebook parent company said in blog posts and a research paper announcing the release. With 405 BN parameters, or variables that the algorithm takes into account to generate responses to user queries, it dwarfs the previous version released last year, though it is still smaller than leading models offered by competitors. OpenAI’s GPT-4 model, by contrast, is reported to have one trillion parameters, and Amazon is preparing a model with 2 trillion parameters. Promoting Llama 3 across multiple channels, Chief Executive Mark Zuckerberg said he expected future Llama models to overtake proprietary competitors by next year. The Meta AI chatbot powered by those models was on track to become the most popular AI assistant by the end of this year, with hundreds of millions of people already using it, he said.

The release comes as tech companies race to show that their growing portfolios of resource-hungry large language models can deliver gains significant enough in known problem areas, such as advanced reasoning, to justify the gargantuan sums invested in them. Meta’s own top AI scientist has said he believes such models will hit limits on reasoning and that other types of AI systems will be needed to produce breakthroughs. In addition to its flagship 405 BN parameter model, Meta is also releasing updated versions of its lighter-weight 8 BN and 70 BN parameter Llama 3 models initially introduced in the spring, the company said. All three new models are multilingual and can handle larger user requests via an expanded “context window,” which Meta’s head of generative AI, Ahmad Al-Dahle, said would improve the experience of generating computer code in particular.
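The parameter counts above translate directly into serving cost. The sketch below gives rough weight-only memory estimates at common numeric precisions; these are illustrative calculations, not figures from Meta.

```python
# Approximate weight-only memory footprint of the three Llama 3 sizes,
# by numeric precision. Activations, KV-cache and serving overhead are excluded.

PARAMS_BN = {"Llama 3 405B": 405, "Llama 3 70B": 70, "Llama 3 8B": 8}
BYTES_PER_PARAM = {"fp16": 2, "int8": 1, "int4": 0.5}

for name, params_bn in PARAMS_BN.items():
    estimates = ", ".join(
        f"{dtype}: ~{params_bn * bytes_per:.0f} GB"
        for dtype, bytes_per in BYTES_PER_PARAM.items()
    )
    print(f"{name} -> {estimates}")
```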

PhonePe, Google Pay cede online payment share to new entrants

Walmart Inc.-owned PhonePe and Alphabet Inc.’s Google Pay saw a marginal decline in their share of total transactions processed through India’s Unified Payments Interface in June as smaller players gained traction. PhonePe’s share of total UPI transactions dropped to 48.37% from 48.67% in May. Google Pay’s portion shrank to 36.76% from 37.18% in the previous month, according to data released by the National Payments Corporation of India. Overall, the UPI network processed 13.88 BN transactions in June, a 1% drop from the prior month. Operated by the state-backed NPCI, UPI is a system that allows users to make instant money transfers by linking banks with fintech apps such as Paytm, PhonePe and Google Pay. The UPI payments space has seen an influx of new players in recent times. The entrants are challenging the dominance of PhonePe and Google Pay, which together control more than 80% of the market. For instance, UPI transactions on Axis Bank Ltd.’s apps grew 17% to 75 MN in June, while the Navi app saw a 20% increase to 35.7 MN. Additionally, Flipkart, the Indian e-commerce giant and PhonePe’s sister company, launched a separate UPI payments service called super.money. Mukesh Ambani’s Jio Financial Services also entered the fray with its JioFinance app to tap into India’s burgeoning fintech ecosystem. Reeling under a regulatory setback, Paytm retained its 8% share from the previous month, signaling a halt to the erosion from 13% at the start of the year.
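The shares translate into absolute volumes as follows; the small sketch below uses only the NPCI figures quoted above.

```python
# Converting June's UPI market shares into approximate transaction counts.

total_txns_bn = 13.88                              # total UPI transactions in June, BN
shares = {"PhonePe": 0.4837, "Google Pay": 0.3676, "Paytm": 0.08}

for app, share in shares.items():
    print(f"{app}: ~{total_txns_bn * share:.2f} BN transactions")

combined = shares["PhonePe"] + shares["Google Pay"]
print(f"PhonePe + Google Pay combined share: {combined:.2%}")
```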

Centre slashes digital payments incentives by over 42% for FY25 in Union Budget

The allocation for fintech startups and banks to promote RuPay debit cards and low-value BHIM-UPI P2M (person-to-merchant) transactions stood at INR 2,485 Cr in FY24. Moreover, funds for promoting digital payments were completely absent from the Union Budget 2024-25. Earlier this month, Amazon Pay India CEO Vikas Bansal said that “some sort” of MDR regime was essential for smaller UPI players to receive their fair share.
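For context, the FY25 figure implied by the headline’s “over 42%” cut can be derived from the FY24 allocation; the number printed below is a derived ceiling, not a figure stated in the Budget documents.

```python
# Implied FY25 ceiling for digital payments incentives, derived from the
# FY24 allocation and the "over 42%" cut cited in the headline.

fy24_allocation_cr = 2485          # INR Cr, FY24 allocation (from the article)
minimum_cut = 0.42                 # "over 42%" reduction

implied_fy25_ceiling_cr = fy24_allocation_cr * (1 - minimum_cut)
print(f"Implied FY25 allocation: under INR {implied_fy25_ceiling_cr:,.0f} Cr")
```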

Comscore outlines opportunities for advertisers amid recent cookie news

While Google has announced a change to its cookie deprecation plans in Chrome and the Privacy Sandbox, the industry will see continued consumer-oriented privacy opt-outs driven by regulations and by browser or operating system settings, according to Comscore. Google’s announcement does not signal a return to January 2020, when the company first proposed eliminating cookies within two years, as much has changed in the past four years. Regardless of Google’s strategy, the core needs of marketers and publishers remain largely unchanged: to address their audiences’ needs and meet consumer expectations.

Comscore, in its latest blog, notes that the suspension of Chrome’s cookie deprecation does not impact the continued rollout of its digital audience measurement methodology. The market is rapidly moving towards increased privacy controls and ongoing signal loss. This shift has already occurred without Chrome’s removal of third-party cookies, and it is likely to continue regardless of Google’s actions. The efforts made over the last five years to prepare for a cookieless future will remain crucial. Advertisers are already navigating an omnichannel environment where many essential platforms, such as CTV, TV, mobile, and social media, are naturally cookieless.

“Advertisers have been adapting to a more complex cross-media landscape where multiple identifiers, consent requirements, and signals govern media spend, and the endurance of the cookie does little to change this secular trend. The fundamentals of reach, frequency and incremental performance remain the bedrock of optimization,” said Brian Pugh, Chief Information Officer, Comscore. The demand for omnichannel measurement and ID-free advertising solutions is as urgent as ever, today and into the foreseeable future, he continued.

Small Language Models: The next frontier in enterprise AI

In the not-so-distant past, natural language processing (NLP) was regarded as too intricate for modern artificial intelligence (AI). However, the landscape changed dramatically in November 2022 when OpenAI introduced ChatGPT. Within a week, it garnered more than a million users, making AI accessible to the masses and marking the beginning of a loud, public, and costly Generative AI (GenAI) revolution. The rapid evolution of AI capabilities over the past few years has been driven by advances in large language models (LLMs). While LLMs offer impressive capabilities, their massive size leads to efficiency, cost, and customizability challenges. This has paved the way for the rise of small language models (SLMs). SLMs are more streamlined versions of LLMs, featuring fewer parameters and simpler designs. They have become attractive to enterprises because they offer more control, are cost-efficient, and can be fine-tuned for specific domains while keeping data secure.

“Applications like ChatGPT based on LLMs created a frenzy about the possibilities of new age AI. However, as enterprises started to rush towards adapting it in their different products and services, they faced the business reality of cost vs revenue impact. In their quest to reduce cost while maintaining the same or acceptable quality of output, they came across SLMs,” said Ganesh Sahai, CTO of Nagarro. Recognising the trend, companies such as Apple, Microsoft, Meta, and Google are now focusing on developing smaller AI models with fewer parameters while maintaining robust capabilities. Several factors drive the shift towards smaller models, primarily to address concerns surrounding the adoption of LLMs. In April, Microsoft introduced Phi-3, a family of open AI models that are capable and cost-effective small language models. Hugging Face also launched SmolLM, a new series of compact language models optimized for use on local devices like laptops and phones, eliminating the need for cloud-based resources and significantly reducing energy consumption.

Language models are AI systems trained on large text datasets, enabling text generation, document summarisation, language translation, and question answering. Small language models fill the same niche but at notably smaller sizes, typically ranging from a few million to a few billion parameters. For example, Meta’s Llama 3 boasts an 8 billion parameter model that rivals larger models like GPT-4. Similarly, Microsoft’s Phi-3-small model, with 7 billion parameters, outperforms previous versions of OpenAI’s model. The smaller sizes make SLMs more efficient, economical, and customisable than their larger counterparts. One significant advantage of SLMs is their ability to run on devices with limited processing power, such as smartphones or IoT devices. This edge computing capability contrasts sharply with larger models requiring powerful cloud infrastructure, making SLMs accessible to entrepreneurs and smaller organizations.
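As a rough illustration of the on-device point, the sketch below runs a compact instruction-tuned checkpoint on a CPU via Hugging Face transformers. The repo id is an assumption based on the SmolLM release mentioned above; any similarly sized checkpoint would work the same way.

```python
# Minimal sketch: generate text with a small language model on CPU only.
# The model identifier is assumed; it downloads a few hundred MB of weights.
from transformers import pipeline

generator = pipeline(
    "text-generation",
    model="HuggingFaceTB/SmolLM-360M-Instruct",   # assumed repo id
    device=-1,                                    # CPU; no GPU or cloud resources needed
)

prompt = "In one sentence, why do small language models suit on-device use?"
result = generator(prompt, max_new_tokens=60, do_sample=False)
print(result[0]["generated_text"])
```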

Budget 2024: Govt cuts mobile customs duty to 15%, paving way for more affordable smartphones

In her 2024 Budget speech, Finance Minister Nirmala Sitharaman announced a reduction in customs duty on mobile phones, mobile PCBs, and chargers to 15%. This move is expected to make smartphones more affordable and boost domestic manufacturing. Sitharaman highlighted the maturity of the Indian mobile industry and emphasized that the duty cut will benefit original equipment manufacturers (OEMs) by lowering production costs within India. The government had previously made strides to support mobile phone manufacturing by reducing import taxes on key components like camera lenses and extending a lower tax rate on lithium-ion batteries, which are crucial for both mobile phones and electric vehicles.
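A simple sketch of how the duty cut could feed through to the landed cost of an imported device; the device value and the pre-Budget duty rate are assumptions for illustration, not figures from the speech.

```python
# Illustrative effect of the customs duty cut on an imported phone's landed cost.
# Assessable value and the pre-Budget duty rate below are assumed for the example.

assessable_value_inr = 50_000     # hypothetical CIF value of an imported phone
duty_before = 0.20                # assumed pre-Budget customs duty rate
duty_after = 0.15                 # post-Budget rate

cost_before = assessable_value_inr * (1 + duty_before)
cost_after = assessable_value_inr * (1 + duty_after)
print(f"Landed cost before: INR {cost_before:,.0f}")
print(f"Landed cost after:  INR {cost_after:,.0f}")
print(f"Saving per device:  INR {cost_before - cost_after:,.0f}")
```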


 
