Google has announced a new generation of AI shopping agents — autonomous software that can browse products across the internet, compare prices and specifications, negotiate with vendors, and complete purchases on a consumer's behalf without requiring step-by-step human direction. The consumer describes what they want, sets a budget, and the AI agent does the rest: searching, evaluating, selecting, and buying.

This is not a smarter search engine. It is a fundamentally different model of commerce. Instead of a human making purchasing decisions informed by technology, the technology itself is making purchasing decisions informed by a human's preferences. The distinction matters enormously for data privacy, consumer protection, and regulatory compliance — particularly in South Africa, where both POPIA and the Consumer Protection Act create obligations that were never designed with autonomous AI intermediaries in mind.

For South African retailers, platform operators, and any business that processes consumer data, the arrival of AI shopping agents raises questions that need answers now — not after these systems are already operating at scale in the local market.

Hear this discussed on Priviso Live

This article is based on the discussion from Episode 71, where we examine the privacy and consumer protection implications of AI-powered automated commerce in South Africa.

How AI Shopping Agents Actually Work

To understand the regulatory implications, you need to understand the mechanics. An AI shopping agent is not simply a price comparison tool. It is an autonomous decision-making system that operates through several stages, each of which involves distinct data processing activities.

First, the agent builds a consumer profile. This includes explicit preferences (the user says "I want running shoes under R2,000") and inferred preferences drawn from browsing history, past purchases, location data, device information, and potentially data from other Google services. The richer the profile, the better the agent's recommendations — and the more personal information being processed.

Second, the agent navigates the open web, visiting retailer websites, reading product descriptions, comparing prices, checking reviews, and evaluating availability. In doing so, it interacts with retailer systems in ways that may not be distinguishable from human browsing — raising questions about consent, cookies, and terms of service.

Third, the agent makes a purchasing decision. It selects a product, chooses a vendor, and initiates a transaction. This is the critical step: a machine, not a human, is making a commercial decision that binds the consumer to a contract and commits their financial resources.

Fourth, the agent completes the transaction, providing payment details, delivery addresses, and any other information required to finalise the purchase. This necessarily involves sharing the consumer's personal information with third-party retailers and payment processors.
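The four stages above can be sketched as a minimal agent loop. This is purely illustrative — all class and function names here are hypothetical, not Google's actual API — but it makes concrete where personal information enters the pipeline and where the autonomous commercial decision happens.

```python
from dataclasses import dataclass, field

@dataclass
class ConsumerProfile:
    # Stage 1: explicit preferences plus inferred attributes,
    # each a distinct item of personal information under POPIA.
    explicit: dict = field(default_factory=dict)   # e.g. {"budget_zar": 2000}
    inferred: dict = field(default_factory=dict)   # e.g. drawn from browsing history

@dataclass
class Offer:
    vendor: str
    product: str
    price_zar: float

def run_shopping_agent(profile: ConsumerProfile, offers: list):
    """Stages 2-4: browse candidate offers, decide, and transact."""
    # Stage 2: filter offers against the profile (stands in for web browsing).
    budget = profile.explicit.get("budget_zar", float("inf"))
    candidates = [o for o in offers if o.price_zar <= budget]
    if not candidates:
        return None
    # Stage 3: the autonomous commercial decision - cheapest match here.
    choice = min(candidates, key=lambda o: o.price_zar)
    # Stage 4: completing the transaction would share personal
    # information (payment details, delivery address) with the vendor.
    return choice
```

Note that the selection rule ("cheapest match") is itself an assumption; a real agent's decision logic is opaque, which is exactly what makes Section 71 analysis difficult.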

POPIA Section 71: The Automated Decision-Making Problem

Section 71 of POPIA is the provision most directly relevant to AI shopping agents, and it is one of the least tested sections of the Act. Section 71(1) provides that a data subject may not be subject to a decision which results in legal consequences for them, or which affects them to a substantial degree, if that decision is based solely on the automated processing of personal information intended to provide a profile of the data subject.

An AI shopping agent that builds a profile of a consumer's preferences, financial capacity, and purchasing patterns, and then autonomously makes a purchasing decision, is doing precisely what Section 71 was designed to regulate. The decision — to buy a specific product from a specific vendor at a specific price — has direct legal consequences: it creates a contractual obligation and commits the consumer's funds.

The critical question is whether the consumer's initial instruction ("buy me running shoes under R2,000") constitutes sufficient human involvement to take the decision outside the scope of Section 71, or whether the agent's autonomous selection of a specific product, vendor, and price point remains a decision "based solely on automated processing." South African courts have not yet addressed this question. International precedent is sparse and inconsistent.

"When an AI agent decides which product to buy, from which vendor, at which price — that is not a search result. That is a commercial decision with legal consequences, and POPIA has something to say about it."

The safer interpretation for businesses is that Section 71 applies unless the consumer has specifically approved the particular transaction — not merely set general parameters. This means AI shopping agents operating in South Africa may need to implement a confirmation step before completing purchases, undermining the frictionless experience that makes them commercially attractive.
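A confirmation step of the kind described above could be sketched as follows. This is a hypothetical pattern, not a prescribed compliance mechanism: the point is that human approval attaches to the *specific* transaction, not merely to the initial parameters.

```python
from dataclasses import dataclass

@dataclass
class ProposedPurchase:
    product: str
    vendor: str
    price_zar: float

def complete_purchase(purchase: ProposedPurchase, confirm) -> bool:
    """Gate the binding step behind express human approval.

    `confirm` is a callable that presents the specific product,
    vendor, and price to the consumer and returns True only on
    their express approval of this particular transaction.
    """
    if not confirm(purchase):
        return False  # no contract formed; the decision is not solely automated
    # ... proceed to payment and checkout here ...
    return True
```

The trade-off is visible in the code: every call to `confirm` reintroduces the friction that the agent was meant to remove.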

Consent Architecture: Who Consented to What?

POPIA's consent framework creates additional complications. Consent is one of the lawful bases for processing under Section 11, and POPIA defines consent as a voluntary, specific and informed expression of will. When a consumer activates an AI shopping agent, they presumably consent to the agent processing their data for the purpose of finding and purchasing products. But the chain of consent becomes murky quickly.

Did the consumer consent to their personal information being shared with every retailer the agent visits? Did they consent to their purchasing profile being used to infer their financial status, health conditions (from pharmacy purchases), or lifestyle choices? Did they consent to their data being transferred to servers in jurisdictions with weaker data protection laws, which is almost certain given Google's global infrastructure?

The principle of purpose limitation under Section 13 is equally challenging. Personal information collected for the purpose of completing a specific purchase cannot lawfully be repurposed for advertising, market research, or training AI models without additional consent. But AI shopping agents, by their nature, learn from every interaction. Each purchase refines the model. Drawing the line between "processing to complete this transaction" and "processing to improve future recommendations" is technically and legally complex.
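One way to make the purpose limitation principle operational is to tag every processing request with its purpose and check it against the purposes the data subject actually consented to. This is a minimal sketch under that assumption; the purpose labels are illustrative, not drawn from POPIA itself.

```python
from dataclasses import dataclass

@dataclass
class ProcessingRequest:
    data_subject: str
    purpose: str   # e.g. "complete_transaction", "model_training"

def authorise(request: ProcessingRequest, consented_purposes: set) -> bool:
    """Purpose limitation as a gate: processing outside the purposes
    consented to (e.g. repurposing transaction data for model
    training) requires fresh consent rather than silent reuse."""
    return request.purpose in consented_purposes
```

The hard part, as the paragraph above notes, is not the gate itself but deciding which side of it a given processing activity falls on.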

The Consumer Protection Act: Does the CPA Apply to AI Buyers?

South Africa's Consumer Protection Act 68 of 2008 (CPA) was designed to protect human consumers making human decisions. It grants the right to fair and honest dealing (Part F of Chapter 2), protection against unconscionable conduct (Section 40), the right to cooling off after direct marketing transactions (Section 16), and protection against unfair contract terms (Section 48).

When an AI agent makes a purchase, several questions arise. Does the cooling-off period apply? Section 16 grants a cooling-off period of five business days for transactions resulting from direct marketing. If an AI agent responds to a targeted promotion on a retailer's website, is that direct marketing? The CPA's definition focuses on communication directed at specific consumers — but the "consumer" in this case is a software agent, not a person.

The right to examine goods before purchase (Section 18) is also problematic. A human shopper can inspect a product before committing. An AI agent makes decisions based on descriptions, specifications, and reviews — essentially digital representations. If the product does not match expectations, the consumer's remedy may depend on whether the AI agent's assessment was "reasonable," a standard that has no precedent in South African consumer law.

Perhaps most importantly, Section 41 of the CPA prohibits false, misleading, or deceptive representations. If a retailer optimises its website to appeal to AI agents rather than human consumers — padding product descriptions with keywords that influence AI selection algorithms, for example — is that a deceptive representation? The CPA assumes the audience is a human consumer exercising human judgment. When the audience is an algorithm, the entire framework of "misleading" needs reinterpretation.

Data Sharing with AI Intermediaries: The Third-Party Problem

Under POPIA, when personal information is processed by a third party on the responsible party's behalf (an "operator" in POPIA terminology), the responsible party must ensure adequate contractual safeguards are in place (Section 21). The responsible party remains accountable for how the operator processes the data.

In the AI shopping agent model, the data flows are complex. Google processes the consumer's personal information to power the agent. The agent shares information with retailers to browse and purchase. Retailers share information with payment processors. Payment processors share information with banks. At each step, personal information is being transmitted, processed, and stored by different entities in different jurisdictions under different legal frameworks.
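The chain described above can be modelled for audit purposes as a list of hops, each checked for an operator agreement and for cross-border transfer (which engages POPIA Section 72's conditions on transfers outside the Republic). The structure below is a hypothetical sketch of such an audit, not a compliance tool.

```python
from dataclasses import dataclass

@dataclass
class DataHop:
    holder: str
    jurisdiction: str          # ISO-style country code, "ZA" for South Africa
    operator_agreement: bool   # is a written Section 21 contract in place?

def audit_chain(chain: list) -> list:
    """Flag hops that lack an operator agreement or that move
    personal information outside South Africa (Section 72)."""
    findings = []
    for hop in chain:
        if not hop.operator_agreement:
            findings.append(f"{hop.holder}: no operator agreement")
        if hop.jurisdiction != "ZA":
            findings.append(f"{hop.holder}: cross-border transfer to {hop.jurisdiction}")
    return findings
```

Even this toy model shows why accountability is hard to locate: a four-hop chain (agent, retailer, payment processor, bank) can produce findings at every link.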

For South African retailers, the question is whether accepting a purchase from an AI shopping agent constitutes receiving personal information from Google as an operator, or whether the transaction creates a direct relationship between the retailer and the consumer. The answer determines who bears the POPIA compliance burden — and it is not clear.

What South African Retailers Must Prepare For

AI Commerce Readiness Checklist for SA Retailers

  1. Review your terms of service. Do your website terms contemplate automated agents browsing and purchasing? Most do not. Update terms to address whether AI agents are permitted, what data they may collect, and what obligations apply to transactions initiated by non-human actors.
  2. Update your POPIA processing notices. If AI agents will be submitting personal information on behalf of consumers, your processing notices must disclose how that information will be used, stored, and shared. The consumer may never visit your website directly — they may only interact through the agent.
  3. Implement transaction verification. Consider requiring a human confirmation step for purchases above a certain value or for first-time customers whose transactions are initiated by AI agents. This provides a defensible position under Section 71 of POPIA.
  4. Audit your data flows. Map how consumer data enters your systems when a purchase is made through an AI intermediary. Identify where data is stored, who has access, and whether cross-border transfers occur. Ensure operator agreements are in place with all parties in the chain.
  5. Prepare for CPA disputes. When a consumer is unhappy with a purchase made by their AI agent, the retailer will be the first point of contact. Develop clear policies for returns, refunds, and disputes arising from AI-initiated transactions, and train customer service teams accordingly.
  6. Monitor for AI-targeted manipulation. As AI shopping agents become prevalent, bad actors will attempt to game them — just as SEO manipulation targets search engines today. Implement monitoring to detect unusual patterns in AI-initiated traffic and purchases.
  7. Engage with industry bodies. The regulatory framework for AI commerce in South Africa is undeveloped. Engage with the National Consumer Commission, the Information Regulator, and industry associations to shape the rules before they are imposed.
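Item 3 of the checklist can be expressed as a simple rule: gate high-value or first-time agent-initiated purchases behind a human confirmation step. The threshold and field names below are illustrative assumptions, not recommended values.

```python
from dataclasses import dataclass

VALUE_THRESHOLD_ZAR = 1000.0  # illustrative; set according to risk appetite

@dataclass
class AgentTransaction:
    customer_id: str
    value_zar: float
    agent_initiated: bool

def requires_human_confirmation(tx: AgentTransaction, known_customers: set) -> bool:
    """Flag agent-initiated purchases that are high-value or from
    first-time customers, supporting a defensible Section 71 position."""
    if not tx.agent_initiated:
        return False
    return tx.value_zar > VALUE_THRESHOLD_ZAR or tx.customer_id not in known_customers
```

A rule this crude would need refinement in practice, but it illustrates the shape of the control: the retailer, not only the agent platform, can insert the human-in-the-loop step.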

The Profiling Question: When Personalisation Becomes Surveillance

AI shopping agents work better when they know more about the consumer. Google's competitive advantage in this space is its unparalleled access to consumer data across search, email, maps, YouTube, Android, and Chrome. An AI shopping agent powered by this data can infer not just what you want to buy, but your income level, your health concerns, your family situation, your political leanings, and your daily routines.

POPIA Section 71 specifically addresses decisions based on profiling — the automated processing of personal information intended to provide a profile of a data subject, including their work performance, credit worthiness, reliability, location, health, personal preferences or conduct. An AI shopping agent that uses this kind of profile to make purchasing decisions is squarely within Section 71's scope.

The Information Regulator has not yet issued guidance on how Section 71 applies to AI agents. But the European experience with GDPR Article 22 — the analogous provision — suggests that regulators will take a broad view. The European Data Protection Board has consistently held that automated decisions with "legal or similarly significant effects" include decisions about what products and services are offered to a consumer and at what price. If an AI agent negotiates a higher price for a consumer it profiles as affluent, that is a significant effect.

International Precedents and Where SA Stands

The EU's AI Act, which entered into force in 2024, imposes transparency obligations on AI systems that interact with consumers: people must be informed when they are interacting with an AI system, and when AI is making decisions that affect them. The Act does not yet specifically address AI purchasing agents, but its principles clearly apply.

In the United States, the Federal Trade Commission has signalled that AI agents acting as consumer intermediaries will be held to the same standards as human agents — meaning that unfair or deceptive practices by an AI agent can expose the operator to liability.

South Africa has neither framework. The closest we have is POPIA's general provisions on automated decision-making and the CPA's consumer protection principles. This is not necessarily a disadvantage — South Africa has the opportunity to develop a framework that learns from international experience rather than replicating it. But only if regulators, industry, and civil society engage with the issue before AI shopping agents are a fait accompli.

Key Takeaways for South African Businesses and Consumers

  • Google's AI shopping agents represent a shift from technology-assisted human decisions to fully autonomous commercial decisions made by AI on a consumer's behalf.
  • POPIA Section 71 restricts decisions based solely on automated profiling that have legal consequences — AI purchasing decisions almost certainly fall within this scope.
  • Consent under POPIA must be specific and informed — a blanket "use AI to shop for me" may not satisfy the requirements for each data processing activity in the chain.
  • The Consumer Protection Act was designed for human consumers making human decisions — its application to AI-initiated transactions is untested and uncertain.
  • Data sharing between AI agents, retailers, and payment processors creates complex third-party processing chains that require POPIA-compliant operator agreements.
  • South African retailers should update terms of service, processing notices, and transaction verification processes before AI shopping agents arrive at scale.
  • AI-targeted manipulation of product listings and pricing is an emerging risk that retailers and regulators must monitor.
  • South Africa has no AI-specific commerce legislation — the regulatory gap is an opportunity for proactive engagement, not a reason for complacency.

Prepare Your Business for AI-Powered Commerce

Priviso helps South African businesses navigate the intersection of data privacy, consumer protection, and emerging AI technologies. Start with a comprehensive compliance assessment.

Start Free Trial | Contact Us