Client Context
A tech company that happens to deliver groceries.
A leading European online grocery platform has rebuilt supermarket delivery around an app-only model, electric vehicle logistics, and tightly optimised operations. It serves more than 2 million customers across three European markets without relying on physical stores.
The business had already grown into one of Europe's most advanced grocery technology platforms, with more than USD 1 billion in total funding and a catalogue of roughly 15,000 products. The next step was far more ambitious: scaling toward 100,000 SKUs while building customer and category intelligence on top.
The ambition was not just better product metadata. The client wanted the foundations for a digital category manager that could reason about assortment gaps, substitutes, and customer intent at grocery scale.
The Challenge
A manually built ontology could not keep pace with catalogue growth.
The client's long-term goal was to build an AI-powered "digital category manager" capable of gap analysis, product assortment decisions, and customer-facing intelligence such as recipe suggestions and shopping assistants. None of that works without a rich product ontology.
The team had started building that ontology manually inside a CRM-based product information management system using GS1 data and SPARQL queries. That approach worked at small scale, but it was never going to survive a sevenfold catalogue expansion.
Manual modelling did not scale
With roughly 1,500 fourth-level categories across more than 12,000 articles, manually defining attributes for each product class was too labour-intensive to maintain.
Source data was incomplete
GS1 feeds often arrived with missing allergen data and other critical fields, leaving gaps in the product detail the client wanted to expose to customers.
Edge cases broke flat taxonomies
The classic "spicy mayo" problem exposed the limitation: when a product is unavailable, the best alternative depends on nuanced product relationships that simple category trees cannot express.
Catalogue growth was accelerating
The range was expected to grow from roughly 15,000 to 100,000 products by the end of 2026, widening the gap between manual ontology work and what the business needed.
The bottleneck was not the idea. It was that no one inside the business had a workable way to automate ontology creation at the quality bar the client needed.
The Platform
60x deployed AI Brain as the knowledge graph foundation.
Rather than treating the ontology as a one-off consulting deliverable, 60x built the solution on top of AI Brain, its proprietary knowledge graph platform. The system is designed to turn siloed company data into a structured, searchable intelligence layer that gives agents the context they need to operate inside a specific business domain.
Connect
AI Brain can ingest GS1 feeds, CRM data, product imagery, web data, and other enterprise sources through scheduled pipelines without disrupting the client's existing systems.
Structure
The platform generates ontology structure dynamically, extracting metadata, embedding product context, and building graph relationships from semantic similarity rather than relying on a fully hand-authored model.
Retrieve
Instead of wiring dozens of tools into every agent, AI Brain routes retrieval through a compact knowledge graph interface so performance does not degrade as the data estate grows.
For this engagement, the platform's dynamic ontology generation capability was the key. The problem the client had struggled to automate was already a native fit for the architecture 60x had spent 18 months refining.
What We Built
Automated ontology generation benchmarked against known-good outcomes.
60x configured AI Brain to generate product ontology structure automatically by working backward from the client's validated "golden datasets" for alternatives and substitutes. Instead of manually defining every class top-down, the system learns what ontology structure is required to reproduce correct downstream outcomes, then applies that logic at scale.
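The "work backward from golden datasets" idea can be made concrete with a small benchmarking sketch. This is an illustrative example, not the client's actual pipeline: the function name, data shapes, and product IDs are assumptions. The core idea is that a generated ontology is scored by how well the alternatives it implies reproduce the hand-curated golden pairs.

```python
# Hypothetical golden-dataset benchmark: both arguments map a product ID
# to the set of acceptable alternative product IDs.
def benchmark_ontology(predicted_alternatives, golden_alternatives):
    """Score generated alternatives against a hand-curated golden set."""
    tp = fp = fn = 0
    for product, golden in golden_alternatives.items():
        predicted = predicted_alternatives.get(product, set())
        tp += len(predicted & golden)   # correct alternatives found
        fp += len(predicted - golden)   # alternatives the curators rejected
        fn += len(golden - predicted)   # golden alternatives the system missed
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    return {"precision": precision, "recall": recall}

# Toy example: one of two predictions is correct, one golden pair is missed.
golden = {"spicy-mayo-500ml": {"sriracha-mayo-400ml", "chipotle-mayo-350ml"}}
predicted = {"spicy-mayo-500ml": {"sriracha-mayo-400ml", "plain-mayo-500ml"}}
print(benchmark_ontology(predicted, golden))
# {'precision': 0.5, 'recall': 0.5}
```

Scoring against curated outcomes rather than abstract model metrics is what lets a candidate ontology structure be accepted or rejected automatically at scale.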
Data ingestion and enrichment
GS1 product data and product images are ingested in parallel, while enrichment agents add web context, competitor signals, and visual attributes.
Dynamic ontology generation
The system creates product classes, categories, subcategories, and attributes using tree-based tagging, semantic embeddings, and confidence scoring to handle edge cases.
Three-layer evaluation
Every generated ontology is checked through rules, AI-as-judge review, and human validation so quality can be measured before deployment.
- Golden dataset benchmarking: success is measured against the client's hand-curated alternatives and substitutes rather than abstract model scores.
- Batch and real-time retrieval: weekly precomputation supports alternatives, while embeddings-based search enables low-latency discovery.
- AWS ownership model: the production architecture is scoped to run on the client's own AWS stack, including Lambdas and SageMaker.
Downstream Use Cases
One ontology, multiple grocery intelligence systems.
The ontology is not the end product. It is the shared intelligence layer that powers multiple agentic use cases across the catalogue.
Next best alternative
When a product is unavailable, the system can identify the most appropriate replacement based on actual product relationships rather than broad category matching.
Product substitutions
A stricter substitution layer evaluates allergen compatibility, dietary requirements, and equivalence constraints against the client's higher-bar benchmark set.
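A minimal sketch of what such a hard-constraint layer might look like, assuming hypothetical `allergens` and `diet_labels` fields (the client's real schema and constraint set are not shown in this document):

```python
# Hedged sketch of a substitution filter: a substitute must not introduce
# new allergens and must preserve the original's dietary labels.
def valid_substitutes(original, candidates, customer_allergens=frozenset()):
    """Keep only candidates that satisfy hard substitution constraints."""
    ok = []
    for c in candidates:
        new_allergens = set(c["allergens"]) - set(original["allergens"])
        if new_allergens or set(c["allergens"]) & set(customer_allergens):
            continue  # introduces an allergen, or hits a customer flag
        if not set(original["diet_labels"]) <= set(c["diet_labels"]):
            continue  # would break a dietary guarantee (vegan, etc.)
        ok.append(c["id"])
    return ok

original = {"id": "spicy-mayo", "allergens": ["egg"], "diet_labels": ["vegetarian"]}
candidates = [
    {"id": "sriracha-mayo", "allergens": ["egg"], "diet_labels": ["vegetarian"]},
    {"id": "peanut-satay", "allergens": ["egg", "peanut"], "diet_labels": ["vegetarian"]},
]
print(valid_substitutes(original, candidates))  # ['sriracha-mayo']
```

This is what distinguishes substitution from the looser "next best alternative" case: substitutes face hard pass/fail constraints before any similarity ranking applies.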
Intelligent product search
Embeddings-based retrieval makes natural-language shopping queries possible, such as finding "something spicy for tacos" across multiple product classes in real time.
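The mechanics of embeddings-based retrieval can be shown with toy vectors. A production system would use a real embedding model and a vector index; the three-dimensional vectors and product IDs below are invented purely to illustrate cosine-similarity ranking across product classes.

```python
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

# Toy catalogue embeddings (a real system would compute these with a model).
catalogue = {
    "sriracha-hot-sauce": [0.9, 0.4, 0.1],
    "taco-seasoning-mix": [0.7, 0.6, 0.1],
    "plain-greek-yoghurt": [0.0, 0.1, 0.9],
}
query = [0.85, 0.4, 0.1]  # pretend embedding of "something spicy for tacos"

ranked = sorted(catalogue, key=lambda k: cosine(query, catalogue[k]), reverse=True)
print(ranked)  # spicy products rank ahead of the yoghurt
```

Because ranking happens in embedding space rather than over category labels, a single query can surface relevant items from several product classes at once, which is exactly what a flat taxonomy cannot do.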
Digital category manager
The long-term vision is a system that can analyse assortment gaps, recommend category changes, and support promotional decisions across a much larger grocery universe.
The Scale
A proof of concept designed for continental catalogue growth.
- ~1,500 products in the proof of concept
- ~15,000 products in the current catalogue
- 100,000 products targeted by the end of 2026
The initial proof of concept focuses on the client's "table sauces" category, where a high-quality golden dataset already exists. That makes it possible to validate the methodology on approximately 1,500 products before scaling across the wider catalogue.
The strategic test is simple: if the system can reproduce known-good alternatives and substitutions here, the same architecture can extend across a much larger and more complex catalogue.
Engagement Timeline
From discovery to proof-of-concept kickoff in one quarter.
- December 2025: initial discovery around the digital category manager vision, current ontology pain points, and the client's broader data architecture.
- January 2026: deep technical scoping across GS1 data, the CRM-based PIM, product imagery, evaluation requirements, and future AWS deployment constraints.
- January 2026: 60x presented the multi-step ingestion, enrichment, ontology, and evaluation pipeline, with the "spicy mayo" edge case helping clarify the value of the approach.
- Q1 2026: both sides aligned on a proof of concept focused on table sauces, performance-based delivery terms, and benchmarking against golden datasets.
- Present: the proof of concept is in active development, with ontology generation being validated against the client's alternatives and substitutes datasets.
Why 60x
The client bought speed, platform maturity, and a path to ownership.
AI Brain was already battle-tested
60x did not pitch a blank-sheet experiment. The client got a knowledge graph platform already proven across multiple enterprise use cases.
Delivery risk was shared
Performance was tied to agreed benchmarks against golden datasets, aligning incentives on a first engagement where outcome quality mattered more than slideware.
The system was designed to be handed over
From day one, 60x scoped the architecture around the client's own AWS environment so their engineering team can own, maintain, and scale the system independently.
The significance of the work is not just faster ontology creation. It is the ability to turn a rapidly growing grocery catalogue into a reusable intelligence asset that supports substitutes, search, category management, and future customer-facing AI experiences.