SL
SemiLayer · Live Demo
ask anything. no backend.

Food products. Understood, not just indexed.

One publishable key. One food_products lens. Four endpoints that let you ask in words, ask in shape, explore by similarity, or subscribe to a composed feed — each over plain fetch(), no SDK, no backend of our own. Pick a surface below to see the same data through a different window.

Meaning, not keywords · Structured when you need it · Composed into live feeds
POST /v1/search

Semantic search

Plain-English queries over the embeddings. 'Organic oat milk', 'something sweet for breakfast' — the layer understands intent, not keywords.

  • One endpoint
  • Context-aware ranking
  • Filter + limit + order inline
Try it
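A minimal sketch of what the fetch() call above might look like. The host, header names, and body fields here are assumptions for illustration — only the endpoint path, the lens name, and the inline filter/limit/order options come from this page.

```javascript
// Hypothetical host and key — placeholders, not the real values.
const SEMILAYER_URL = "https://api.semilayer.example/v1/search";
const PUBLISHABLE_KEY = "pk_demo_123";

// Build the POST options for a plain-English query against the lens.
// Body field names (lens, query, filter, limit, order) are assumed.
function buildSearchRequest(query, { filter = {}, limit = 10, order = null } = {}) {
  return {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: `Bearer ${PUBLISHABLE_KEY}`,
    },
    body: JSON.stringify({ lens: "food_products", query, filter, limit, order }),
  };
}

// Usage (not executed here):
// fetch(SEMILAYER_URL, buildSearchRequest("organic oat milk", { limit: 5 }))
//   .then((r) => r.json())
//   .then(console.log);
```

Separating request construction from the fetch() call keeps the example testable and makes the inline filter + limit + order options visible in one place.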
POST /v1/query

Typed query

Ask in shape. Filters, orderBy, limit, offset — your lens becomes a read-only API over structured rows. Useful for grids, admin, pipelines.

  • All standard SQL-ish filters
  • Typed rows, mapped fields
  • Stable pagination
Try it
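A sketch of the typed-query shape, assuming body field names (filters, orderBy, limit, offset) that mirror the card above — the real API may spell them differently.

```javascript
// Hypothetical request builder: field names and filter syntax are assumptions.
function buildQueryRequest({ filters = {}, orderBy = "name", limit = 20, offset = 0 } = {}) {
  return {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: "Bearer pk_demo_123", // placeholder key
    },
    body: JSON.stringify({ lens: "food_products", filters, orderBy, limit, offset }),
  };
}

// Stable pagination: keep orderBy fixed and advance offset one page at a time.
function nextPage(request) {
  const body = JSON.parse(request.body);
  body.offset += body.limit;
  return { ...request, body: JSON.stringify(body) };
}
```

Pinning orderBy while walking offset is what makes pagination stable: each page is a consistent window over the same ordered rows.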
POST /v1/similar

Similar

Pass any record id, get its nearest neighbors in embedding space. No embedding API call — the seed's vector is already stored. Chain-click to explore.

  • Zero embedding cost per call
  • Click a result to re-seed
  • Declarative: which fields drive similarity
Try it
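A sketch of the seed-and-re-seed loop the card describes: pass a record id, then use a clicked result's id as the next seed. The body field names are assumptions.

```javascript
// Hypothetical request builder: only the id is sent — the seed's vector is
// already stored server-side, so there is no embedding call here.
function buildSimilarRequest(recordId, limit = 8) {
  return {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: "Bearer pk_demo_123", // placeholder key
    },
    body: JSON.stringify({ lens: "food_products", id: recordId, limit }),
  };
}

// "Chain-click to explore": re-seed the next call from a clicked result.
function reseed(previousRequest, clickedId) {
  const { limit } = JSON.parse(previousRequest.body);
  return buildSimilarRequest(clickedId, limit);
}
```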
POST /v1/feed

Feeds

Declarative ranking — similarity, recency, engagement, diversity. Likes stay on your device; the feed re-ranks live as you interact.

  • Three named feeds on one lens
  • "More like this" via recordVector
  • Likes in localStorage, no server write
Try it
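A sketch of the client-side like loop: likes never leave the device, and each feed request ships them as a re-rank signal. Field names and the feed name are assumptions; a Set stands in for localStorage so the example is self-contained.

```javascript
// Likes stay client-side. In the browser this would be backed by
// localStorage; a Set keeps the sketch runnable anywhere.
const likes = new Set();

function toggleLike(recordId) {
  likes.has(recordId) ? likes.delete(recordId) : likes.add(recordId);
}

// Hypothetical feed request: feed name and body fields are assumptions.
// Liked ids are sent as a signal on each call — no server-side write.
function buildFeedRequest(feedName = "for_you") {
  return {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: "Bearer pk_demo_123", // placeholder key
    },
    body: JSON.stringify({
      lens: "food_products",
      feed: feedName,
      liked: [...likes],
    }),
  };
}
```

Because the liked ids ride along with every request, the feed can re-rank live on each interaction without the server ever persisting them.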

About the dataset

The food_products lens points at a slice of the open Open Food Facts dataset — brands, categories, tags, descriptions, prices, inventory. Enough shape to show how a real product catalog behaves through SemiLayer: one config block up front, then every surface above composes against the same underlying table. Search, filter, and feed are all the same data.
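The "one config block up front" might look something like this. Everything here is a guess shaped by the dataset description above — the field names, keys, and structure are illustrative, not SemiLayer's actual config schema.

```javascript
// Hypothetical lens config — illustrative only, not the real schema.
const foodProductsLens = {
  name: "food_products",
  source: "open_food_facts_slice",          // assumed source identifier
  fields: ["brand", "category", "tags", "description", "price", "inventory"],
  // Declarative choice of which fields drive similarity (per the card above).
  embed: ["description", "tags"],
};
```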