# Quickstart

How to spin up the entire AnyaSelf stack locally.

## Prerequisites

Before running the AnyaSelf stack, ensure you have the following installed on your machine:

- Docker
- Docker Compose

## Running Locally

AnyaSelf uses a `docker-compose.yml` file to spin up all 8 microservices, including the API Gateway, Orchestrator, Wardrobe, and VTO.

### Configure Environment Variables

Copy the sample environment file to configure your local secrets:

```shell
cp .env.example .env
```

Ensure you set a valid `GCP_PROJECT_ID` and GCP credentials in your `.env` file before proceeding. The Orchestrator's Vertex AI Agent will crash on boot without them when `ORCHESTRATOR_REQUIRE_VERTEX_AGENT=true`.
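A minimal `.env` sketch is shown below. `GCP_PROJECT_ID` and `ORCHESTRATOR_REQUIRE_VERTEX_AGENT` come from this guide; `GOOGLE_APPLICATION_CREDENTIALS` is the standard GCP variable for a service-account key file and is an assumption here — confirm the exact keys your services read.

```shell
# .env — sketch only. The project ID and key-file path are placeholders.
GCP_PROJECT_ID=my-gcp-project
GOOGLE_APPLICATION_CREDENTIALS=/path/to/service-account.json

# Set to true to fail fast if the Vertex AI Agent cannot initialize.
ORCHESTRATOR_REQUIRE_VERTEX_AGENT=true
```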

### Build and Start Containers

Run the following command from the root of the `new_project` directory to build the images and start the services in detached mode:

```shell
docker-compose up --build -d
```

### Verify Services are Running

You can check the status of the containers by running:

```shell
docker-compose ps
```

Once running, the API Gateway will be accessible at `http://localhost:8080`.
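To confirm the gateway is actually serving traffic, you can probe it directly and tail an individual service's logs. The `/health` path below is an assumption — substitute whatever health endpoint your build exposes.

```shell
# Probe the gateway (the /health path is an assumption; adjust as needed).
curl -i http://localhost:8080/health

# Tail logs for a single service while debugging, e.g. the Orchestrator:
docker-compose logs -f orchestrator
```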

## Service Port Mapping

When running locally, the services are mapped to the following ports for easy debugging:

| Service | Port | Description |
| --- | --- | --- |
| api-gateway | 8080 | Main entry point for External Auth and Gemini Websockets. |
| wardrobe | 8081 | Manages Items, Outfits, and Discover Feed endpoints. |
| commerce | 8002 | Tracks synced Fashion Offers and semantic search schemas. |
| orchestrator | 8003 | Hosts the LangChain Vertex AI Agent Missions. |
| vto | 8004 | Virtual Try-On inference (runs in simulation mode by default). |
| headless-cartprep | 8005 | Queues and manages headless checkout jobs. |
| hyperbeam-bridge | 8006 | Handles asynchronous Chromium sessions and Agent extension events. |
| artifacts-audit | 8007 | Ledger for generation plans, transcripts, and Household audits. |

> [!NOTE]
> By default, `vto` runs with `VTO_INFERENCE_BACKEND=simulated` and `wardrobe` runs with `WARDROBE_STORAGE_BACKEND=stub`. This allows you to run the stack without needing heavy cloud credentials for every persistence or GPU layer.
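If you want to override these defaults without editing the tracked `docker-compose.yml`, a `docker-compose.override.yml` is a common pattern — Compose merges it automatically. This is a sketch only: `ORCHESTRATOR_REQUIRE_VERTEX_AGENT` is documented above, but any backend values beyond the documented defaults are assumptions to check against each service's docs.

```yaml
# docker-compose.override.yml — sketch only; merged automatically by Compose.
services:
  orchestrator:
    environment:
      # Fail fast on boot if the Vertex AI Agent cannot initialize.
      ORCHESTRATOR_REQUIRE_VERTEX_AGENT: "true"
  vto:
    environment:
      # Default is "simulated"; other backend values are service-specific.
      VTO_INFERENCE_BACKEND: simulated
```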
