Documentation Index
Fetch the complete documentation index at: https://docs.neurometric.ai/llms.txt
Use this file to discover all available pages before exploring further.
Setup Guide
There are several ways to run the Inference Studio locally, depending on your workflow.
Using Docker Compose (Recommended)
The easiest way to get started is using Docker Compose, which sets up the Next.js application, the PostgreSQL database, and LocalStack as an S3 simulation.
Services:
- App: Running at http://localhost:3000 (the environment variable `AWS_S3_EXPERIMENT_BUCKET` is automatically set to `local-bucket`)
- PostgreSQL: At `localhost:5432` with user `postgres` / password `postgres`
- LocalStack (S3): Available at http://localhost:4566
  - AWS Access Key ID: `test`
  - AWS Secret Access Key: `test`
  - Default region: `us-east-1`
  - S3 bucket data persists in the `./s3-bucket/` directory
  - The default bucket `local-bucket` is automatically created on startup
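The services above correspond to a compose file along these lines. This is a minimal sketch only: the actual service names, images, and volume layout in the repository's docker-compose.yml may differ, and the `postgres:16` tag is an assumption.

```yaml
services:
  app:
    build: .
    ports:
      - "3000:3000"
    environment:
      AWS_S3_EXPERIMENT_BUCKET: local-bucket
      DATABASE_URL: postgresql://postgres:postgres@db:5432/studio_development
    depends_on:
      - db
      - localstack
  db:
    image: postgres:16          # version is an assumption
    ports:
      - "5432:5432"
    environment:
      POSTGRES_USER: postgres
      POSTGRES_PASSWORD: postgres
      POSTGRES_DB: studio_development
  localstack:
    image: localstack/localstack
    ports:
      - "4566:4566"
    environment:
      SERVICES: s3
    volumes:
      - ./s3-bucket:/var/lib/localstack   # persists bucket data on the host
```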
Interacting with LocalStack S3 Bucket:
You can interact with the LocalStack S3 bucket using AWS CLI commands via Docker.

Note: `DATABASE_URL` uses `db` as the hostname. The connection string is automatically set to: `postgresql://postgres:postgres@db:5432/studio_development`
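The AWS CLI commands mentioned above could look like the following. This is a sketch: it assumes the LocalStack container from the compose setup is running, uses the `amazon/aws-cli` image as one convenient way to run the CLI via Docker, and relies on `--network host` (Linux) to reach LocalStack on `localhost:4566`.

```shell
# List the contents of the default bucket through LocalStack's S3 endpoint
docker run --rm --network host \
  -e AWS_ACCESS_KEY_ID=test \
  -e AWS_SECRET_ACCESS_KEY=test \
  -e AWS_DEFAULT_REGION=us-east-1 \
  amazon/aws-cli --endpoint-url=http://localhost:4566 s3 ls s3://local-bucket

# Upload a local file into the bucket (mounts the current directory into the container)
docker run --rm --network host -v "$PWD:/work" \
  -e AWS_ACCESS_KEY_ID=test \
  -e AWS_SECRET_ACCESS_KEY=test \
  -e AWS_DEFAULT_REGION=us-east-1 \
  amazon/aws-cli --endpoint-url=http://localhost:4566 s3 cp /work/example.txt s3://local-bucket/
```

On macOS or Windows, `--network host` behaves differently; replacing `localhost` with `host.docker.internal` (and dropping `--network host`) is a common workaround.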
Hybrid Setup (Database in Docker, App Locally)
You can run the database in Docker while running the application locally for faster development.
Security & Encryption (MASTER_KEY)
The application uses `MASTER_KEY` to encrypt sensitive API credentials (such as Langfuse API keys) stored in the database.
Generate a new key:
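One common way to generate such a key, assuming OpenSSL is installed:

```shell
# 32 random bytes, hex-encoded -> exactly 64 hexadecimal characters
openssl rand -hex 32
```

Any other source of 32 cryptographically random bytes, hex-encoded, works equally well.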
- Must be exactly 64 hexadecimal characters (256 bits).
- Must be the same across all application and worker instances.
- Set it as an environment variable (`MASTER_KEY=your_64_character_hex_string_here`).
`docker-compose.yml` includes a default development key. For production, set `MASTER_KEY` in your environment before running.