# Getting Started
This guide will help you set up Retrieva for local development.
## Prerequisites
Before you begin, ensure you have:
- Node.js 20+ (`node --version`)
- npm 10+ (`npm --version`)
- Docker and Docker Compose (for infrastructure services)
- Azure OpenAI resource with API access
## Infrastructure Setup

### 1. Start Required Services
The platform requires MongoDB, Redis, and Qdrant. Use Docker Compose:
```bash
docker-compose up -d
```
This starts:
- MongoDB: `mongodb://localhost:27017`
- Redis: `redis://localhost:6378`
- Qdrant: `http://localhost:6333`
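
For reference, a minimal `docker-compose.yml` matching the defaults above might look like this. The image tags are assumptions; the container names (`rag-mongodb`, `rag-redis`) come from the troubleshooting commands later in this guide:

```yaml
services:
  mongodb:
    image: mongo:7
    container_name: rag-mongodb
    ports:
      - "27017:27017"
  redis:
    image: redis:7
    container_name: rag-redis
    ports:
      - "6378:6379"   # host port 6378 maps to Redis's default 6379
  qdrant:
    image: qdrant/qdrant
    container_name: rag-qdrant
    ports:
      - "6333:6333"
```

If your repository already ships a compose file, prefer that one; this sketch only illustrates the port mappings implied by the URLs above.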
### 2. Azure OpenAI Setup
The platform uses Azure OpenAI for LLM and embeddings. You'll need:
- An Azure OpenAI resource with deployments for `gpt-4o-mini` and `text-embedding-3-small`
- API key and endpoint from Azure Portal
See Environment Variables for detailed configuration.
## Backend Setup

### 1. Install Dependencies
```bash
cd backend
npm install --legacy-peer-deps
```
The `--legacy-peer-deps` flag is required due to peer dependency conflicts with Express 5.
### 2. Configure Environment
Copy the example environment file:
```bash
cp .env.example .env
```
Edit `.env` with your configuration:
```bash
# Server
PORT=3007
NODE_ENV=development

# MongoDB
MONGODB_URI=mongodb://localhost:27017/enterprise_rag

# Redis
REDIS_URL=redis://localhost:6378

# Qdrant
QDRANT_URL=http://localhost:6333
QDRANT_COLLECTION_NAME=documents

# Azure OpenAI (REQUIRED)
LLM_PROVIDER=azure_openai
EMBEDDING_PROVIDER=azure
AZURE_OPENAI_API_KEY=your-azure-openai-api-key
AZURE_OPENAI_ENDPOINT=https://your-resource.openai.azure.com
AZURE_OPENAI_LLM_DEPLOYMENT=gpt-4o-mini
AZURE_OPENAI_EMBEDDING_DEPLOYMENT=text-embedding-3-small
AZURE_OPENAI_API_VERSION=2024-02-15-preview

# JWT (generate with: openssl rand -base64 48)
JWT_ACCESS_SECRET=your-super-secret-jwt-key
JWT_REFRESH_SECRET=your-super-secret-refresh-key

# Encryption Key (generate with: openssl rand -hex 32)
ENCRYPTION_KEY=your-64-character-hex-encryption-key

# Frontend URL (for OAuth redirects)
FRONTEND_URL=http://localhost:3000

# Compliance monitoring
MONITORING_INTERVAL_HOURS=24
INSTITUTION_NAME=Financial Entity

# Email (optional for local dev — emails will be skipped if not set)
RESEND_API_KEY=
SMTP_FROM_NAME=Retrieva
RESEND_FROM_EMAIL=noreply@yourdomain.com
```
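
The secrets must have the exact shapes the comments describe. A quick way to generate and sanity-check them:

```bash
# Generate secrets using the commands referenced in the comments above.
JWT_ACCESS_SECRET=$(openssl rand -base64 48)
JWT_REFRESH_SECRET=$(openssl rand -base64 48)
ENCRYPTION_KEY=$(openssl rand -hex 32)

# ENCRYPTION_KEY must be exactly 64 hex characters (32 random bytes).
echo "${#ENCRYPTION_KEY}"   # prints 64
```

Paste the generated values into `.env`; do not commit them to version control.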
### 3. Seed the Compliance Knowledge Base
Before running DORA assessments, seed the DORA articles into the compliance knowledge base:
```bash
npm run seed:compliance
```
This command is idempotent — safe to run multiple times.
### 4. Start Development Server

```bash
npm run dev
```

The backend will be available at `http://localhost:3007`.
### 5. Verify Setup

Check the health endpoint:

```bash
curl http://localhost:3007/health
```
Expected response:
```json
{
  "status": "success",
  "message": "Service is healthy",
  "data": {
    "status": "up",
    "timestamp": "2024-01-01T00:00:00.000Z",
    "uptime": 123.456
  }
}
```
## Frontend Setup

### 1. Install Dependencies
```bash
cd frontend
npm install
```
### 2. Configure Environment

```bash
cp .env.example .env.local
```

Edit `.env.local`:
```bash
NEXT_PUBLIC_API_URL=http://localhost:3007/api/v1
NEXT_PUBLIC_WS_URL=http://localhost:3007
NEXT_PUBLIC_APP_NAME=Retrieva
```
### 3. Start Development Server

```bash
npm run dev
```

The frontend will be available at `http://localhost:3000`.
## Running Tests

### Backend Tests
```bash
cd backend

# Unit tests
npm run test:unit

# Integration tests
npm run test:integration

# All tests
npm test
```
### Frontend Tests
```bash
cd frontend
npm run test
```
## Common Commands

### Backend
```bash
# Development with auto-reload
npm run dev

# Production
npm start

# Qdrant utilities
npm run qdrant:list         # List documents
npm run qdrant:info         # Collection info
npm run qdrant:collections  # List collections

# Seed compliance knowledge base
npm run seed:compliance
npm run seed:compliance:reset  # Wipe and re-seed
```
### Frontend
```bash
# Development
npm run dev

# Build for production
npm run build

# Lint
npm run lint
```
## Troubleshooting

### Azure OpenAI Connection Issues
If Azure OpenAI fails to connect:
- Verify your API key is correct in `.env`
- Check the endpoint URL format: `https://your-resource.openai.azure.com`
- Ensure deployments exist for both LLM and embedding models
- Check Azure Portal for rate limiting or quota issues
```bash
# Test Azure OpenAI connectivity
curl -X POST "${AZURE_OPENAI_ENDPOINT}/openai/deployments/${AZURE_OPENAI_LLM_DEPLOYMENT}/chat/completions?api-version=2024-02-15-preview" \
  -H "Content-Type: application/json" \
  -H "api-key: ${AZURE_OPENAI_API_KEY}" \
  -d '{"messages":[{"role":"user","content":"Hello"}]}'
```
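
A 404 from this request usually means the endpoint URL is malformed rather than the key being wrong. Echo the fully expanded URL and compare it against the expected pattern (the values below are placeholders, not real resource names):

```bash
# Placeholder values; substitute your own from .env.
AZURE_OPENAI_ENDPOINT="https://your-resource.openai.azure.com"
AZURE_OPENAI_LLM_DEPLOYMENT="gpt-4o-mini"

# The path must be /openai/deployments/<deployment>/chat/completions
# with an api-version query parameter.
URL="${AZURE_OPENAI_ENDPOINT}/openai/deployments/${AZURE_OPENAI_LLM_DEPLOYMENT}/chat/completions?api-version=2024-02-15-preview"
echo "$URL"
# https://your-resource.openai.azure.com/openai/deployments/gpt-4o-mini/chat/completions?api-version=2024-02-15-preview
```

Common mistakes are a trailing slash on `AZURE_OPENAI_ENDPOINT` (producing `//openai/...`) or using the model name instead of your deployment name in the path.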
### MongoDB Connection Issues
```bash
# Check MongoDB status
docker exec -it rag-mongodb mongosh --eval "db.adminCommand('ping')"

# View logs
docker logs rag-mongodb
```
### Redis Connection Issues
```bash
# Check Redis status
docker exec -it rag-redis redis-cli ping

# View logs
docker logs rag-redis
```
### Qdrant Issues
```bash
# Check Qdrant health
curl http://localhost:6333/health

# List collections
curl http://localhost:6333/collections
```
## First Login and Onboarding
Retrieva uses an organization-first B2B model. Every user must belong to a company account before accessing the dashboard.
### Scenario A — Creating a new organization (first user / CRO)
- Register at `/register` — no invite token needed
- You are redirected to `/onboarding`
- Fill in your company name, industry, and country → submit
- You land on `/assessments` and the sidebar shows your company name above the workspace switcher
- Go to Settings → Team to invite colleagues
### Scenario B — Joining via invite (team member)
- Admin invites you via Settings → Team → Invite → your email address
- You receive an email: "Maria invited you to join HDI Global SE on Retrieva"
- Click the link → `/join?token=XXX`
- If you're not registered yet, you're redirected to `/register?token=XXX&email=you@company.com` — the email is pre-filled
- Complete registration → you land directly on `/assessments` (no `/onboarding` step)
- All vendor workspaces of the organization are immediately visible
Your org role (`org_admin`, `analyst`, `viewer`) maps to workspace permissions automatically. See Organizations API for the full mapping table.
## Next Steps
- Architecture Overview — Understand the system design
- Organizations API — Team onboarding and invitation endpoints
- API Reference — Explore available endpoints
- Background Workers — BullMQ worker reference
- Environment Variables — Full configuration reference