
πŸ“Š SuperSkin Services: Data & API Strategy

**Version:** 2.0 | **Date:** January 2026 | **Status:** βœ… IMPLEMENTED | **Focus:** 6 production services with sport-aware ML predictions


Executive Summary​

This document details the data sources, APIs, and implementation approach for the 6 SuperSkin services that are now fully implemented. The architecture features:

  • Grok (xAI) as the primary LLM with native X Search and Web Search
  • Sport-aware ML models for football, cricket, and tennis
  • Graceful fallback to LLM intelligence for unsupported sports (basketball, esports, etc.)
  • Multi-source data from Oracle Platform (real-time) and Forsyt Data Machine (historical)

Key Architecture: ML models augment Grok's intelligence for supported sports. For unsupported sports, Grok provides its own analysis using odds-implied probabilities and its knowledge base.


βœ… What's Built (Current State)​

Core Infrastructure​

| Component | Location | Status | Purpose |
|---|---|---|---|
| Oddspapi Integration | oracle-platform/ | βœ… BUILT | Real-time odds from 300+ bookmakers |
| Redis Pub/Sub | SuperSkin services | βœ… BUILT | Event distribution & caching |
| TimescaleDB | SuperSkin infra | βœ… BUILT | Time-series data for charts |
| PostgreSQL | Supabase (IceCrystal) | βœ… BUILT | Historical match data warehouse |

SuperSkin Services (All Implemented)​

| # | Service | Port | Status | Purpose |
|---|---|---|---|---|
| 1 | Price Feed Aggregator | 3100 | βœ… BUILT | Normalized odds aggregation |
| 2 | Cash Out Calculator | 3101 | βœ… BUILT | Real-time cash-out quotes |
| 3 | AI Value Detection | 3102 | βœ… BUILT | Shin/Poisson/ELO value signals |
| 4 | AI Chat Assistant | 3103 | βœ… BUILT | Grok-powered RAG assistant |
| 5 | Trading Charts | 3104 | βœ… BUILT | OHLC candlestick data |
| 6 | ML Prediction Service | 3105 | βœ… BUILT | Sport-specific ML inference |

ML Training Pipeline (Offline)​

| Component | Status | Purpose |
|---|---|---|
| Data Loader | βœ… BUILT | Load football/cricket/tennis from Supabase |
| Feature Engineering | βœ… BUILT | Sport-specific feature creation |
| Model Training | βœ… BUILT | XGBoost, Random Forest, Neural Network |
| Model Registry | βœ… BUILT | Auto-discovery of sport-specific models |
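To illustrate what sport-specific feature creation looks like, here is a minimal sketch of a football feature row. The actual feature set lives in `feature_engineering.py`; the match/ELO dict shapes and the `football_features` name here are assumptions for illustration only:

```python
def football_features(match: dict, elo: dict) -> dict:
    """Build one feature row for a football match (illustrative sketch).

    `match` carries odds and recent results; `elo` maps team name to rating.
    """
    home, away = match["home"], match["away"]
    return {
        "elo_diff": elo[home] - elo[away],                 # rating gap
        "home_implied_prob": 1 / match["home_odds"],        # odds-implied features
        "draw_implied_prob": 1 / match["draw_odds"],
        "away_implied_prob": 1 / match["away_odds"],
        "home_form": sum(match["home_last5"]) / 5,          # points per game, last 5
        "away_form": sum(match["away_last5"]) / 5,
    }
```

A row like this would be produced per historical match and fed to the XGBoost/Random Forest/Neural Network trainers listed above.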

πŸ—ΊοΈ 6 Services Architecture​

| # | Service | Input | Output | ML Support |
|---|---|---|---|---|
| 1 | Price Feed Aggregator | Oddspapi odds | Normalized prices β†’ Redis | N/A |
| 2 | Cash Out Calculator | Positions + prices | Cash out quotes | N/A |
| 3 | AI Value Detection | Prices + historical | Value signals | Shin/Poisson/ELO |
| 4 | AI Chat Assistant | User queries | Analysis + predictions | βœ… Football/Cricket/Tennis |
| 5 | Trading Charts | Price history | OHLC endpoints | N/A |
| 6 | ML Prediction Service | Match features | Probabilities | βœ… Football/Cricket/Tennis |

Sport-Aware ML Architecture​


πŸ“‹ Service Implementation Details​

1️⃣ Price Feed Aggregator Service​

Purpose: Normalize odds from multiple bookmakers into a unified format

What it consumes:

Oddspapi WebSocket β†’ odds:updated:soccer channel
Redis Cache β†’ odds:soccer:{matchId}

What it produces:

Redis Pub/Sub β†’ prices:normalized:{sport}:{matchId}
Redis Cache β†’ prices:best:{matchId} (best back/lay per outcome)

Data Structure (Normalized Price):

```typescript
interface NormalizedPrice {
  matchId: string;
  sport: 'soccer' | 'cricket' | 'tennis';
  timestamp: Date;
  outcomes: {
    [outcomeId: string]: {
      name: string;  // "Home", "Draw", "Away"
      bestBack: { odds: number; bookmaker: string; limit: number };
      bestLay: { odds: number; bookmaker: string; limit: number };
      consensus: number;  // Weighted average
      allPrices: { bookmaker: string; back: number; lay: number }[];
    };
  };
  margin: number;     // Calculated overround
  sharpness: number;  // How close to Pinnacle
}
```

Implementation Steps:

  1. Subscribe to odds:updated:* from Redis Pub/Sub
  2. For each update, fetch full odds from cache
  3. Apply normalization (consensus calculation already exists in odds-aggregation.service.ts)
  4. Publish to prices:normalized:* channel
  5. Cache best prices with 5s TTL
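The normalization in steps 3 and 5 can be sketched as follows. This is a minimal Python sketch, not the production logic (which lives in `odds-aggregation.service.ts`); the per-bookmaker price dict shape mirrors `allPrices` above, and the unweighted consensus is a simplification of the sharpness-weighted average:

```python
from statistics import mean

def normalize_outcome(prices: list[dict]) -> dict:
    """Collapse per-bookmaker quotes for one outcome into best back/lay + consensus.

    `prices` is a list of {"bookmaker", "back", "lay"} dicts (assumed shape).
    """
    best_back = max(prices, key=lambda p: p["back"])  # highest back odds is best for backers
    best_lay = min(prices, key=lambda p: p["lay"])    # lowest lay odds is best for layers
    consensus = mean(p["back"] for p in prices)       # sketch: unweighted average
    return {
        "bestBack": {"odds": best_back["back"], "bookmaker": best_back["bookmaker"]},
        "bestLay": {"odds": best_lay["lay"], "bookmaker": best_lay["bookmaker"]},
        "consensus": round(consensus, 3),
    }

def overround(consensus_odds: list[float]) -> float:
    """Margin across a full market: sum of implied probabilities minus 1."""
    return sum(1 / o for o in consensus_odds) - 1
```

A fair market sums to an implied probability of exactly 1, so `overround` is 0 for fair odds and positive for any real bookmaker market.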

2️⃣ AI Value Detection Service​

Purpose: Identify bets where our odds offer positive expected value

What it consumes:

Redis Pub/Sub β†’ prices:normalized:{sport}:{matchId}
Internal odds β†’ From matching engine (Forsyt order book)

What it produces:

Redis Pub/Sub β†’ value:detected:{matchId}
WebSocket β†’ value:alert (to subscribed users)

Value Detection Algorithm:

```typescript
// Calculate fair odds from sharp books (Pinnacle-weighted)
function calculateFairOdds(normalizedPrice: NormalizedPrice): number {
  const pinnacleWeight = 0.6;
  const betfairWeight = 0.25;
  const consensusWeight = 0.15;

  // 1. Remove margins, calculate implied probabilities
  // 2. Weight by sharpness
  // 3. Return fair decimal odds
}

// Calculate edge for one outcome (ValueAlert and DataQuality types elided)
function calculateEdge(
  matchId: string,
  outcome: string,
  ourOdds: number,
  fairOdds: number,
  dataQuality: DataQuality
): ValueAlert {
  const edgePercent = ((ourOdds - fairOdds) / fairOdds) * 100;

  return {
    matchId,
    outcome,
    ourOdds,
    fairOdds,
    edge: edgePercent,
    confidence: calculateConfidence(dataQuality),
    recommendation:
      edgePercent > 3 ? 'STRONG_VALUE' :
      edgePercent > 1 ? 'VALUE' : 'NO_VALUE'
  };
}
```

Confidence Scoring:

  • Number of bookmakers (more = higher confidence)
  • Pinnacle included (yes = +20% confidence)
  • Time to event (closer = lower confidence due to volatility)
  • Historical accuracy of this market type
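The factors above can be combined into a single score. The weights below are illustrative assumptions, not the production formula; the function name `confidence_score` is hypothetical:

```python
def confidence_score(num_bookmakers: int, has_pinnacle: bool, hours_to_event: float) -> float:
    """Heuristic confidence in [0, 1] for a value signal (illustrative weights)."""
    score = min(num_bookmakers / 20, 1.0) * 0.5    # more books -> more confidence, capped
    if has_pinnacle:
        score += 0.2                                # sharp reference present
    score += min(hours_to_event / 48, 1.0) * 0.3    # closer to kickoff -> more volatile, less confident
    return round(min(score, 1.0), 2)
```

For example, 20 bookmakers including Pinnacle two days out scores 1.0, while a thin market minutes before kickoff scores much lower.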

3️⃣ AI Chat Assistant Service (Port 3103)​

Purpose: Natural language betting assistance with Grok-powered intelligence

What it consumes:

User context β†’ Balance, positions, bet history (from Supabase)
Market data β†’ All live matches, odds (from Redis cache)
Value detection β†’ Current value opportunities (from AI Value Detection)
ML predictions β†’ Sport-specific predictions (from ML Prediction Service)

What it produces:

Chat responses with:
- Match analysis with ML predictions (when available)
- Value bet recommendations
- Cash out suggestions
- Real-time X Search for news/injuries
- Web Search for additional context

LLM Provider Architecture:

| Priority | Provider | Model | Features | Cost |
|---|---|---|---|---|
| 1 | Grok (xAI) | grok-3 | X Search, Web Search, Function Calling | $5/1M tokens |
| 2 | Groq | llama-3.3-70b | Fast inference, Function Calling | FREE (30 req/min) |
| 3 | Ollama | llama3:8b | Local, Unlimited | FREE |
| 4 | OpenAI | gpt-4o-mini | Reliable backup | $0.15/1M tokens |
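The priority chain can be sketched as a simple loop. This is illustrative only: the `(name, call)` provider interface is an assumption, not the service's actual client API:

```python
def chat_with_fallback(prompt: str, providers: list) -> dict:
    """Try providers in priority order (Grok -> Groq -> Ollama -> OpenAI);
    return the first successful completion.

    `providers` is a list of (name, call) pairs where `call(prompt)` returns
    text or raises on failure (rate limit, timeout, auth error...).
    """
    errors = {}
    for name, call in providers:
        try:
            return {"provider": name, "text": call(prompt)}
        except Exception as exc:
            errors[name] = str(exc)   # record and fall through to next provider
    raise RuntimeError(f"all LLM providers failed: {errors}")
```

If Grok is rate-limited, the call transparently lands on Groq, and so on down the table.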

Grok-Specific Features:

Sport-Aware Response Flow:

```text
User: "Analyze Lakers vs Celtics tonight"
  ↓
Grok identifies: Basketball (unsupported sport)
  ↓
Grok calls: get_market_prices(match="Lakers vs Celtics")
  ↓
Grok calls: x_search("Lakers Celtics injury news")
  ↓
Grok calls: web_search("Lakers Celtics betting preview")
  ↓
Response: Analysis using odds-implied probabilities + X/Web context
          (No ML predictions - basketball not supported)
```

```text
User: "Analyze Liverpool vs Chelsea"
  ↓
Grok identifies: Football (supported sport)
  ↓
Grok calls: get_ml_predictions(sport="football", match="Liverpool vs Chelsea")
  ↓
Grok calls: get_value_signals(match="Liverpool vs Chelsea")
  ↓
Response: Analysis with ML predictions (72% confidence) + value signals
```

4️⃣ Trading Charts Backend​

Purpose: Store and serve historical price data for charting

What it consumes:

Redis Pub/Sub β†’ prices:normalized:{sport}:{matchId}
Every price update gets stored

What it produces:

REST API endpoints:
- GET /charts/{matchId}/ohlc?interval=1m&from=&to=
- GET /charts/{matchId}/depth (order book depth history)
- GET /charts/{matchId}/trades (matched order history)

Data Storage (TimescaleDB recommended):

```sql
-- Price ticks table (hypertable)
CREATE TABLE price_ticks (
  time        TIMESTAMPTZ NOT NULL,
  match_id    TEXT NOT NULL,
  outcome_id  TEXT NOT NULL,
  best_back   DECIMAL(10,3),
  best_lay    DECIMAL(10,3),
  consensus   DECIMAL(10,3),
  volume      DECIMAL(15,2)
);

SELECT create_hypertable('price_ticks', 'time');

-- OHLC continuous aggregates
CREATE MATERIALIZED VIEW ohlc_1m
WITH (timescaledb.continuous) AS
SELECT
  time_bucket('1 minute', time) AS bucket,
  match_id,
  outcome_id,
  first(best_back, time) AS open,
  max(best_back)         AS high,
  min(best_back)         AS low,
  last(best_back, time)  AS close,
  sum(volume)            AS volume
FROM price_ticks
GROUP BY bucket, match_id, outcome_id;
```

Alternative: PostgreSQL with pg_partman (if TimescaleDB is not available)
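The `ohlc_1m` aggregate can be mirrored in memory, which is handy for tests or for a plain-PostgreSQL fallback. A sketch, assuming ticks arrive as `(timestamp_seconds, best_back)` pairs for a single outcome:

```python
from collections import defaultdict

def ohlc(ticks: list[tuple[int, float]], bucket_seconds: int = 60) -> dict:
    """Bucket price ticks into OHLC candles, mirroring the ohlc_1m view.

    Returns {bucket_start: {"open", "high", "low", "close"}}.
    """
    buckets = defaultdict(list)
    for ts, price in sorted(ticks):                     # sort so first/last are time-ordered
        buckets[ts - ts % bucket_seconds].append(price)
    return {
        start: {"open": p[0], "high": max(p), "low": min(p), "close": p[-1]}
        for start, p in buckets.items()
    }
```

Three ticks inside minute 0 and one in minute 1 yield two candles, with open/close taken from the first and last tick of each bucket.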


5️⃣ Cash Out Calculator Service​

Purpose: Calculate real-time cash out quotes for open positions

What it consumes:

User positions β†’ From database (stake, odds, outcome)
Live prices β†’ From Redis cache (current best back/lay)

What it produces:

REST API:
- GET /cashout/{positionId}/quote
- GET /cashout/user/{userId}/all

WebSocket:
- cashout:quote:updated (when prices change)

Cash Out Calculation (already documented in codebase):

```typescript
interface CashOutQuote {
  positionId: string;
  originalStake: number;
  originalOdds: number;
  currentOdds: number;
  cashOutValue: number;
  profitLoss: number;
  returnPercent: number;
}

function calculateCashOut(position: Position, currentPrice: NormalizedPrice): CashOutQuote {
  // For BACK positions: cash out by laying the same outcome at the current price.
  // Formula: (Original Odds / Current Lay Odds) Γ— Stake
  // If the implied probability has risen (outcome now more likely to win), the
  // current lay odds are shorter than the original odds and cash out is profitable.
  const currentLayOdds = currentPrice.bestLay.odds;
  const cashOutValue = (position.odds / currentLayOdds) * position.stake;

  return {
    positionId: position.id,
    originalStake: position.stake,
    originalOdds: position.odds,
    currentOdds: currentLayOdds,
    cashOutValue,
    profitLoss: cashOutValue - position.stake,
    returnPercent: ((cashOutValue - position.stake) / position.stake) * 100
  };
}
```

Cash Out Options:

| Option | Description | Implementation |
|---|---|---|
| Full Cash Out | Close 100% of position | Single calculation |
| Partial Cash Out | Close X% (25/50/75%) | Scale proportionally |
| Auto Cash Out | Trigger at target value | Background job monitors |
| Cash Out Lock | Guarantee minimum value | Early exit protection |
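Partial cash out scales the full formula proportionally. A sketch using the same `(original odds / current lay odds) Γ— stake` formula as above (the `partial_cash_out` helper is hypothetical, not the service's API):

```python
def partial_cash_out(stake: float, original_odds: float,
                     current_lay_odds: float, fraction: float) -> dict:
    """Cash out `fraction` (e.g. 0.25/0.5/0.75) of a BACK position."""
    assert 0 < fraction <= 1
    closed_stake = stake * fraction                            # portion being closed
    value = (original_odds / current_lay_odds) * closed_stake  # full-cash-out formula, scaled
    return {
        "cashOutValue": round(value, 2),
        "remainingStake": round(stake - closed_stake, 2),
        "profitLoss": round(value - closed_stake, 2),
    }
```

For example, half-closing a 100 stake backed at 3.0 when the lay price has shortened to 2.0 returns 75 now and leaves 50 still riding.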

6️⃣ ML Prediction Service (Port 3105)​

Purpose: Serve sport-specific ML models for match outcome predictions

What it consumes:

Match features β†’ From Supabase (historical data)
Real-time odds β†’ From Price Feed Aggregator
Model files β†’ ONNX models from training pipeline

What it produces:

REST API:
- POST /predict (match features β†’ probabilities)
- GET /models (list available models)
- GET /health (service health + model status)

Sport-Aware Model Registry:

```python
# Model discovery and loading
import os

import onnxruntime


class ModelRegistry:
    def __init__(self):
        self.models = {}
        self._discover_models()

    def _discover_models(self):
        """Auto-discover models by sport"""
        for sport in ['football', 'cricket', 'tennis']:
            model_path = f"models/{sport}_ensemble.onnx"
            if os.path.exists(model_path):
                self.models[sport] = onnxruntime.InferenceSession(model_path)

    def predict(self, sport: str, features: dict) -> dict:
        if sport not in self.models:
            return {"error": f"No model for {sport}", "supported": list(self.models.keys())}

        # Run inference; `features` maps model input names to numpy arrays
        session = self.models[sport]
        probabilities = session.run(None, features)
        return {"probabilities": probabilities, "sport": sport}
```

Supported Sports:

| Sport | Model Status | Features Used | Accuracy |
|---|---|---|---|
| Football | βœ… Trained | Odds, ELO, Form, H2H | ~68% |
| Cricket | βœ… Trained | Odds, Venue, Toss, Form | ~65% |
| Tennis | βœ… Trained | Odds, Rankings, Surface, H2H | ~70% |
| Basketball | ❌ Not trained | - | - |
| Esports | ❌ Not trained | - | - |

Graceful Degradation:

When a sport is not supported, the service returns a structured response that allows Grok to provide analysis using its own intelligence:

```json
{
  "sport": "basketball",
  "model_available": false,
  "supported_sports": ["football", "cricket", "tennis"],
  "fallback_suggestion": "Use odds-implied probabilities and Grok analysis"
}
```
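Service-side, the degradation check can be as simple as the sketch below, where `run_model` stands in for the registry's inference call (the function name and shape are assumptions, not the actual handler):

```python
SUPPORTED_SPORTS = {"football", "cricket", "tennis"}

def prediction_response(sport: str, run_model) -> dict:
    """Return ML probabilities for supported sports, or the structured
    fallback payload so Grok can degrade to its own analysis."""
    if sport not in SUPPORTED_SPORTS:
        return {
            "sport": sport,
            "model_available": False,
            "supported_sports": sorted(SUPPORTED_SPORTS),
            "fallback_suggestion": "Use odds-implied probabilities and Grok analysis",
        }
    return {"sport": sport, "model_available": True, "probabilities": run_model(sport)}
```

Because the fallback is a structured payload rather than an error, the AI Chat Assistant can pass it straight to Grok as tool output.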

πŸ”Œ Existing API Integration (Already Built)​

Oddspapi (PRIMARY DATA SOURCE)​

Already integrated in oracle-platform:

  • REST API: odds-aggregation.service.ts - fetches odds from 40+ bookmakers
  • WebSocket: oddspapi-websocket.service.ts - real-time streaming
  • Caching: 5-minute TTL for odds, 5-second TTL for live prices

Endpoints Available:

| Endpoint | Purpose | Already Using? |
|---|---|---|
| GET /v4/odds | Fetch odds for fixture | βœ… Yes |
| GET /v4/bookmakers | List bookmakers | βœ… Yes |
| GET /v4/fixtures | List fixtures | βœ… Yes |
| WebSocket /v4/ws | Real-time updates | βœ… Yes |

Bookmaker Sharpness (for Value Detection):

| Bookmaker | Sharpness Score | Use For |
|---|---|---|
| Pinnacle | 1.0 (reference) | Fair value baseline |
| Betfair Exchange | 0.95 | Market consensus |
| Bet365 | 0.7 | Popular market |
| William Hill | 0.65 | Soft book comparison |
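These sharpness scores can drive a weighted fair-probability estimate. A sketch using proportional margin removal (normalising each book's implied probabilities to sum to 1); the value service's Shin method handles favourite-longshot bias more carefully, so treat this as illustrative:

```python
SHARPNESS = {"pinnacle": 1.0, "betfair": 0.95, "bet365": 0.7, "william_hill": 0.65}

def fair_probabilities(book_odds: dict[str, list[float]]) -> list[float]:
    """Sharpness-weighted fair probability per outcome.

    `book_odds` maps bookmaker -> decimal odds for every outcome of one market.
    """
    n = len(next(iter(book_odds.values())))
    weighted = [0.0] * n
    total_w = 0.0
    for book, odds in book_odds.items():
        w = SHARPNESS.get(book, 0.5)          # unknown books get a default weight (assumption)
        implied = [1 / o for o in odds]
        s = sum(implied)                       # the book's overround (> 1)
        for i, p in enumerate(implied):
            weighted[i] += w * (p / s)         # margin-free probability for this book
        total_w += w
    return [p / total_w for p in weighted]
```

Fair decimal odds for an outcome are then simply `1 / fair_probability`, which is what `calculateFairOdds` in the value detection service compares our odds against.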

πŸ“Š Data Flow Architecture​

Current Flow (Already Working)​

```text
β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”
β”‚                  CURRENT DATA FLOW (BUILT)                      β”‚
β”œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€
β”‚                                                                 β”‚
β”‚  Oddspapi WS ──→ oddspapi-websocket.service ──→ Redis Cache     β”‚
β”‚                            ↓                                    β”‚
β”‚                      Redis Pub/Sub                              β”‚
β”‚                            ↓                                    β”‚
β”‚            redis-match-subscriber.service (backend)             β”‚
β”‚                            ↓                                    β”‚
β”‚                 Socket.IO ──→ Frontend                          β”‚
β”‚                                                                 β”‚
β””β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”˜
```

New Services Integration​

```text
β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”
β”‚                  NEW SERVICES INTEGRATION                        β”‚
β”œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€
β”‚                                                                  β”‚
β”‚              Redis Pub/Sub (odds:updated:*)                      β”‚
β”‚                           ↓                                      β”‚
β”‚     β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”¬β”€β”€β”€β”€β”€β”€β”΄β”€β”€β”€β”€β”€β”€β”¬β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”                    β”‚
β”‚     ↓            ↓             ↓             ↓                   β”‚
β”‚  Price Feed    Charts       Cash Out    Value Detection          β”‚
β”‚  Aggregator    Backend      Calculator  Service                  β”‚
β”‚     ↓            ↓             ↓             ↓                   β”‚
β”‚  prices:*    TimescaleDB    cashout:*    value:*                 β”‚
β”‚                           ↓                                      β”‚
β”‚                   AI Chat Assistant                              β”‚
β”‚                                                                  β”‚
β””β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”˜
```

🎯 Implementation Status (All Complete)​

Build Order (Completed)​

```text
Phase 1 (Complete):
β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”
β”‚ 1. Price Feed Aggregator β”‚  βœ… Port 3100
β””β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”¬β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”˜
             ↓
Phase 2 (Complete - Parallel):
β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”  β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”  β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”
β”‚ 2. Cash Out Calc β”‚  β”‚ 3. AI Value Det  β”‚  β”‚ 4. Charts Backend β”‚
β”‚   βœ… Port 3101   β”‚  β”‚   βœ… Port 3102   β”‚  β”‚   βœ… Port 3104    β”‚
β””β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”˜  β””β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”˜  β””β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”˜
             ↓
Phase 3 (Complete):
β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”  β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”
β”‚ 5. AI Chat Assistant     β”‚  β”‚ 6. ML Prediction Service β”‚
β”‚   βœ… Port 3103           β”‚  β”‚   βœ… Port 3105           β”‚
β””β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”˜  β””β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”˜
```

Service Status​

| Service | Port | Status | Location |
|---|---|---|---|
| Price Feed Aggregator | 3100 | βœ… COMPLETE | superskin/services/price-feed-aggregator/ |
| Cash Out Calculator | 3101 | βœ… COMPLETE | superskin/services/cash-out-calculator/ |
| AI Value Detection | 3102 | βœ… COMPLETE | superskin/services/ai-value-detection/ |
| AI Chat Assistant | 3103 | βœ… COMPLETE | superskin/services/ai-chat-assistant/ |
| Trading Charts | 3104 | βœ… COMPLETE | superskin/services/trading-charts/ |
| ML Prediction Service | 3105 | βœ… COMPLETE | superskin/services/ml-prediction-service/ |

πŸ”§ Current Infrastructure​

Deployed Components​

| Component | Status | Configuration |
|---|---|---|
| Redis | βœ… Running | Port 6380 |
| TimescaleDB | βœ… Running | Port 5433 |
| PostgreSQL (Supabase) | βœ… Running | IceCrystal project |
| Grok API | βœ… Configured | Primary LLM |
| Groq API | βœ… Configured | Fallback LLM |

Environment Variables (Configured)​

```bash
# LLM Providers
XAI_API_KEY=...      # Grok (primary)
GROQ_API_KEY=...     # Groq (fallback)
OPENAI_API_KEY=...   # OpenAI (backup)

# Infrastructure
REDIS_URL=redis://localhost:6380
TIMESCALE_URL=postgres://localhost:5433/superskin
SUPABASE_URL=https://hchdnajxifkorhoqqozq.supabase.co
SUPABASE_SERVICE_KEY=...

# Oracle Platform
ODDSPAPI_API_KEY=...
```

πŸ“ Actual File Structure​

```text
superskin/
β”œβ”€β”€ services/
β”‚   β”œβ”€β”€ price-feed-aggregator/     # Port 3100 - TypeScript/Express
β”‚   β”‚   β”œβ”€β”€ src/
β”‚   β”‚   β”‚   β”œβ”€β”€ routes/
β”‚   β”‚   β”‚   β”œβ”€β”€ services/
β”‚   β”‚   β”‚   └── index.ts
β”‚   β”‚   └── package.json
β”‚   β”‚
β”‚   β”œβ”€β”€ cash-out-calculator/       # Port 3101 - TypeScript/Express
β”‚   β”‚   β”œβ”€β”€ src/
β”‚   β”‚   β”‚   β”œβ”€β”€ routes/
β”‚   β”‚   β”‚   β”œβ”€β”€ services/
β”‚   β”‚   β”‚   └── index.ts
β”‚   β”‚   └── package.json
β”‚   β”‚
β”‚   β”œβ”€β”€ ai-value-detection/        # Port 3102 - Python/FastAPI
β”‚   β”‚   β”œβ”€β”€ app/
β”‚   β”‚   β”‚   β”œβ”€β”€ routers/
β”‚   β”‚   β”‚   β”œβ”€β”€ services/
β”‚   β”‚   β”‚   └── main.py
β”‚   β”‚   └── requirements.txt
β”‚   β”‚
β”‚   β”œβ”€β”€ ai-chat-assistant/         # Port 3103 - Python/FastAPI
β”‚   β”‚   β”œβ”€β”€ app/
β”‚   β”‚   β”‚   β”œβ”€β”€ routers/
β”‚   β”‚   β”‚   β”œβ”€β”€ services/
β”‚   β”‚   β”‚   β”œβ”€β”€ llm/               # Grok, Groq, Ollama, OpenAI
β”‚   β”‚   β”‚   β”œβ”€β”€ tools/             # Function calling tools
β”‚   β”‚   β”‚   └── main.py
β”‚   β”‚   └── requirements.txt
β”‚   β”‚
β”‚   β”œβ”€β”€ trading-charts/            # Port 3104 - TypeScript/Express
β”‚   β”‚   β”œβ”€β”€ src/
β”‚   β”‚   β”‚   β”œβ”€β”€ routes/
β”‚   β”‚   β”‚   β”œβ”€β”€ services/
β”‚   β”‚   β”‚   └── index.ts
β”‚   β”‚   └── package.json
β”‚   β”‚
β”‚   └── ml-prediction-service/     # Port 3105 - Python/FastAPI
β”‚       β”œβ”€β”€ app/
β”‚       β”‚   β”œβ”€β”€ routers/
β”‚       β”‚   β”œβ”€β”€ services/
β”‚       β”‚   β”œβ”€β”€ models/            # ONNX model files
β”‚       β”‚   └── main.py
β”‚       └── requirements.txt
β”‚
β”œβ”€β”€ ml-training/                   # Offline training pipeline
β”‚   β”œβ”€β”€ data_loader.py
β”‚   β”œβ”€β”€ feature_engineering.py
β”‚   β”œβ”€β”€ train_models.py
β”‚   └── export_onnx.py
β”‚
└── docker-compose.yml             # Full stack deployment
```

βœ… Current Focus: ML Training Pipeline​

The 6 services are complete. Current work focuses on:

  1. Training sport-specific models for football, cricket, tennis
  2. Expanding data sources via Forsyt Data Machine
  3. Improving Grok integration with better prompts and tools
  4. Adding more sports to ML pipeline (basketball, esports)

This document reflects the current state of SuperSkin services as of January 2026.