RogueX
Conversational Crypto Intelligence Agent
ABOUT THE PROJECT
Most crypto tools make you do the thinking yourself. You check a price on one tab, pull up a chart on another, read someone's analysis on a third, and then try to synthesize all of it into a decision. The information exists; it is just scattered and presented as raw data rather than as something actionable. I built RogueX as an experiment to see if a single conversational interface could collapse that whole process into one place.
The system takes a natural language query, figures out what you are actually asking for, and routes it to the right handler. Asking for a price check goes through a fast path with no LLM involved. Asking for a technical breakdown routes to an analysis handler that gets the live market data as context before the LLM ever generates a response. Asking for a trading signal goes to a separate handler with a different system prompt tuned specifically for directional calls. The routing happens automatically based on how the query is parsed, so you just ask the question and get the right type of answer back.
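The routing described above can be sketched as a small classify-then-dispatch step. This is an illustrative sketch, not the actual n8n node configuration: the function names, the keyword patterns, and the handler bodies are all assumptions.

```javascript
// Hypothetical intent classifier: map a raw query to one of the three
// handler types described above. The regexes are illustrative stand-ins
// for the real query parsing.
function classifyIntent(query) {
  const q = query.toLowerCase();
  if (/\b(price|quote|worth)\b/.test(q)) return "price";    // fast path, no LLM
  if (/\b(signal|long|short|entry)\b/.test(q)) return "signal"; // directional-call prompt
  return "analysis";                                        // default: LLM breakdown
}

// Stand-in handlers; in the real workflow each is a separate n8n branch
// with its own system prompt (or, for price, no LLM at all).
const handlers = {
  price: (q) => `live price lookup for: ${q}`,
  analysis: (q) => `LLM analysis with market data as context: ${q}`,
  signal: (q) => `LLM signal with a directional-call prompt: ${q}`,
};

function route(query) {
  return handlers[classifyIntent(query)](query);
}
```

The point of the sketch is that the user never picks a mode; the classifier output alone decides which branch runs.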
The interesting engineering problem was the fallback chain. Both the analysis and signal handlers run Gemini as the primary model with Groq as the fallback. Rather than building retry logic, I put a second provider in the chain, which means the workflow never surfaces an error to the user when one provider is rate-limited or slow. The queue delay nodes before each LLM call were added for the same reason: without them, a burst of simultaneous queries would hit the provider rate limits immediately.
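In code, the chain amounts to a delay, a primary call, and a catch that routes to the fallback. This is a minimal sketch under stated assumptions: callGemini and callGroq are hypothetical wrappers around the two provider APIs (stubbed here, with the Gemini stub simulating a rate limit), and the delay value is illustrative.

```javascript
const sleep = (ms) => new Promise((resolve) => setTimeout(resolve, ms));

// Stand-in provider calls; the real workflow hits the Gemini and Groq APIs.
// Here the primary always fails so the fallback path is exercised.
async function callGemini(prompt) { throw new Error("429 rate limited"); }
async function callGroq(prompt) { return `groq: ${prompt}`; }

async function generateWithFallback(prompt, queueDelayMs = 200) {
  await sleep(queueDelayMs);           // queue delay before each LLM call
  try {
    return await callGemini(prompt);   // primary provider
  } catch {
    // Primary rate-limited or slow: fall through to the second provider
    // instead of retrying, so no error ever reaches the user.
    await sleep(queueDelayMs);         // delay again before the fallback call
    return await callGroq(prompt);
  }
}
```

The design choice is that a fallback to a different provider sidesteps the failure mode retries can't: retrying the same rate-limited provider just burns more of its quota.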
The other thing worth noting is the session context restoration step. Before any routing happens, the system pulls the metadata from the previous conversation so follow-up queries carry forward the coin and timeframe from earlier in the session. This was a small addition, but it made the experience feel much more like talking to an analyst than filling out a form.
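The restoration step can be sketched as a merge between the freshly parsed query and the stored session metadata. Assumptions here: the stored shape is `{ coin, timeframe }`, and an in-memory Map stands in for the Supabase session table the real build uses.

```javascript
// In-memory stand-in for the Supabase session store.
const sessions = new Map();

// Merge the current query's parsed fields over the previous turn's
// metadata: anything the follow-up omits is inherited from the session.
function restoreContext(sessionId, parsed) {
  const prev = sessions.get(sessionId) || {};
  const merged = {
    coin: parsed.coin ?? prev.coin,             // carry forward the coin
    timeframe: parsed.timeframe ?? prev.timeframe, // and the timeframe
  };
  sessions.set(sessionId, merged);              // persist for the next turn
  return merged;
}
```

So "analyze BTC on the 4h" followed by "what about the daily?" resolves the second query against BTC without the user restating it.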
The stack is n8n for orchestration, Supabase for session storage, Gemini and Groq as the LLM providers, and a custom frontend built in HTML, CSS, and JavaScript.