Real-time GitHub Analytics with ClickHouse and Redpanda
I wanted to find a cool example to build a real-time analytical backend. One of my friends at a venture firm had created a real-time GitHub analytics tool their investors use to source potential open-source startup investment opportunities. I decided to whip up something similar to show how Moose can help you build this in a matter of minutes.
Overview of what I did
- Ingest GitHub events in real time through Redpanda streams
- Transform and enrich data with TypeScript
- Store data in ClickHouse tables
- Expose a fast, parameterized API for your frontend
- Generate a type-safe TypeScript SDK using OpenAPI Generator CLI
- Deploy everything to production with Boreal (backend) and Vercel (frontend)
Deployment steps are not in this post but are shown in the video tutorial starting at 9:27.
Preview the live dashboard here
Setup & Data Model
Let's start with the data model. Moose uses TypeScript interfaces to define the shape of your data. Here's how we model GitHub events and enriched repo star events:
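The exact fields in the template may differ; here is a minimal, self-contained sketch of what these interfaces could look like, with hypothetical field names chosen for illustration:

```typescript
// Hypothetical field names for illustration; the template's actual
// interfaces may differ.
export interface IGhEvent {
  eventType: string;  // e.g. "WatchEvent" (a star)
  actorLogin: string; // user who triggered the event
  repoName: string;   // "owner/repo"
  createdAt: Date;
}

// Enriched star event: extends the base event with repo metadata.
export interface IRepoStarEvent extends IGhEvent {
  repoDescription: string;
  repoTopics: string[];
  repoStars: number;
  repoLanguage: string;
}
```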
We use TypeScript interface extension to build the data model. IRepoStarEvent extends IGhEvent, adding all the rich metadata we want to track about starred repositories. This pattern makes it easy to enrich data as it flows through your pipeline: you just extend the base interface with new fields.
Ingesting Data
With these data models in place, I can declare my pipelines to support the ingest and transformation of this data. Here's how it looks:
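The real declaration imports IngestPipeline from the Moose library; the stand-in class below only mirrors the usage described in this post so the example is self-contained. Pipeline names and config keys are assumptions:

```typescript
// Stand-in for Moose's IngestPipeline, mirroring the shape described
// in this post; in a real project you would import it from the Moose
// library instead of defining it yourself.
class IngestPipeline<T> {
  constructor(
    public name: string,
    public config: { ingest: boolean; stream: boolean; table: boolean }
  ) {}
}

interface IGhEvent { eventType: string; repoName: string; createdAt: Date }
interface IRepoStarEvent extends IGhEvent { repoTopics: string[] }

// One declaration per type: HTTP endpoint + Redpanda stream +
// ClickHouse table, all derived from the same interface.
const ghEventPipeline = new IngestPipeline<IGhEvent>("GhEvent", {
  ingest: true,  // HTTP ingest endpoint
  stream: true,  // Redpanda topic
  table: true,   // ClickHouse table
});

const repoStarPipeline = new IngestPipeline<IRepoStarEvent>("RepoStarEvent", {
  ingest: false, // populated by the transform, not by direct ingest
  stream: true,
  table: true,
});
```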
Three lines, three systems wired together. IngestPipeline creates the ingest endpoint, stream, and table from the same type.
Processing / Transformation
See that function transformGHEvent that we added to our stream with addTransform? Let's take a look at that. Transformations are just TypeScript functions that take an event and return a new event with enriched data on the fly:
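A runnable sketch of the idea: the real transform calls GitHub via Octokit, but here the metadata lookup is injected as a parameter so the example works without network access. Field names are illustrative:

```typescript
interface IGhEvent { repoName: string; eventType: string; createdAt: Date }
interface IRepoStarEvent extends IGhEvent {
  repoDescription: string;
  repoTopics: string[];
  repoStars: number;
}

type RepoMeta = { description: string; topics: string[]; stars: number };

// The real transform fetches repo metadata with Octokit; here the
// lookup is injected so the sketch runs standalone.
async function transformGhEvent(
  event: IGhEvent,
  fetchRepoMeta: (repo: string) => Promise<RepoMeta>
): Promise<IRepoStarEvent> {
  const meta = await fetchRepoMeta(event.repoName);
  // Merge the base event with the enrichment fields and return it.
  return {
    ...event,
    repoDescription: meta.description,
    repoTopics: meta.topics,
    repoStars: meta.stars,
  };
}
```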
The transformer is plain TypeScript: pull repo metadata with Octokit, merge it into the event, return the result. Moose handles retries, back-pressure, and streaming semantics for you; you just focus on mapping input to output.
Exposing Results
We use ConsumptionAPI to expose the transformed data as a parameterized API. Here's the handler for the trending topics timeseries endpoint:
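To show the shape of such a handler, here is a self-contained sketch. The `sql` function below is a minimal stand-in for Moose's tagged template (the real one also type-checks columns and tables), and the query, parameter names, and table name are assumptions:

```typescript
// Minimal stand-in for Moose's `sql` tagged template: it captures the
// query text and its parameters. The real helper does much more.
function sql(strings: TemplateStringsArray, ...params: unknown[]) {
  return { text: strings.join("?"), params };
}

interface TopicTimeseriesParams {
  interval: string; // e.g. "hour" or "day"
  limit: number;
}

// Hypothetical handler shape: Moose wraps a function like this in a
// REST endpoint and derives the OpenAPI spec from its types.
function topicTimeseries({ interval, limit }: TopicTimeseriesParams) {
  return sql`
    SELECT
      toStartOfInterval(createdAt, INTERVAL 1 ${interval}) AS bucket,
      arrayJoin(repoTopics) AS topic,
      count() AS stars
    FROM RepoStarEvent
    GROUP BY bucket, topic
    ORDER BY bucket, stars DESC
    LIMIT ${limit}
  `;
}
```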
A normal function, but Moose turns it into a REST endpoint and OpenAPI spec automatically. sql embeds ClickHouse SQL with TypeScript types, so your IDE can autocomplete columns and table names while you write queries.
Type-Safe Frontend with OpenAPI Generator CLI
Here's where things get fun: Moose automatically generates an OpenAPI spec for your Ingest and Consumption APIs. I use the OpenAPI Generator CLI to turn that spec into a TypeScript fetch SDK. This SDK is imported directly into the Next.js frontend, so every API call is fully type-checked—no more guessing at request or response shapes.
To generate the SDK, just run:
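The invocation looks roughly like this; the spec path is a placeholder (use wherever your Moose project emits its OpenAPI file), while `-i`, `-g`, and `-o` are the generator's standard input, generator-name, and output flags:

```shell
# -i: input OpenAPI spec, -g: generator, -o: output directory.
# The spec path is a placeholder; point it at your project's file.
npx @openapitools/openapi-generator-cli generate \
  -i path/to/openapi.yaml \
  -g typescript-fetch \
  -o ./generated-client
```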
Now, in the frontend, you can call your backend API like this:
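A sketch of the call site. The generated SDK's class and method names come from your OpenAPI spec, so the names below are illustrative stand-ins (defined inline here so the example is self-contained):

```typescript
// Illustrative stand-ins for the generated fetch SDK; real names are
// derived from your OpenAPI spec by the generator.
interface TopicPoint { bucket: string; topic: string; stars: number }

class DefaultApi {
  constructor(private basePath: string) {}

  // Typed wrapper around the consumption endpoint: the caller gets a
  // compile-time-checked request and response shape.
  async topicTimeseries(req: { interval: string; limit: number }): Promise<TopicPoint[]> {
    const res = await fetch(
      `${this.basePath}/consumption/topicTimeseries?interval=${req.interval}&limit=${req.limit}`
    );
    return res.json();
  }
}

// Usage in a Next.js component or route handler:
// const api = new DefaultApi("http://localhost:4000");
// const points = await api.topicTimeseries({ interval: "hour", limit: 10 });
```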
And you get full type safety on both the request and the response, straight from your API models.
A sample JSON response:
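The field names below are illustrative, matching the hypothetical endpoint shape above rather than the template's exact output:

```json
[
  { "bucket": "2025-01-01T10:00:00Z", "topic": "llm", "stars": 42 },
  { "bucket": "2025-01-01T10:00:00Z", "topic": "rust", "stars": 31 },
  { "bucket": "2025-01-01T11:00:00Z", "topic": "llm", "stars": 57 }
]
```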
This lets the frontend render a real-time chart of trending topics, filtered and grouped by any interval, with zero risk of type mismatches.
Wrapping up
- Build a real-time analytics dashboard in < 100 lines
- Get type safety and infra automation out of the box
- Focus on business logic, not boilerplate
- Deploy to production in minutes
- Enjoy type-safe API calls from backend to frontend
Build it yourself
- Source code is on GitHub: 514-labs/moose/tree/main/templates/github-dev-trends
- Use the Moose CLI to download the project: `moose init your-app-name github-dev-trends`
- Run it: `moose dev`
- ⭐️ Consider giving us a star
- Join our Moose Community Slack to share what you build or ask questions!
Interested in learning more?
Sign up for our newsletter — we only send one when we have something actually worth saying.