How self.so Was Built in 30 Days: A Fast, Free Personal Site Generator
Keep it lean. Keep it clean.
✨Hey there, tech stackers! 🚀
🧠 Stack Spotlight: How Hassan Built self.so in Record Time
We love seeing devs ship fast—and few do it better than Hassan El Mghari (@nutlope). His latest project, self.so, lets you turn your resume or LinkedIn profile into a personal website in under 30 seconds.
Don’t worry if you don’t know what Redis or LLMs are—this tech stack just shows how one person used smart tools to build a beautiful, free personal website generator in 30 days. You don’t need to code to appreciate great design, fast execution, and ideas brought to life.
It’s clean. It’s smart. It’s open source. And best of all—it’s free.
Naturally, we had to dig into the tech that made it happen. Here’s the stack Hassan used to bring self.so to life 👇
🧱 Framework: Next.js
As usual, Hassan keeps it modern with Next.js. It’s fast, flexible, and perfect for building and shipping React apps at speed.
Why it works:
Server and client rendering
File-based routing
Built-in API routes
Perfect for a site that generates personalized pages on the fly.
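Those "personalized pages on the fly" come down to a dynamic route. Here's a minimal sketch of the App Router pattern — the file path, data shape, and handler are hypothetical, not self.so's actual code:

```typescript
// app/api/site/[username]/route.ts -- hypothetical path, for illustration only.
// Next.js App Router: the [username] segment matches any value in the URL,
// so this one file serves /api/site/hassan, /api/site/ada, etc.
import { NextResponse } from "next/server";

export async function GET(
  _req: Request,
  { params }: { params: Promise<{ username: string }> }
) {
  // In Next.js 15+ params is a Promise; older versions pass a plain object.
  const { username } = await params;
  // A real app would load this user's generated site data here.
  return NextResponse.json({ username, status: "ok" });
}
```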
☁ Hosting: Vercel
Built on Next.js? You’re probably deploying on Vercel—and that’s exactly what Hassan did.
Why Vercel fits:
One-click deploys
Instant rollbacks
Great DX for full-stack apps
Push to GitHub and you’re live in seconds.
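Git integration is the usual path, but the same flow works from the terminal with the standard Vercel CLI (not necessarily how Hassan deploys, just the generic pattern):

```shell
npm i -g vercel   # install the Vercel CLI
vercel            # deploy a preview build of the current directory
vercel --prod     # promote to production
```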
🧠 LLM: Together Compute
To power the resume-to-site transformation, Together Compute hosts and serves the large language model (LLM) under the hood.
Why it’s interesting:
Open weights + high performance
Cost-effective vs big-name APIs
Ideal for open-source LLM use cases
It’s what gives self.so its smarts.
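Together's API is OpenAI-compatible, so the resume-to-site call boils down to a standard chat completion. A hedged sketch, where the model name, prompt, and helper function are illustrative rather than self.so's actual code:

```typescript
// Endpoint and request shape follow Together's OpenAI-compatible chat API.
const TOGETHER_URL = "https://api.together.xyz/v1/chat/completions";

type ChatMessage = { role: "system" | "user"; content: string };

// Hypothetical helper: wrap resume text in a chat-completion request body.
function buildSiteRequest(resumeText: string) {
  const messages: ChatMessage[] = [
    { role: "system", content: "Turn this resume into website content as JSON." },
    { role: "user", content: resumeText },
  ];
  return {
    model: "meta-llama/Llama-3.3-70B-Instruct-Turbo", // example open-weights model
    messages,
  };
}

// Usage (requires a TOGETHER_API_KEY):
// await fetch(TOGETHER_URL, {
//   method: "POST",
//   headers: {
//     "Content-Type": "application/json",
//     Authorization: `Bearer ${process.env.TOGETHER_API_KEY}`,
//   },
//   body: JSON.stringify(buildSiteRequest(text)),
// });
```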
⚙️ LLM Framework: aiSDK
To make the LLM usable within the app, Hassan integrates aiSDK, a lightweight library for connecting apps to LLMs.
Why it’s useful:
Streamed responses for snappy UIs
Framework-agnostic
Works with Together, OpenAI, Anthropic, and more
Quick setup, smart results.
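With the SDK, the chat call above collapses into a few lines. A sketch assuming the `ai` and `@ai-sdk/togetherai` npm packages; the model name and helper are illustrative:

```typescript
import { generateText } from "ai";
import { createTogetherAI } from "@ai-sdk/togetherai";

// Provider client for Together's hosted models.
const togetherai = createTogetherAI({ apiKey: process.env.TOGETHER_API_KEY });

// Hypothetical helper: one call in, generated site content out.
export async function resumeToSite(resumeText: string) {
  const { text } = await generateText({
    model: togetherai("meta-llama/Llama-3.3-70B-Instruct-Turbo"), // example model
    system: "Turn this resume into website content as JSON.",
    prompt: resumeText,
  });
  return text;
}
```

Swapping providers later means changing one import and one model ID, which is the main appeal of going through the SDK.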
🔐 Authentication: Clerk
You don’t need to roll your own auth anymore. Hassan went with Clerk to handle sign-ins and user sessions.
What it brings:
Prebuilt UI components
Social + passwordless logins
Next.js support out of the box
Authentication in minutes, not days.
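Wiring Clerk into Next.js is mostly one middleware file. A sketch of the standard pattern; the protected-route matcher is hypothetical and self.so's actual routes may differ:

```typescript
// middleware.ts -- standard Clerk + Next.js middleware pattern.
import { clerkMiddleware, createRouteMatcher } from "@clerk/nextjs/server";

// Hypothetical: only the dashboard requires a signed-in user.
const isProtected = createRouteMatcher(["/dashboard(.*)"]);

export default clerkMiddleware(async (auth, req) => {
  if (isProtected(req)) await auth.protect(); // redirects to sign-in if needed
});

export const config = {
  // Run on everything except static files and Next internals.
  matcher: ["/((?!_next|.*\\..*).*)"],
};
```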
📊 Observability: Helicone
Debugging and monitoring LLM calls? Hassan used Helicone to track LLM usage and latency.
Why it’s helpful:
Real-time LLM observability
Tracks prompt/response performance
Works with multiple providers
Better visibility = faster debugging and better UX.
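Helicone works as a proxy: point your LLM client at Helicone's gateway instead of the provider, add one auth header, and every call gets logged with latency and cost. A sketch using the documented OpenAI-style gateway; the helper name is made up for illustration:

```typescript
// Requests go to Helicone's gateway, which forwards them to the provider
// and records the prompt, response, and timing along the way.
const HELICONE_GATEWAY = "https://oai.helicone.ai/v1/chat/completions";

// Hypothetical helper: provider auth plus the one extra Helicone header.
function observedHeaders(providerKey: string, heliconeKey: string) {
  return {
    "Content-Type": "application/json",
    Authorization: `Bearer ${providerKey}`,
    "Helicone-Auth": `Bearer ${heliconeKey}`, // ties the call to your Helicone project
  };
}
```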
📦 File Storage: Amazon S3
For uploading resumes and storing PDFs, Hassan used Amazon S3.
Why it makes sense:
Scalable object storage
Cheap and reliable
Battle-tested
It’s the cloud staple for a reason.
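The usual pattern for resume uploads is a presigned URL: the server signs a short-lived PUT URL and the browser uploads the PDF straight to S3. A sketch with AWS SDK v3, where the bucket, key, and region are placeholders:

```typescript
import { S3Client, PutObjectCommand } from "@aws-sdk/client-s3";
import { getSignedUrl } from "@aws-sdk/s3-request-presigner";

const s3 = new S3Client({ region: "us-east-1" }); // placeholder region

// Hypothetical helper: returns a URL the browser can PUT a PDF to.
export async function presignResumeUpload(bucket: string, key: string) {
  const cmd = new PutObjectCommand({
    Bucket: bucket,
    Key: key, // e.g. `resumes/${userId}.pdf`
    ContentType: "application/pdf",
  });
  return getSignedUrl(s3, cmd, { expiresIn: 60 }); // link valid for 60 seconds
}
```

The upload never touches your server, which keeps the serverless functions small and the bandwidth bill on S3's side.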
🧠 Database: Upstash Redis
self.so keeps it fast and simple with Upstash Redis as the main data layer.
Why it fits:
Serverless Redis with global distribution
Built for serverless and edge runtimes
Great for caching and session data
When milliseconds matter, Redis delivers.
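Upstash's client talks to Redis over REST, so it works from serverless functions without connection pooling. A sketch of a cache layer for generated sites, where the key scheme and TTL are invented for illustration:

```typescript
import { Redis } from "@upstash/redis";

// Reads UPSTASH_REDIS_REST_URL and UPSTASH_REDIS_REST_TOKEN from env.
const redis = Redis.fromEnv();

// Hypothetical key scheme: one entry per generated site.
export async function cacheSite(username: string, siteJson: string) {
  await redis.set(`site:${username}`, siteJson, { ex: 3600 }); // 1-hour TTL
}

export async function getCachedSite(username: string) {
  return redis.get<string>(`site:${username}`);
}
```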
🧰 TL;DR – self.so Tech Stack
| Function | Tool |
| --- | --- |
| Framework | Next.js |
| Hosting | Vercel |
| LLM | Together Compute |
| LLM Framework | aiSDK |
| Authentication | Clerk |
| Observability | Helicone |
| File Storage | Amazon S3 |
| Database | Upstash Redis |
Hassan continues to ship clean, useful products at an inspiring pace—and self.so is no exception.
Try it for yourself at self.so, or check out the GitHub repo if you want to tinker.
Built in public. Shared with the world. Just how we like it. 🚀
Always remember,
Explore broadly
Not just SaaS or tech, but multiple disciplines.
Share ideas selflessly
Be willing to share your knowledge. Keep giving and it will come back to you.
Your stack is not going to save you.
You are. So just build.
Go crush it,
Mark
Tech Stacks is human-compiled, edited, and designed