Venkat Madala (vmadala2020)
championswimmer / how-ai-agents-are-made.md
Last active February 24, 2026 15:25
How Personal AI Agents and Agent Orchestrators like OpenClaw or GasTown are Made

Over the last few months, projects like Gas Town by Steve Yegge and OpenClaw by Peter Steinberger have made “AI agent orchestrators” feel suddenly mainstream. It is tempting to treat them as a new kind of intelligence, but under the hood they are still a small set of primitives wired together with discipline: an LLM API call, a state loop, tools, memory, and orchestration.

This raises a practical question: what is actually inside an “agent,” and how is it different from ChatGPT (a chat UI over a model) or coding tools like Claude Code (an agentic coding surface)? Gas Town’s README frames it as a “multi‑agent orchest…
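The primitives named above (an LLM API call, a state loop, tools, memory) can be sketched in a few lines. This is a minimal illustration, not code from OpenClaw or Gas Town; `call_llm` is a stub standing in for a real LLM API, and all names here are hypothetical.

```python
from typing import Callable

# Tool registry: plain functions the agent is allowed to invoke.
TOOLS: dict[str, Callable[..., str]] = {
    "add": lambda a, b: str(a + b),
}

def call_llm(messages: list[dict]) -> dict:
    """Stub for a real LLM API call.

    Returns one of the two action types a typical agent loop
    distinguishes: a tool request, or a final answer."""
    last = messages[-1]["content"]
    if last.startswith("tool_result:"):
        return {"type": "final", "content": f"The answer is {last.split(':')[1]}"}
    return {"type": "tool", "name": "add", "args": {"a": 2, "b": 3}}

def run_agent(user_prompt: str, max_steps: int = 5) -> str:
    # "Memory" is just the growing message list passed back each turn.
    messages = [{"role": "user", "content": user_prompt}]
    for _ in range(max_steps):                    # the state loop
        action = call_llm(messages)
        if action["type"] == "final":             # model is done
            return action["content"]
        result = TOOLS[action["name"]](**action["args"])  # tool call
        messages.append({"role": "tool", "content": f"tool_result:{result}"})
    return "step limit reached"

print(run_agent("What is 2 + 3?"))  # prints "The answer is 5"
```

An orchestrator, in this framing, is largely this loop run many times over, with routing between multiple such agents layered on top.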

karpathy / microgpt.py
Last active February 24, 2026 23:22
microgpt
"""
The most atomic way to train and run inference for a GPT in pure, dependency-free Python.
This file is the complete algorithm.
Everything else is just efficiency.
@karpathy
"""
import os # os.path.exists
import math # math.log, math.exp
LaurenceJJones / init-script.sh
Last active January 16, 2025 12:06
crowdsec init script
#!/bin/bash
##########
## Maintained by Laurence from CrowdSec
## Discord: https://discord.gg/crowdsec
## Website: https://www.crowdsec.net/
## Docs: https://docs.crowdsec.net/
##########
# Linode users can use the UI to change these variables
# DigitalOcean users: uncomment and change these variables
rain-1 / LLM.md
Last active February 24, 2026 02:03
LLM Introduction: Learn Language Models

Purpose

Bootstrap knowledge of LLMs ASAP, with a bias/focus toward GPT.

Avoid being a link dump; try to provide only valuable, well-tuned information.

Prelude

Neural-network links to read before starting with transformers.