lucataco / SKILL.md
Created February 7, 2026 20:11
openclaw-introspect skill
name: openclaw-introspect
description: Explore, understand, and reconfigure your own OpenClaw gateway, agent harness, and system prompt. Use when you need to inspect or change OpenClaw configuration (openclaw.json), understand how the system prompt is built, debug session/channel/model issues, navigate the docs or source code, or tune agent defaults (models, thinking, sandbox, tools, heartbeat, compaction, channels, skills, plugins, cron, hooks). Also use for questions about OpenClaw architecture, the agent loop, context window, or how any OpenClaw feature works internally.

OpenClaw Self-Introspection

Explore and reconfigure your own harness. This skill gives you structured knowledge about the OpenClaw internals so you can inspect, debug, and tune the running gateway.
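As a rough illustration of the kind of inspection this enables, the sketch below reads the gateway config with jq; the config path and the agent.* keys are assumptions for illustration, not documented OpenClaw fields.

# Hypothetical sketch: the path and keys below are assumptions, not confirmed
# OpenClaw settings — check your own openclaw.json for the real layout.
CONFIG="$HOME/.openclaw/openclaw.json"

# Dump the whole config, then pull out a couple of likely agent defaults.
jq . "$CONFIG"
jq '.agent.model, .agent.sandbox' "$CONFIG"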

Quick commands

lucataco / SKILL.md
Last active February 3, 2026 02:20
Codex Replicate Skill
name: replicate-mcp
description: Configure and validate Replicate MCP connectivity in Codex using REPLICATE_API_TOKEN and the official replicate-mcp server package. Use when setting up Replicate MCP for the first time, reconnecting after auth/config changes, or troubleshooting missing Replicate MCP tools.

Replicate MCP

Use this skill to set up and verify Replicate MCP access in Codex with minimal back-and-forth.
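A minimal first-run sketch, assuming the server package is runnable via npx under the name the skill references and that your API token already exists on Replicate:

# Export the token the MCP server needs (value is a placeholder).
export REPLICATE_API_TOKEN="r8_..."

# Smoke-test that the replicate-mcp server package starts with the token set;
# it should launch and wait for an MCP client on stdio (Ctrl-C to stop).
npx -y replicate-mcp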

Workflow

lucataco / launchable.sh
Created January 8, 2026 06:29
brev.dev
#!/bin/bash
# Update package lists
sudo apt-get update
# Install the CUDA toolkit (non-interactive)
sudo apt-get install -y nvidia-cuda-toolkit
# Install cog (Replicate's container tool) into /usr/local/bin
sudo curl -o /usr/local/bin/cog -L "https://github.com/replicate/cog/releases/latest/download/cog_$(uname -s)_$(uname -m)"
sudo chmod +x /usr/local/bin/cog
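A quick sanity check after the script finishes (the NVIDIA driver is assumed to ship with the Brev GPU image rather than being installed here):

nvcc --version    # CUDA toolkit from the apt install above
cog --version     # cog binary downloaded to /usr/local/bin
nvidia-smi        # driver/GPU visibility on the instance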
lucataco / Cloudflare
Created December 16, 2025 04:02
Ghostty Cloudflare theme
palette = 0=#23272e
palette = 1=#f38020
palette = 2=#a8e6a3
palette = 3=#faae40
palette = 4=#4da6ff
palette = 5=#ff80ab
palette = 6=#66d9ef
palette = 7=#c0c5ce
palette = 8=#4f5b66
palette = 9=#f38020
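To apply the theme, drop the file into Ghostty's themes directory and reference it by name; the paths below assume the default XDG config location and are not part of the gist itself.

# Assumed default config locations; adjust if your Ghostty install differs.
mkdir -p ~/.config/ghostty/themes
cp Cloudflare ~/.config/ghostty/themes/Cloudflare
printf 'theme = Cloudflare\n' >> ~/.config/ghostty/config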
lucataco / daemon.json
Created October 19, 2025 18:02
docker fix on Brev.dev Crusoe GPUs to work with Replicate models
# Replace /etc/docker/daemon.json docker config in Brev.dev Crusoe GPUs
{
  "default-runtime": "nvidia",
  "mtu": 1500,
  "runtimes": {
    "nvidia": {
      "args": [],
      "path": "nvidia-container-runtime"
    }
  },
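After replacing the file, Docker has to be restarted for the nvidia default runtime and the MTU to take effect:

sudo systemctl restart docker
docker info --format '{{.DefaultRuntime}}'   # expect: nvidia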
lucataco / kokoro
Created October 9, 2025 23:23
local CLI tool to run Kokoro TTS on Apple silicon (MBP)
#!/bin/bash
# Local CLI command that lets you use Kokoro TTS on a MacBook Pro.
# Requires the Kokoro docker container to be running first:
#   docker run -p 8880:8880 ghcr.io/remsky/kokoro-fastapi-cpu:latest
# Then save this file to /usr/local/bin (as `kokoro`) and make it executable.
# Finally you can test:
#   kokoro "The quick brown fox jumped over the lazy dog"
# Or even pipe from a stream like:
#   llm "tell me a joke" | kokoro
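Before installing the wrapper, it is worth confirming the container above is reachable; this only checks that something is listening on the published port, since the API routes themselves are not shown in this preview.

# Reachability check only — the wrapper's actual request format is not part
# of this preview.
nc -z localhost 8880 && echo "Kokoro API reachable on :8880"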
lucataco / hf.py
Last active March 5, 2025 18:41
Run Wan2.1-T2V-1.3B-Diffusers on your Mac
# Setup:
# conda create -n wan python=3.10
# conda activate wan
# pip3 install torch torchvision torchaudio
# pip install git+https://github.com/huggingface/diffusers.git@3ee899fa0c0a443db371848a87582b2e2295852d
# pip install accelerate==1.4.0
# pip install transformers==4.49.0
# pip install ftfy==6.3.1
lucataco / docker-compose.yaml
Created February 8, 2025 02:59
Coolify Pihole & Unbound script
services:
  pihole-unbound:
    image: 'bigbeartechworld/big-bear-pihole-unbound:2024.07.0'
    environment:
      - SERVICE_FQDN_PIHOLE_8080
      - SERVICE_FQDN_PIHOLE_10443
      - 'DNS1=127.0.0.1#5353'
      - DNS2=no
      - TZ=America/Chicago
      - WEBPASSWORD=$SERVICE_PASSWORD_PIHOLE
lucataco / run.py
Last active August 6, 2024 20:56
Flux-Schnell Optimum Quanto
from optimum.quanto import freeze, qfloat8, quantize
from diffusers import FluxPipeline
import torch
import time
seed=1337
generator = torch.Generator("cuda").manual_seed(seed)
pipeline = FluxPipeline.from_pretrained("black-forest-labs/FLUX.1-schnell", torch_dtype=torch.bfloat16).to("cuda")
lucataco / predict.py
Last active January 31, 2025 20:28
Flux Schnell locally on MPS
# conda create -n flux python=3.11
# conda activate flux
# pip install torch==2.3.1
# pip install diffusers==0.30.0 transformers==4.43.3
# pip install sentencepiece==0.2.0 accelerate==0.33.0 protobuf==5.27.3
import torch
from diffusers import FluxPipeline
import diffusers