Standard escape codes are prefixed with Escape:
- Ctrl-Key: ^[
- Octal: \033
- Unicode: \u001b
- Hexadecimal: \x1B
- Decimal: 27
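The equivalences in the list above can be checked directly: every notation names the same single ESC byte, decimal 27. A minimal Python sketch:

```python
# All of these notations name the same one-byte ESC character (decimal 27).
ESC_OCTAL = "\033"
ESC_UNICODE = "\u001b"
ESC_HEX = "\x1B"
ESC_DECIMAL = chr(27)

assert ESC_OCTAL == ESC_UNICODE == ESC_HEX == ESC_DECIMAL

# Typical use: start an ANSI SGR sequence, e.g. bold text, then reset.
print(f"{ESC_HEX}[1mbold{ESC_HEX}[0m")
```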
| """ | |
| The most atomic way to train and inference a GPT in pure, dependency-free Python. | |
| This file is the complete algorithm. | |
| Everything else is just efficiency. | |
| @karpathy | |
| """ | |
| import os # os.path.exists | |
| import math # math.log, math.exp |
#!/usr/bin/env -S uv run --script
#
# /// script
# requires-python = ">=3.13"
# dependencies = [
#     "pymupdf",
# ]
# ///
import argparse
__wt_osc9_9 () {
  # Quote "$(pwd)" so Windows paths containing spaces survive word splitting.
  _win_path=$(wslpath -m "$(pwd)")
  printf "\033]9;9;%s\033\\" "$_win_path"
}
[ -n "$BASH_VERSION" ] && [ -n "$WT_SESSION" ] && PROMPT_COMMAND="__wt_osc9_9"
[ -n "$ZSH_VERSION" ] && [ -n "$WT_SESSION" ] && precmd_functions+=(__wt_osc9_9)
true
Host Enumeration:
--- OS Specifics ---
wmic os LIST full              (* to obtain the OS name, use the "caption" property)
wmic computersystem LIST full
--- Anti-Virus ---
wmic /namespace:\\root\securitycenter2 path antivirusproduct
| # Use this script to test that your Telegram bot works. | |
| # | |
| # Install the dependency | |
| # | |
| # $ gem install telegram_bot | |
| # | |
| # Run the bot | |
| # | |
| # $ ruby bot.rb | |
| # |
import { parse } from 'cache-control-parser';

export default {
  async fetch(request: Request, env: {}, ctx: ExecutionContext): Promise<Response> {
    try {
      const cache = await caches.default;
      const cachedResponse = await cache.match(request);
      if (cachedResponse) {
        console.log('Cache: HIT');
        if (shouldRevalidate(cachedResponse)) {
As with a lot of organisations, the idea of using LLMs is a reasonably frightening one, as people freely hand over internal IP and sensitive communications to remote entities that are, by their nature, heavily data-bound. I know it was on our minds when deciding on LLMs and their role within the team and the wider company. Six months ago, I set out to explore what the offerings were like in the self-hosted and/or OSS space, and whether anything could be achieved locally. After using this setup since then, and after getting a lot of questions about it, I thought I might share some of the things I've come across and how to get it all set up.
Cue Ollama and Continue. Ollama is an easy way to locally download, manage and run models. It's very similar to Docker in its usage, and can probably be most conceptually aligned with it in how it operates: think imag
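To make the Docker analogy concrete: after pulling a model with `ollama pull`, Ollama serves a local REST API, by default on `localhost:11434`. Below is a minimal Python sketch against its `/api/generate` endpoint; the `llama3` model name is illustrative, and this assumes a running `ollama serve` with that model already pulled.

```python
import json
import urllib.request

# Ollama's default local endpoint (assumption: stock install, default port).
OLLAMA_URL = "http://localhost:11434/api/generate"

def build_generate_payload(model: str, prompt: str) -> dict:
    """Build the request body for Ollama's /api/generate endpoint."""
    # stream=False asks for one complete JSON reply instead of a token stream.
    return {"model": model, "prompt": prompt, "stream": False}

def generate(model: str, prompt: str) -> str:
    """POST a prompt to the local Ollama server and return its reply text."""
    data = json.dumps(build_generate_payload(model, prompt)).encode()
    req = urllib.request.Request(
        OLLAMA_URL, data=data, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

# Usage (needs a live server): generate("llama3", "Say hello in one word.")
```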
| $ echo -en "message" | openssl dgst -sha256 -hmac "key" -binary | base64 | sed -e 's/+/-/g' -e 's/\//_/g' | tr -d = | |
| bp7ym3X__Ft6uuUn1Y_a2y_kLnIZARl2kXNDBl9Y7Uo |
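The same URL-safe HMAC can be produced in Python, which makes the `sed`/`tr` steps explicit: `urlsafe_b64encode` performs the `+`→`-` and `/`→`_` substitutions, and the trailing `=` padding is stripped by hand.

```python
import base64
import hashlib
import hmac

def hmac_sha256_urlsafe(key: bytes, message: bytes) -> str:
    # Raw HMAC-SHA256 digest, then URL-safe base64 without '=' padding,
    # mirroring: openssl dgst -sha256 -hmac -binary | base64 | sed ... | tr -d =
    digest = hmac.new(key, message, hashlib.sha256).digest()
    return base64.urlsafe_b64encode(digest).decode().rstrip("=")

print(hmac_sha256_urlsafe(b"key", b"message"))
# → bp7ym3X__Ft6uuUn1Y_a2y_kLnIZARl2kXNDBl9Y7Uo
```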
local luaversion = function()
  if ({false, [1] = true})[1] then -- luacheck: ignore 314
    return 'LuaJIT'
  elseif 1 / 0 == 1 / '-0' then
    return 0 + '0' .. '' == '0' and 'Lua 5.4' or 'Lua 5.3'
  end
  local f = function() return function() end end
  return f() == f() and 'Lua 5.2' or 'Lua 5.1'
end