https://x.com/VictorTaelin/status/2001777678765129832
* ▐▛███▜▌ * Claude Code v2.0.72
* ▝▜█████▛▘ * Opus 4.5 · Claude Max
* ▘▘ ▝▝ * ~/vic/dev/hvm4-claude
> # Task 1: print unscoped lambdas and floating dups with matching names
Relax, I only have one Sunday to work on this idea; it is literally my weekend project. So I tried DeepSeek to see if it could help. Surprisingly, it works, and it saves me another weekend...
Just chat.deepseek.com (cost = free) with prompts adapted from this gist.
[
  "fa-classic fa-solid fa-0 fa-fw",
  "fa-classic fa-solid fa-1 fa-fw",
  "fa-classic fa-solid fa-2 fa-fw",
  "fa-classic fa-solid fa-3 fa-fw",
  "fa-classic fa-solid fa-4 fa-fw",
  "fa-brands fa-42-group fa-fw",
  "fa-classic fa-solid fa-5 fa-fw",
  "fa-brands fa-500px fa-fw",
  "fa-classic fa-solid fa-6 fa-fw",
# /// script
# requires-python = ">=3.12"
# dependencies = [
#     "llm",
#     "textual",
# ]
# ///
from textual import on, work
from textual.app import App, ComposeResult
from textual.widgets import Header, Input, Footer, Markdown

import json
import logging
import re
from typing import Any, AsyncGenerator, Optional, Union

import aiohttp
import openai
from azure.search.documents.aio import SearchClient
from azure.search.documents.models import QueryType
Add it to the body:
<body hx-ext="hx-astro-view-transition">An example
Apologies for the snarky title, but there has been a huge amount of discussion around so-called "Prompt Engineering" these past few months, on all kinds of platforms. Much of it comes from individuals who are peddling an awful lot of "Prompting" and very little "Engineering".
Most of these discussions are little more than users discovering that writing more creative and complicated prompts can help them solve a task that a simpler prompt could not. I claim this is not Prompt Engineering. This is not to say that crafting good prompts is easy, but it does not involve any kind of sophisticated modification to the general "template" of a prompt.
Others, who I think do deserve to call themselves "Prompt Engineers" (and a great deal more than that), have been writing about and utilizing the rich new ecosystem
ChatGPT appeared like an explosion on all my social media timelines in early December 2022. While I keep up with machine learning as an industry, I wasn't focused so much on this particular corner, and all the screenshots seemed like they came out of nowhere. What was this model? How did the chat prompting work? What was the context of OpenAI doing this work and collecting my prompts for training data?
I decided to do a quick investigation. Here's all the information I've found so far. I'm aggregating and synthesizing it as I go, so it's currently changing pretty frequently.
#!/bin/bash
###
### my-script — does one thing well
###
### Usage:
###   my-script <input> <output>
###
### Options:
###   <input>   Input file to read.
###   <output>  Output file to write. Use '-' for stdout.
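A common companion to this `###` header convention is a small function that prints the comment block back out as the script's help text. The `help` function name and the `-h`/`--help` handling below are illustrative assumptions, not part of the original snippet; a minimal sketch:

```shell
#!/bin/bash
###
### my-script — does one thing well
###
### Usage:
###   my-script <input> <output>
###

# Print this script's own '###' comment block as its help text,
# stripping the leading '### ' marker ("$0" is the script itself).
# Function name and option handling are illustrative assumptions.
help() {
    sed -n 's/^### \{0,1\}//p' "$0"
}

# Show usage when run with no arguments, or with -h/--help.
if [ $# -eq 0 ] || [ "$1" = "-h" ] || [ "$1" = "--help" ]; then
    help
    exit 0
fi
```

Because the help text is extracted from the script itself, the usage message can never drift out of sync with the header comments.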