Claude Code 2.1.27 occasionally drops long-running chats from /resume even though
the per-session .jsonl files still exist. This script rescans a given
~/.claude/projects/<project> directory, re-derives every entry's metadata, and
rewrites sessions-index.json so /resume can list every session again.
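A minimal sketch of that rebuild, assuming the first line of each session's .jsonl carries sessionId and timestamp fields and that the index is a flat JSON array; the schema Claude Code actually expects may differ:

#!/usr/bin/env bash
# Sketch only: the field names (sessionId, timestamp) and the flat-array
# index shape are assumptions about the on-disk format, not confirmed.
set -euo pipefail
DIR="${1:?usage: $0 <path to ~/.claude/projects/<project>>}"
for f in "$DIR"/*.jsonl; do
  head -n 1 "$f"   # assume the first record carries the session metadata
done | jq -s 'map({sessionId, timestamp})' > "$DIR/sessions-index.json"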
# ============================================================================
# WorkSpaces Monitoring Bootstrap — UAC Elevation Wrapper
# ============================================================================
#
# This script solves the UAC split-token problem for AD logon scripts.
#
# THE PROBLEM:
# Even when a WorkSpaces user is in the Domain Admins group, Windows gives
# them a "filtered" (non-elevated) token at logon. Installing services,
# writing to Program Files, and registering SSM agents all need an
# elevated token.
# simplead-user-terraform.tf
#
# Automated SimpleAD User Creation for Amazon WorkSpaces via Terraform
# ====================================================================
#
# PROBLEM:
# AWS SimpleAD has NO API for creating directory users. The aws_directory_service
# resource can create the directory itself, and aws_workspaces_workspace can
# create a WorkSpace -- but there is no Terraform resource or AWS API call to
# create the AD user that sits between them.
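There is no native resource to close that gap, so one common bridge (a sketch under assumptions: the instance ID and user name are placeholders, and driving New-ADUser through SSM Run Command is one workaround, not necessarily this file's method) is to run the user creation on a domain-joined instance:

#!/usr/bin/env bash
# Sketch: create the missing AD user out-of-band via SSM Run Command.
# INSTANCE_ID must be a domain-joined, SSM-managed instance; both values
# below are placeholder assumptions.
set -euo pipefail
INSTANCE_ID="i-0123456789abcdef0"
USER_NAME="jdoe"

aws ssm send-command \
  --instance-ids "$INSTANCE_ID" \
  --document-name "AWS-RunPowerShellScript" \
  --parameters "{\"commands\":[\"New-ADUser -Name '$USER_NAME' -Enabled \$false\"]}"

The user is created disabled here because enabling one requires an -AccountPassword; wire that in through whatever secret handling you already have.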
node {
    echo 'Results included as an inline comment exactly how they are returned as of Jenkins 2.121, with $BUILD_NUMBER = 1'
    echo 'No quotes, pipeline command in single quotes'
    sh 'echo $BUILD_NUMBER' // 1
    echo 'Double quotes are silently dropped'
    sh 'echo "$BUILD_NUMBER"' // 1
    echo 'Even escaped with a single backslash they are dropped'
    sh 'echo \"$BUILD_NUMBER\"' // 1
    echo 'Using two backslashes, the quotes are preserved'
    sh 'echo \\"$BUILD_NUMBER\\"' // "1"
}
#!/bin/bash -xe
## Create a VPC Peering connection between EKS and RDS Postgres
echo """ run this script as:
./eks-rds-peering.sh
+ read -p 'Enter name of EKS Cluster: ' EKS_CLUSTER
Enter name of EKS Cluster: xolv-dev-cluster
+ EKS_VPC=eksctl-xolv-dev-cluster-cluster/VPC
+ EKS_PUBLIC_ROUTING_TABLE=eksctl-xolv-dev-cluster-cluster/PublicRouteTable
+ read -p 'Enter name of RDS: ' RDS_DB_NAME
Enter name of RDS: sfstackuat
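The paste cuts off mid-transcript; a sketch of the steps the script presumably goes on to perform (the tag-based VPC lookups, and the RDS VPC being tagged with the DB name, are assumptions):

# Resolve both VPC IDs from their Name tags (assumed tagging scheme).
EKS_VPC_ID=$(aws ec2 describe-vpcs \
  --filters "Name=tag:Name,Values=$EKS_VPC" \
  --query 'Vpcs[0].VpcId' --output text)
RDS_VPC_ID=$(aws ec2 describe-vpcs \
  --filters "Name=tag:Name,Values=$RDS_DB_NAME" \
  --query 'Vpcs[0].VpcId' --output text)

# Create the peering connection and accept it (same account, same region).
PEERING_ID=$(aws ec2 create-vpc-peering-connection \
  --vpc-id "$EKS_VPC_ID" --peer-vpc-id "$RDS_VPC_ID" \
  --query 'VpcPeeringConnection.VpcPeeringConnectionId' --output text)
aws ec2 accept-vpc-peering-connection --vpc-peering-connection-id "$PEERING_ID"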
try sqlite3 for csv wrangling! it's a powerhouse for easily turning your csv files into a queryable in-memory database, and it's trivial to turn that into json, and to pipe a sql query's output into jq or into a .db file, whatever you want. here are some aliases I wrote, put them in your dotfiles somewhere:
alias csvq="sqlite3 :memory: -cmd '.mode csv' -cmd '.import /dev/stdin s3' '.mode json'"
You can see that I generically name the table "s3", cuz it's an alias I use with any csv file I'm streaming from an s3 bucket, so the table name doesn't need to be too specific. I use it like this:
aws s3 cp s3://$bucket/$key - | csvq "select * from s3" | jq '.[]' -c
"select * from table" means "give me a bunch of rows", so jq '.[]' -c turns those rows into pretty-printed compressed jsonlines. I also have an alias tsvq for tsv files. It's the same thing pretty much, except with .mode tabs instead of .mode csv
alias tsvq="sqlite3 :memory: -cmd '.mode tabs' -cmd '.import /dev/stdin s3' '.mode json'"
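And when you want the data to stick around, point sqlite3 at a file instead of :memory: (bucket/key here are placeholders):

aws s3 cp s3://$bucket/$key - | sqlite3 mydata.db -cmd '.mode csv' '.import /dev/stdin s3'
sqlite3 mydata.db 'select count(*) from s3'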
#!/bin/bash
set -eu
NAMESPACE=$1
PVCNAME=$2
TARGETZONE=$3
DEPLOYMENTOBJ=$4
# Resolve the PV backing the PVC, then pull the EBS volume ID out of its
# awsElasticBlockStore source (aws://<zone>/<vol-id> -> field 4).
PVNAME=$(oc -n "$NAMESPACE" get pvc "$PVCNAME" --template='{{.spec.volumeName}}')
VOLUMEID=$(oc -n "$NAMESPACE" get pv "$PVNAME" --template='{{.spec.awsElasticBlockStore.volumeID}}' | cut -d/ -f4)
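A usage sketch (the script name and arguments are placeholders; the rest of the script, which presumably snapshots the volume and recreates it in the target zone, is truncated here):

./move-pvc.sh my-namespace data-pvc us-east-1b deployment/my-app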
#!/usr/bin/env bash
# `update-function-configuration` overwrites the existing set of envars.
# To *ADD* variables we need to read the existing envars and merge the new
# ones into them. This snippet uses `jq` to transform a KEY=VALUE .env file
# into the JSON map the Lambda API expects.
# Create the updated envar set from the .env file:
FUNCTION_NAME=trip-analytics-prod-app
UPDATED=.env.prod
UPDATE=$(jq --raw-input --slurp 'split("\n")[:-1] | map(split("=") as [$key, $value] | {$key, $value}) | from_entries' "$UPDATED")
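The snippet stops before the merge and the API call; a sketch of the remaining steps (new values winning over existing keys is an assumed merge direction):

# Fetch the envars currently set on the function.
EXISTING=$(aws lambda get-function-configuration \
  --function-name "$FUNCTION_NAME" \
  --query 'Environment.Variables' --output json)

# Merge: keys from the .env file override existing keys.
MERGED=$(jq -n --argjson a "$EXISTING" --argjson b "$UPDATE" '$a + $b')

# Push the merged set back to the function.
aws lambda update-function-configuration \
  --function-name "$FUNCTION_NAME" \
  --environment "{\"Variables\": $MERGED}"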