Niko (nikooo777) 🇨🇭 Swiss Made
Graduated as an IT engineer (software developer) in Switzerland.
Programming is my way of doing "magic" in real life.
Resolving MySQL Server Crash During Large Table Optimization
When optimizing a large MySQL table (e.g., ~300 GB), the database server might hang during the final step, when the old table is swapped with the optimized copy. If the hang persists beyond 600 seconds, InnoDB's semaphore-wait watchdog intentionally crashes the server to prevent prolonged instability.
Symptoms
Typical log entries indicating the issue:
---TRANSACTION 60682931334, ACTIVE 11095 sec dropping table
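One workaround is to raise the watchdog timeout for the duration of the rebuild. Stock MySQL hardcodes the 600-second limit, but Percona Server exposes it as the `innodb_fatal_semaphore_wait_threshold` variable. A sketch, assuming Percona Server and a hypothetical `mydb.big_table`:

```shell
# Raise the fatal semaphore wait threshold (Percona Server only;
# stock MySQL hardcodes this at 600 seconds) before the rebuild:
mysql -e "SET GLOBAL innodb_fatal_semaphore_wait_threshold = 7200;"

# Rebuild the table and reclaim space:
mysql -e "OPTIMIZE TABLE mydb.big_table;"

# Restore the default afterwards:
mysql -e "SET GLOBAL innodb_fatal_semaphore_wait_threshold = 600;"
```

This doesn't make the final table swap faster; it only stops the watchdog from killing the server while the swap drains.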
Efficiently Exporting and Importing Partial Tables from Percona MySQL 8 for Local Development
This tutorial explains how to efficiently export selected large tables from a production Percona MySQL 8 database to a local development environment. This approach avoids the significant performance penalty associated with logical backups (mysqldump) when working with large tables (multiple GBs each).
Prerequisites:
Your MySQL server (Percona MySQL 8 recommended) must have the innodb_file_per_table setting enabled. It is enabled by default in recent installations.
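With `innodb_file_per_table` enabled, each table lives in its own `.ibd` file, which lets you move individual tables with MySQL's transportable tablespaces instead of dumping them. A sketch of the procedure, assuming a hypothetical table `mydb.big_table` and default datadir paths:

```shell
# On the production server: quiesce the table and write its .cfg metadata
mysql -e "FLUSH TABLES mydb.big_table FOR EXPORT;"

# Copy the tablespace files while the export lock is held
scp /var/lib/mysql/mydb/big_table.{ibd,cfg} dev-host:/tmp/

mysql -e "UNLOCK TABLES;"

# On the local server: recreate the table with the exact same definition
# (e.g. from SHOW CREATE TABLE), then detach its empty tablespace
mysql -e "ALTER TABLE mydb.big_table DISCARD TABLESPACE;"

# Drop the copied files into place and fix ownership
sudo mv /tmp/big_table.{ibd,cfg} /var/lib/mysql/mydb/
sudo chown mysql:mysql /var/lib/mysql/mydb/big_table.*

# Attach the production data
mysql -e "ALTER TABLE mydb.big_table IMPORT TABLESPACE;"
```

Since this copies the data files directly, it skips the row-by-row serialization that makes mysqldump so slow on multi-GB tables.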
The documentation is very lacking (probably because this tool is new?), and it took me some extra time to figure out where to get the tokens and how to configure the importer.
gh actions-importer configure
When you run gh actions-importer configure, you need to provide personal access tokens for both GitHub and the platform you're migrating from.
In my case that's Travis CI, which is what I'll cover here; however, the steps to acquire the personal access token for GitHub are the same regardless of the source platform.
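For reference, the overall flow looks roughly like this (the repository name is a placeholder):

```shell
# Install the importer as a gh CLI extension, if not done already
gh extension install github/gh-actions-importer

# Interactive prompt: asks for the GitHub PAT, the Travis CI token,
# and the base URLs of both services
gh actions-importer configure

# Dry-run a single repository to verify both tokens work before migrating
gh actions-importer dry-run travis-ci \
  --travis-ci-repository my-org/my-repo \
  --output-dir tmp/dry-run
```

The dry-run writes the converted workflow files to the output directory without opening any pull requests, which makes it a cheap way to validate the configuration.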
S3 cache layer using MinIO Gateway and docker-compose
Intro
Whether you're looking to save money on egress charges or simply want a local cache of a subset of your S3 bucket, all you need is this docker-compose configuration to proxy all your read requests and cache files locally.
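A minimal sketch of such a configuration, assuming an older `minio/minio` image (gateway mode was removed from recent MinIO releases, so you need a tag from before its deprecation) and placeholder AWS credentials:

```yaml
version: "3"
services:
  s3-cache:
    # Pin a release that still ships gateway mode
    image: minio/minio
    command: gateway s3 https://s3.amazonaws.com
    environment:
      # Forwarded to S3 and also used by clients talking to the gateway
      MINIO_ACCESS_KEY: <your-aws-access-key>
      MINIO_SECRET_KEY: <your-aws-secret-key>
      MINIO_CACHE: "on"
      MINIO_CACHE_DRIVES: /cache
      MINIO_CACHE_QUOTA: "80"   # use at most 80% of the cache volume
      MINIO_CACHE_AFTER: "1"    # cache an object after its first GET
    volumes:
      - ./cache:/cache
    ports:
      - "9000:9000"
```

Point your S3 clients at `http://localhost:9000`; reads are served from the local cache when possible and fetched from the upstream bucket otherwise.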
Given the large data breach uncovered by HIBP (https://www.troyhunt.com/the-773-million-record-collection-1-data-reach/), in which I was listed, I wrote a script to check all my passwords.
Here are the instructions if you want to check all your passwords against their database in a safe way:
export all your chrome passwords (or from whatever service you use)
put all the passwords in a file, one password per line and nothing else
generate a SHA-1 hash for each line: while IFS= read -r i; do echo "$(printf '%s' "$i" | sha1sum | awk '{print $1}') --- $i"; done < sortedpasswords.txt > hashtable.txt (a while-read loop avoids the word splitting and globbing that break a for-loop over cat when passwords contain spaces or special characters)
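To stay safe when actually checking the hashes, you don't need to send them anywhere in full: HIBP's k-anonymity "range" API accepts only the first five hex characters of a SHA-1 hash and returns every known suffix in that range. A sketch (the function name is my own):

```shell
# Check a single password against HIBP without revealing it: only the
# first 5 hex chars of its SHA-1 hash ever leave your machine.
check_pwned() {
  local hash prefix suffix
  hash=$(printf '%s' "$1" | sha1sum | awk '{print $1}' | tr '[:lower:]' '[:upper:]')
  prefix=${hash:0:5}
  suffix=${hash:5}
  # The response lists SUFFIX:COUNT pairs; a match means the password is known
  curl -s "https://api.pwnedpasswords.com/range/$prefix" | grep -qi "^$suffix:"
}

# Usage: check every password from the file built above
# while IFS= read -r p; do check_pwned "$p" && echo "PWNED: $p"; done < sortedpasswords.txt
```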
While upgrading the infrastructure at LBRY, the company I work for, I needed to change certbot renewals from HTTP auth to DNS auth.
A quick Google search didn't turn up any results for my simple question "How to convert a Certbot certificate configuration based on HTTP authentication to DNS-based authentication" (hint: that's too long to be used as a search key!), so I thought I'd write up this simple guide.
This article will walk you through the simple steps of porting your old certificates from HTTP authentication to DNS authentication.
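The core trick is to re-issue the certificate under its existing name with a DNS authenticator, so the renewal config and the paths your webserver already points at are reused. A sketch, using an example domain and the Cloudflare plugin as just one of several DNS plugins:

```shell
# Re-issue under the same --cert-name so /etc/letsencrypt/live/example.com/
# and the existing renewal config are reused rather than duplicated:
sudo certbot certonly \
  --cert-name example.com \
  -d example.com -d www.example.com \
  --dns-cloudflare \
  --dns-cloudflare-credentials /etc/letsencrypt/cloudflare.ini

# Verify the renewal config now references the DNS authenticator:
sudo grep authenticator /etc/letsencrypt/renewal/example.com.conf
```

From then on, `certbot renew` will answer challenges via DNS records instead of HTTP, so renewals work even for hosts not reachable on port 80.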