| description | arguments |
|---|---|
| Comprehensive PR code review inspired by GitHub Copilot - analyzes code quality, security, performance, and best practices | |
---
Here is a detailed prompt you can use with an AI coding assistant (like Cursor, GitHub Copilot, or ChatGPT) to generate your full project scaffolding.

It is designed to force the AI to separate your business logic (the pipeline) from your API logic (FastAPI), which is critical for testing.

**The Prompt to Copy & Paste**

> I am refactoring a Python data pipeline script into a production-ready FastAPI service. I have already installed fastapi, pydantic, instructor, and openai.
>
> The Goal:
> Create a strictly typed, async API service that processes messages. The pipeline flow is: Input (SQS-style payload) -> Pre-processing -> Azure OpenAI Analysis (via Instructor) -> Database Write (mocked for now).
>
> Please generate the code for the following project structure:
>
> * app/core/config.py:
>   * Use pydantic-settings to manage environment variables (Azure Endpoint, API Key, Deployment Name).
> * app/models/schemas.py:
---
Yes, absolutely—you can get full tracing, spans, and rich data in Datadog even if everything currently just logs → CloudWatch → Datadog.

But there’s an important distinction:

* What you have now:
  * Logs from Lambda and Step Functions go to CloudWatch, then to Datadog Logs.
  * This is logging only (unless you embed trace IDs yourself).
* What you want:
  * APM tracing + spans (end-to-end per transcript / per Lambda execution).
  * Log ↔ trace correlation, service maps, latency breakdowns, etc. (see the handler sketch after this list).
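To show the shape of it, here is a minimal TypeScript Lambda handler sketch. It assumes the `datadog-lambda-js` and `dd-trace` packages are available (for example via the Datadog Lambda layer) and that tracing is enabled through the usual `DD_*` environment variables; the event shape and span/resource names are made up for illustration, not your real pipeline:

```ts
// Sketch only: assumes datadog-lambda-js + dd-trace are installed (e.g. via the
// Datadog Lambda layer) and DD_TRACE_ENABLED / DD_LOGS_INJECTION are configured.
// The event shape and span names are illustrative.
import { datadog } from "datadog-lambda-js";
import tracer from "dd-trace";

interface TranscriptEvent {
  transcriptId: string;
}

async function gradeTranscript(event: TranscriptEvent): Promise<{ graded: boolean }> {
  // Custom child span so this step shows up as its own block in the flame graph.
  return tracer.trace("transcript.grade", { resource: event.transcriptId }, async () => {
    console.log(JSON.stringify({ message: "grading transcript", transcriptId: event.transcriptId }));
    // ... existing business logic ...
    return { graded: true };
  });
}

// Wrapping the handler creates the root span per invocation and correlates
// any logs emitted inside it with that trace.
export const handler = datadog(async (event: TranscriptEvent) => gradeTranscript(event));
```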
---
Short version: use TanStack Start for the full‑stack shell + server functions, TanStack Query for talking to your existing DB APIs, and TanStack DB as your in‑browser “mini database” that keeps everything normalized, reactive, and crazy fast.

I’ll break it down into (with a small Query-layer sketch after the outline):

1. Mental model: what each piece should do
2. Suggested architecture / folder structure
3. How to wire TanStack Start ⇄ DB (with stored procedures)
4. How to wire TanStack Query + TanStack DB
5. How this helps for your call transcript grading domain
6. Specific tricks to make the app feel “instant”
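Before the details, here is a small sketch of the Query layer only: a shared `queryOptions` definition against a hypothetical `/api/transcripts` endpoint. The endpoint path, field names, and `staleTime` are placeholder assumptions, not your real API:

```ts
// Minimal sketch of the TanStack Query side; endpoint and types are assumptions.
import { queryOptions, useQuery } from "@tanstack/react-query";

type Transcript = { id: string; agent: string; score: number | null };

// Centralizing the query definition lets route loaders, components, and (later)
// TanStack DB collections reuse the same key + fetcher.
export const transcriptsQuery = queryOptions({
  queryKey: ["transcripts"],
  queryFn: async (): Promise<Transcript[]> => {
    const res = await fetch("/api/transcripts"); // hypothetical existing DB API
    if (!res.ok) throw new Error(`Failed to load transcripts: ${res.status}`);
    return res.json();
  },
  staleTime: 30_000, // keeps re-navigation instant instead of refetching every time
});

export function useTranscripts() {
  return useQuery(transcriptsQuery);
}
```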
---
Great—let’s switch to moment and update the rules exactly as you described.

**Updated Rules (with moment)**

* If neither startDate nor endDate is provided → the window is the last N days ending today (inclusive), where N = defaultDays (default 30).
* If only startDate is provided → endDate defaults to today (inclusive).
* If only endDate is provided → startDate falls back to endDate - (defaultDays - 1) (inclusive window), with defaultDays coming from the factory options (default 30).
* If both are provided → validate and normalize.
* Normalized output is UTC YYYY-MM-DD (date-only).
* All invalid inputs throw BadRequestException → maps to 400, never 500 (see the sketch after this list).
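A compact sketch of those rules follows. The function and helper names are illustrative; it assumes NestJS’s `BadRequestException` and a `defaultDays` option passed in from the factory:

```ts
// Sketch of the rules above; names are illustrative, not the real factory API.
import moment from "moment";
import { BadRequestException } from "@nestjs/common";

const FORMAT = "YYYY-MM-DD";

export function resolveDateWindow(
  startDate?: string,
  endDate?: string,
  defaultDays = 30,
): { startDate: string; endDate: string } {
  const parse = (value: string, name: string) => {
    const m = moment.utc(value, FORMAT, true); // strict UTC parsing
    if (!m.isValid()) {
      throw new BadRequestException(`${name} must be a valid ${FORMAT} date`);
    }
    return m;
  };

  // Rule order mirrors the list above: missing end -> today, missing start ->
  // end - (defaultDays - 1), giving an inclusive N-day window.
  const end = endDate ? parse(endDate, "endDate") : moment.utc().startOf("day");
  const start = startDate
    ? parse(startDate, "startDate")
    : end.clone().subtract(defaultDays - 1, "days");

  if (start.isAfter(end)) {
    throw new BadRequestException("startDate must be on or before endDate");
  }

  return { startDate: start.format(FORMAT), endDate: end.format(FORMAT) };
}
```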
---
Absolutely—there’s a clean, idiomatic way to (a) **re‑use start/end date validation across many DTOs** and (b) **run unit + e2e in one Jest invocation with a single coverage report**.

---

## A. Re‑usable date‑range validation for query DTOs

**Goal:** many endpoints accept `start` / `end` query params; we want:

* parse strings → `Date`
* validate each value (a shared DTO sketch follows this list)
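One possible shape for the shared piece, using `class-transformer` + `class-validator`. It assumes a global `ValidationPipe({ transform: true })` so query strings actually get converted; class and property names beyond `start`/`end` are illustrative:

```ts
// Shared query DTO sketch; assumes ValidationPipe({ transform: true }) globally.
import { Type } from "class-transformer";
import { IsDate, IsOptional } from "class-validator";

export class DateRangeQueryDto {
  // "2024-01-31" (or any parseable date string) -> Date, then validated.
  @IsOptional()
  @Type(() => Date)
  @IsDate()
  start?: Date;

  @IsOptional()
  @Type(() => Date)
  @IsDate()
  end?: Date;
}

// Endpoint-specific DTOs just extend it and add their own fields.
export class ListCallsQueryDto extends DateRangeQueryDto {
  @IsOptional()
  agentId?: string; // hypothetical extra filter
}
```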
---
Below is a **repo‑level structure** that makes GitHub Copilot automatically pick up the right guidance for each part of your monorepo (APIs with NestJS and UIs with Angular), using only what GitHub documents and supports today.

---

## What to create (folder & file layout)

```
your-monorepo/
├─ APIs/
│  └─ ... (NestJS apps & libs)
```
---
RFC: Implementing Circuit Breaker Functionality in Backstage Backend Using Opossum

Metadata

* RFC ID: [Internal - Circuit Breaker Integration]
* Authors: Grok (based on user query)
* Status: Draft Proposal
* Date: September 29, 2025
* Version: 1.0

Abstract

This RFC proposes the integration of circuit breaker patterns into the Backstage backend to enhance resilience against failures in API endpoints. We evaluate two options: a global circuit breaker applied to the entire backend and a per-plugin circuit breaker that can be optionally added to individual plugins. We recommend the per-plugin approach for better isolation and extensibility, avoiding modifications to core Backstage components. As an example, we demonstrate how to apply this to the Catalog API plugin.
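To make the recommendation concrete, here is a minimal sketch of what a per-plugin breaker could look like around an outbound Catalog API call, using Opossum directly. The endpoint URL, thresholds, and fallback are placeholders, and none of this touches Backstage core:

```ts
// Illustrative only: wraps an outbound Catalog API call in an opossum breaker.
// Assumes Node 18+ (global fetch); URL and thresholds are placeholders.
import CircuitBreaker from "opossum";

async function fetchCatalogEntities(baseUrl: string): Promise<unknown> {
  const res = await fetch(`${baseUrl}/api/catalog/entities`);
  if (!res.ok) throw new Error(`Catalog responded with ${res.status}`);
  return res.json();
}

const catalogBreaker = new CircuitBreaker(fetchCatalogEntities, {
  timeout: 3000,                 // treat slow calls as failures
  errorThresholdPercentage: 50,  // open once 50% of recent calls fail
  resetTimeout: 30_000,          // try a half-open probe after 30s
});

// Degraded response while the circuit is open or the call fails.
catalogBreaker.fallback(() => ({ items: [], degraded: true }));
catalogBreaker.on("open", () => console.warn("catalog breaker opened"));

// Usage: await catalogBreaker.fire("http://localhost:7007");
```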
Introduction
---
networks:
  app_network:
    driver: bridge

services:
  db:
    image: docker.io/postgres:latest
    ports:
      - "5433:5432"
    command: ['postgres', '-c', 'log_statement=all']
---
Plan for Implementing Circuit Breaker Functionality in Backstage Backend Using Opossum

Overview

The goal is to add circuit breaker protection to each backend plugin in Backstage using the Opossum library (assuming “opaussum” is a typo for “opossum”). This ensures that if a plugin’s API endpoints experience repeated failures (e.g., internal errors producing 500+ status codes), the circuit opens and subsequent requests to that plugin are rejected with a 503 (Service Unavailable) response to prevent cascading failures. When the circuit half-opens and a probe request succeeds, it closes again.

Key requirements:

* Per-plugin circuits: Each backend plugin operates on its own independent circuit breaker instance.
* Automatic configuration: The implementation uses Backstage’s service override mechanism, so no changes are needed in individual plugins. New plugins added via backend.add(...) automatically inherit the circuit breaker without any additional code (a middleware sketch follows this list).
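The sketch below shows only the middleware core; the actual wiring into Backstage’s service overrides (so every plugin router gets it automatically) is omitted, and the plugin-ID plumbing, thresholds, and error shape are assumptions:

```ts
// Per-plugin breaker registry + Express middleware sketch (Backstage wiring omitted).
import { RequestHandler } from "express";
import CircuitBreaker from "opossum";

const breakers = new Map<string, CircuitBreaker>();

function breakerFor(pluginId: string): CircuitBreaker {
  let breaker = breakers.get(pluginId);
  if (!breaker) {
    // The wrapped action just classifies a finished response: 5xx rejects
    // (counted as a failure), anything else resolves (counted as a success).
    breaker = new CircuitBreaker(
      async (statusCode: number) => {
        if (statusCode >= 500) throw new Error(`plugin returned ${statusCode}`);
      },
      { errorThresholdPercentage: 50, resetTimeout: 30_000 },
    );
    breakers.set(pluginId, breaker);
  }
  return breaker;
}

export function pluginCircuitBreaker(pluginId: string): RequestHandler {
  return (req, res, next) => {
    const breaker = breakerFor(pluginId);
    if (breaker.opened) {
      // Fail fast while the circuit is open.
      res.status(503).json({ error: `${pluginId} is temporarily unavailable` });
      return;
    }
    // Once the plugin has answered, feed the outcome back into the breaker.
    res.on("finish", () => {
      breaker.fire(res.statusCode).catch(() => {
        /* rejection only drives breaker state */
      });
    });
    next();
  };
}
```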
This approach treats the circuit breaker as a server-side mech