
This Week I Learned - Week 19 2026

- Alphabet, Amazon, Meta, and Microsoft all reported impressive quarterly earnings on Wednesday, surpassing Wall Street expectations thanks to booming demand for AI and cloud services. AI continues to be the main growth engine, with the four companies projected to invest around $650 billion in AI infrastructure this year.
- SpaceX and xAI will provide Anthropic with access to Memphis-based Colossus 1, one of the world’s largest and fastest-deployed AI supercomputers, to provide additional capacity for Claude. The supercomputer is powered by more than 220,000 NVIDIA GPUs.
- Palo Alto Networks, a US-based cybersecurity firm, is set to acquire AI infrastructure startup Portkey to strengthen its AI security stack for enterprise use of autonomous agents, at roughly twice Portkey’s estimated $60–70 million valuation.
- Portkey’s AI gateway sits between applications and large language models, helping companies monitor, manage, and secure AI traffic. Palo Alto Networks plans to fold...

GitHub’s Growing Pains

GitHub has experienced explosive user growth in recent years, accelerating sharply in 2025. The platform crossed 100 million developers in 2023 and reached over 180 million by late 2025, with more than 36 million new developers joining in 2025 alone, roughly one new developer every second. This rapid expansion, amplified by AI-powered tools like GitHub Copilot, has driven significant increases in activity: hundreds of millions of pull requests, nearly 1 billion commits in 2025 (up 25% year over year), and surging repository creation and API usage.

Impact on Availability

This hyper-growth has strained GitHub’s infrastructure, leading to multiple high-impact outages and degraded performance in early-to-mid 2026. GitHub’s own engineering leadership has publicly acknowledged that rapid load growth, architectural coupling between services, and challenges in handling large-scale workloads (including monorepos and AI-driven automation) have caused cascading issues. A ...

How to Optimize Token Usage in GitHub Copilot

GitHub Copilot is moving to usage-based billing on June 1, 2026, where token consumption (input, output, cached) directly affects costs. To optimize tokens, you need to reduce unnecessary context, use caching, and structure prompts efficiently.

How Copilot Uses Tokens

- Input tokens → what you send (code, prompts, context).
- Output tokens → what Copilot generates.
- Cached tokens → reused context (cheaper than fresh input).

Context loading (files, repo, history) often consumes 80–90% of tokens, not the generated code itself.

5 Practical Strategies to Optimize Token Usage

1. Control Context Aggressively

- Avoid opening large or unrelated files while prompting.
- Limit the selection scope before asking Copilot.
- Exclude build, log, and generated files at the enterprise level (e.g., /target/**, *.class, *.xml).

2. Break Tasks into Micro-Operations

❌ Bad: “Refactor entire microservice.”
✅ Better: “Refactor this method to use the reactive pattern.”

Smaller scope = fewer files scanned = fewer tokens...
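To see why trimming context matters more than trimming output, it helps to put rough numbers on a request. The sketch below is a minimal cost model, not Copilot's actual pricing: the per-token rates, the 10x cached-input discount, and the token counts are all hypothetical, chosen only to illustrate how fresh input tokens dominate the bill when the whole repo is pulled into context.

```python
# Hypothetical per-token prices (illustrative only; NOT real Copilot rates).
PRICE_INPUT = 3.00 / 1_000_000    # fresh (uncached) input tokens
PRICE_CACHED = 0.30 / 1_000_000   # cached input tokens, assumed ~10x cheaper
PRICE_OUTPUT = 15.00 / 1_000_000  # generated output tokens

def request_cost(input_tokens: int, cached_tokens: int, output_tokens: int) -> float:
    """Cost of one completion request, splitting input into cached vs fresh."""
    fresh_tokens = input_tokens - cached_tokens
    return (fresh_tokens * PRICE_INPUT
            + cached_tokens * PRICE_CACHED
            + output_tokens * PRICE_OUTPUT)

# Broad prompt: an entire service loaded as context, little of it cached.
broad = request_cost(input_tokens=120_000, cached_tokens=10_000, output_tokens=2_000)

# Scoped prompt: one method plus its callers, mostly cache hits.
scoped = request_cost(input_tokens=8_000, cached_tokens=6_000, output_tokens=2_000)

print(f"broad prompt:  ${broad:.4f}")   # context-heavy request
print(f"scoped prompt: ${scoped:.4f}")  # micro-operation request
```

Note that both requests generate the same 2,000 output tokens; the roughly 10x cost difference comes entirely from how much fresh context is sent, which is exactly what strategies 1 and 2 above attack.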