ABC (@ubunta) 's Twitter Profile
ABC

@ubunta

Data & ML Infrastructure for Healthcare
Opinions are my neighbor's | DhanvantriAI | HotTechStack
📍 🇩🇪 Berlin & 🇮🇳 Kolkata

ID: 69009666

Link: https://www.abhishekchoudhary.net | Joined: 26-08-2009 15:23:47

5.5K Tweets

4.4K Followers

3.3K Following

ABC (@ubunta) 's Twitter Profile Photo

Launching ChatwithDataBase 🚀- A platform that lets you chat directly with your databases and data warehouses. 
📌 Link: chatwithdb.hottechstack.com

I believe chat will become the main way we interact with data.

What it does:
- Works with Snowflake and PostgreSQL databases through
ABC (@ubunta) 's Twitter Profile Photo

Building an MCP Server for multiple databases is complex and presents several technical challenges that need careful solutions.

- Session state storage is crucial, but determining where to store session data securely is a key design decision. The storage location must balance
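
A minimal sketch of the session-state question, assuming a Python MCP server and an in-process store; the SessionStore class, its methods, and the key handling are illustrative rather than the actual ChatwithDataBase design.

```python
# Illustrative only: one way to keep per-session connection details
# encrypted at rest inside the MCP server process.
import json
import time
import uuid
from typing import Optional

from cryptography.fernet import Fernet  # pip install cryptography


class SessionStore:
    """In-memory session store; credentials are encrypted before storage."""

    def __init__(self, ttl_seconds: int = 1800):
        # In production the key would come from a KMS or secrets manager.
        self._fernet = Fernet(Fernet.generate_key())
        self._sessions: dict = {}
        self._ttl = ttl_seconds

    def create(self, connection_info: dict) -> str:
        """Encrypt Postgres/Snowflake connection details, return a session id."""
        session_id = str(uuid.uuid4())
        token = self._fernet.encrypt(json.dumps(connection_info).encode())
        self._sessions[session_id] = (time.time(), token)
        return session_id

    def get(self, session_id: str) -> Optional[dict]:
        """Decrypt and return connection details, or None if missing/expired."""
        entry = self._sessions.get(session_id)
        if entry is None or time.time() - entry[0] > self._ttl:
            self._sessions.pop(session_id, None)
            return None
        return json.loads(self._fernet.decrypt(entry[1]).decode())
```

Swapping the in-process dict for Redis or an encrypted database table is where the balance the tweet mentions shows up: locality and speed versus durability across server restarts.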

ABC (@ubunta) 's Twitter Profile Photo

Over the past few days I’ve spent a lot of time with Gemini 2.5 (both Pro and Flash) and Sonnet 4. Here’s how they stack up:

- Gemini is quick and cranks out hefty codebases, but it loses context faster than Sonnet.
- Gemini Flash is the better choice for Node.js, while

ABC (@ubunta) 's Twitter Profile Photo

So far, 126 members have accessed HOTTECHSTACK ChatwithDatabase. Here’s a summary of the feedback received and the proposed next steps.

Database Performance:
- Snowflake MCP Server significantly outperformed PostgreSQL, likely due to better compatibility with LLM-generated SQL

ABC (@ubunta) 's Twitter Profile Photo

I believe we're entering an era where interacting with data warehouses through natural language will transform how businesses approach analytics. This shift makes data accessible to everyone, not just technical experts.

- Anyone can ask questions like "What drove our sales
ABC (@ubunta) 's Twitter Profile Photo

Many are concerned about LLMs accessing databases, which is why local LLMs are often seen as a safer option. However, most enterprise data warehouses are cloud-based, and organizations are generally comfortable using licensed enterprise solutions. The same approach can be

ABC (@ubunta) 's Twitter Profile Photo

Data engineering is moving toward low-level programming. While many modern Data Tools now integrate with AI, complex data processing requires maximum efficiency and optimized performance, often needing fine-tuned CPU operations or GPU acceleration.

Opinion:
- High-performance

ABC (@ubunta) 's Twitter Profile Photo

While building Chat With Databases, I realized the power of MCP Servers, but also the need to carefully control the entire ecosystem to manage governance and speed.

- Unified session handling for PostgreSQL & Snowflake with encryption
- Schema knowledge base auto-injected into LLM context =
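
A rough sketch of what "schema knowledge base auto-injected into LLM context" could look like in Python; the schema dict shape, prompt wording, and function names are assumptions for illustration, not the actual Chat With Databases code.

```python
# Illustrative only: render a schema snapshot and prepend it to every LLM call.
def format_schema(schema: dict) -> str:
    """Render {table: [(column, type), ...]} as a compact text block."""
    lines = []
    for table, columns in schema.items():
        cols = ", ".join(f"{name} {ctype}" for name, ctype in columns)
        lines.append(f"{table}({cols})")
    return "\n".join(lines)


def build_messages(user_question: str, schema: dict) -> list:
    """Prepend the schema summary as a system message before the user turn."""
    return [
        {
            "role": "system",
            "content": "You write SQL for the schema below. Use only these tables.\n"
            + format_schema(schema),
        },
        {"role": "user", "content": user_question},
    ]


# Toy example with a Postgres-style schema
demo_schema = {"orders": [("id", "bigint"), ("amount", "numeric"), ("created_at", "timestamptz")]}
print(build_messages("Total order amount last month?", demo_schema)[0]["content"])
```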

ABC (@ubunta) 's Twitter Profile Photo

When vibe coding Data Engineering solutions, I resist the urge to immediately jump to my comfort-zone tools. Instead, I let the LLM suggest its approach first, then ask: "Why this tool?" and "How easy will this be to modify later?" If the reasoning feels solid, I lean into the

ABC (@ubunta) 's Twitter Profile Photo

I’m running a fairly complex setup on Hetzner:

✅ Pros
- Rarely any downtime — extremely stable
- Kubernetes-based — scaling apps up/down is seamless
- Hosting 5 live apps, including n8n, Grafana, Supabase, StackGres, Fluent, RustFS, an email server, and large data

ABC (@ubunta) 's Twitter Profile Photo

The open source model from OpenAI appears to underperform on programming tasks, including demo-level work. This seems unexpected given OpenAI's reputation for high-quality models.

ABC (@ubunta) 's Twitter Profile Photo

Even with a paid subscription, I still can’t access GPT-5 here in Germany. Looks like the EU’s extra-strict security rules are holding things up.

ABC (@ubunta) 's Twitter Profile Photo

GPT-5’s price looks dramatically lower—what’s the catch?

If the Programming specs hold up, I’ll swap my coding assistant from Claude to OpenAI.
ABC (@ubunta) 's Twitter Profile Photo

I’ve been diving deep into how GPT-5 handles real SQL code generation, and I’ve now updated ChatWithDatabase with GPT-5-mini.

Highlights from my testing:

- Postgres SQL generation now performs best with GPT-5 — even beating Sonnet 4.
- GPT-5 excels at tool selection: in 100+
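
As context for the tool-selection point: in practice this means function calling, where the model is shown several database tools and has to pick the right one per question. Below is a hedged sketch using the OpenAI Python SDK; the tool names and the "gpt-5-mini" model string are assumptions for illustration, not the actual ChatWithDatabase setup.

```python
# Illustrative only: expose two database tools and see which one the model picks.
from openai import OpenAI  # pip install openai (v1.x SDK)

client = OpenAI()  # reads OPENAI_API_KEY from the environment

tools = [
    {
        "type": "function",
        "function": {
            "name": "run_postgres_query",
            "description": "Execute a read-only SQL query against PostgreSQL.",
            "parameters": {
                "type": "object",
                "properties": {"sql": {"type": "string"}},
                "required": ["sql"],
            },
        },
    },
    {
        "type": "function",
        "function": {
            "name": "describe_table",
            "description": "Return column names and types for a given table.",
            "parameters": {
                "type": "object",
                "properties": {"table": {"type": "string"}},
                "required": ["table"],
            },
        },
    },
]

response = client.chat.completions.create(
    model="gpt-5-mini",  # assumed model identifier
    messages=[{"role": "user", "content": "What columns does the orders table have?"}],
    tools=tools,
)
print(response.choices[0].message.tool_calls)  # which tool did the model choose?
```
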
ABC (@ubunta) 's Twitter Profile Photo

GPT-5 may occasionally generate incorrect queries, but that’s no different from a PhD expert not always writing a perfect query on the first try. Coding isn’t about getting everything right in one go — it’s about iterating to build the final product, and LLMs can excel in that

ABC (@ubunta) 's Twitter Profile Photo

I'm testing a new approach for very large tables (1000s of columns) and huge schemas (1000s of tables) in my Chat with Databases app — aiming to make LLM SQL generation more reliable.

1. Filter the schema before generation so only the most relevant tables/columns reach the LLM, with
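
A minimal sketch of step 1 under stated assumptions: a purely lexical relevance filter that keeps only the top-k tables before anything reaches the LLM. A production version would more likely use embeddings or column-level pruning; the names and scoring below are illustrative.

```python
# Illustrative only: keep the top-k tables whose names/columns overlap the question.
import re


def score(question: str, table: str, columns: list) -> int:
    """Count question words that appear in the table or column names."""
    words = set(re.findall(r"[a-z]+", question.lower()))
    names = " ".join([table] + columns).lower()
    return sum(1 for w in words if w in names)


def filter_schema(question: str, schema: dict, top_k: int = 5) -> dict:
    """Return only the top_k most relevant tables (with all their columns)."""
    ranked = sorted(schema.items(), key=lambda kv: score(question, kv[0], kv[1]), reverse=True)
    return dict(ranked[:top_k])


schema = {
    "sales_orders": ["order_id", "customer_id", "total_amount", "order_date"],
    "hr_employees": ["employee_id", "salary", "hire_date"],
    "web_sessions": ["session_id", "page", "duration"],
}
print(filter_schema("Which customers had the highest total sales last quarter?", schema, top_k=1))
```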

ABC (@ubunta) 's Twitter Profile Photo

After self-hosting n8n.io on Kubernetes and managing 20+ production workflows for HotTechStack, I'm convinced it's the most underrated platform for building AI Data products.

- n8n’s drag-and-drop interface looks simple but is capable of running enterprise-grade operations.

ABC (@ubunta) 's Twitter Profile Photo

Interesting Lessons from building AI Data systems, so far:

- Ship as soon as it works—don’t get stuck refactoring first. When you stumble on a working mix of model behavior and data quirks, commit it immediately. You can always optimize a working system, but you can’t optimize

ABC (@ubunta) 's Twitter Profile Photo

Ran many Text-to-SQL experiments across multiple AI models and database platforms.

- Both Sonnet 4 and GPT-5 delivered remarkably similar, clean code for Snowflake Data Warehouse operations, while Gemini 2.5 Pro consistently generated more sophisticated but sometimes