Mohsan (@mozabbas)'s Twitter Profile
Mohsan

@mozabbas

Software engineer, designing data systems that turn real-time streams into actionable insights. Passionate about transforming raw data into impactful analytics.

ID: 2899408428

Joined: 14-11-2014 14:19:59

1.1K Tweets

193 Followers

599 Following

Programming Wisdom (@codewisdom)'s Twitter Profile Photo

"The mark of a mature programmer is willingness to throw out code you spent time on when you realize it's pointless." - Bram Cohen

khaled anam خالد انعم (@khaledanam1)'s Twitter Profile Photo

Congratulations to Admiral General, Supreme Commander, Prime Minister, Chief Justice, President, Chief Economist, Election Commissioner, Chief Chancellor, Executive Producer, Patron in Chief, National Chairman and Spiritual Guide of the Galaxy... His Excellency, Mr

Dipankar Mazumdar (@dipankartnt)'s Twitter Profile Photo

Production issues in a Data Lakehouse 🧵

Choosing an open table format like Apache Hudi, Iceberg, or Delta Lake is just the beginning.

Once data lands in object stores like S3/GCS, real-world problems surface. Let’s walk through some common ones (and how to handle them) 👇
Stanislav Kozlovski (@bdkozlovski)'s Twitter Profile Photo

The largest Kafka you've never heard about: Datadog 🐶

• hundreds of TRILLIONS of messages a day
• terabytes a second of traffic - at least 10x more than Uber's 🔥
• >10,000 brokers in 600 Kafka clusters
• >1,000,000 partitions over thousands of topics - e.g. 7000 topics
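Back-of-the-envelope on that scale: a per-day message count converts to a per-second rate like so (a sketch — the 200-trillion input is an illustrative stand-in for "hundreds of trillions", not a figure from the thread):

```go
package main

import "fmt"

// msgsPerSecond converts a per-day message count to a per-second rate.
func msgsPerSecond(perDay float64) float64 {
	return perDay / (24 * 60 * 60) // 86,400 seconds in a day
}

func main() {
	// 200 trillion/day is an illustrative stand-in for "hundreds of trillions".
	fmt.Printf("~%.2e messages/second\n", msgsPerSecond(200e12))
}
```

At 200 trillion a day that works out to roughly 2.3 billion messages per second across the fleet.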
Ivan Velichko (@iximiuz)'s Twitter Profile Photo

How Container Networking Works 🧐

Most Docker installations and Kubernetes clusters have the same bridge container network setup. The best way to understand how it works? Try reproducing one from scratch using nothing but the standard Linux commands: labs.iximiuz.com/tutorials/cont…
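A quick way to spot that bridge setup on your own machine is to list the network interfaces and look for bridge-style names (a hedged sketch: the `docker`/`br-`/`cni` name prefixes are common defaults for Docker and Kubernetes CNI bridges, not guarantees):

```go
package main

import (
	"fmt"
	"net"
	"strings"
)

// bridgeLike reports whether an interface name looks like a typical
// Docker/Kubernetes container bridge. The prefixes are common defaults,
// not a complete list.
func bridgeLike(name string) bool {
	return strings.HasPrefix(name, "docker") ||
		strings.HasPrefix(name, "br-") ||
		strings.HasPrefix(name, "cni")
}

func main() {
	ifaces, err := net.Interfaces()
	if err != nil {
		fmt.Println("could not list interfaces:", err)
		return
	}
	for _, ifc := range ifaces {
		tag := ""
		if bridgeLike(ifc.Name) {
			tag = "  <- likely a container bridge"
		}
		fmt.Printf("%s (MTU %d)%s\n", ifc.Name, ifc.MTU, tag)
	}
}
```

On a host running Docker you would typically see `docker0` flagged; the tutorial linked above rebuilds that bridge by hand with `ip link`/`ip netns`.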
GopherCon (@gophercon)'s Twitter Profile Photo

🌾 Data everywhere? Time to build pipelines that can handle the load.

In her talk, Mindy will share how her team processes complex data flows from labs, fields, and applications—using Go’s parametric polymorphism to build fast, flexible systems.

You’ll walk away with insights
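"Parametric polymorphism" here means Go generics. A minimal sketch of a generic pipeline stage (the `Stage`/`Map` names and the sensor-data example are illustrative, not from the talk):

```go
package main

import "fmt"

// Stage is a generic pipeline step transforming values of type In to type Out.
type Stage[In, Out any] func(In) Out

// Map applies a stage to every element of a slice — one small building
// block for type-safe data pipelines without interface{} casts.
func Map[In, Out any](xs []In, f Stage[In, Out]) []Out {
	out := make([]Out, 0, len(xs))
	for _, x := range xs {
		out = append(out, f(x))
	}
	return out
}

func main() {
	readings := []float64{20.5, 21.0, 19.8} // e.g. field sensor data
	labels := Map(readings, func(r float64) string {
		return fmt.Sprintf("%.1f°C", r)
	})
	fmt.Println(labels)
}
```

Because `Map` is parameterized over both element types, the same stage code is reused across lab, field, and application data without losing compile-time type checking.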
Stanislav Kozlovski (@bdkozlovski)'s Twitter Profile Photo

Fundamental Kafka Consumer concepts everybody should know because they’re unlikely to change soon (at all)

💡 Consumers are literally just libraries that read data from Kafka via the KafkaConsumer class. A microservice (for example) imports the library and hooks it up alongside
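The consumer-as-a-library idea can be sketched with an in-memory stand-in — this is a toy model of the poll loop and offset tracking, not the real KafkaConsumer API:

```go
package main

import "fmt"

// Record mimics the offset/value shape of a Kafka record.
type Record struct {
	Offset int
	Value  string
}

// Consumer is a toy stand-in for a Kafka consumer client: a service
// embeds it as a library and repeatedly polls for new records.
type Consumer struct {
	log    []Record // stands in for a topic partition
	offset int      // position of the next record to read
}

// Poll returns up to max unread records and advances the consumer's
// position — loosely analogous to poll() plus an offset commit.
func (c *Consumer) Poll(max int) []Record {
	end := c.offset + max
	if end > len(c.log) {
		end = len(c.log)
	}
	batch := c.log[c.offset:end]
	c.offset = end
	return batch
}

func main() {
	c := &Consumer{log: []Record{{0, "a"}, {1, "b"}, {2, "c"}}}
	for {
		batch := c.Poll(2)
		if len(batch) == 0 {
			break
		}
		for _, r := range batch {
			fmt.Printf("offset=%d value=%s\n", r.Offset, r.Value)
		}
	}
}
```

The point of the sketch: there is no server-side "consumer" entity — the broker just serves reads, and all of the loop/position logic lives in the client library inside your process.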
TechOps Examples (@techopsexamples)'s Twitter Profile Photo

Cloud Disaster Recovery Strategies 👇

Any DR strategy starts with finalizing:

𝟭. 𝗥𝗧𝗢 (𝗥𝗲𝗰𝗼𝘃𝗲𝗿𝘆 𝗧𝗶𝗺𝗲 𝗢𝗯𝗷𝗲𝗰𝘁𝗶𝘃𝗲):
How much downtime can one accept?

𝟮. 𝗥𝗣𝗢 (𝗥𝗲𝗰𝗼𝘃𝗲𝗿𝘆 𝗣𝗼𝗶𝗻𝘁 𝗢𝗯𝗷𝗲𝗰𝘁𝗶𝘃𝗲):
How much data loss can one accept?
Phuong Le (@func25)'s Twitter Profile Photo

Container-aware GOMAXPROCS is enabled by default in Go 1.25 on Linux systems.

If you don’t know what the problem with GOMAXPROCS is, read this: victoriametrics.com/blog/kubernete…

Here is something you need to know about it:

1. The feature only activates when the Go program is running
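You can check the effective value yourself with `runtime.GOMAXPROCS(0)`, which reads the setting without modifying it:

```go
package main

import (
	"fmt"
	"runtime"
)

// maxProcs reads the current GOMAXPROCS value; an argument < 1 means
// "report only, don't change".
func maxProcs() int {
	return runtime.GOMAXPROCS(0)
}

func main() {
	// On Go 1.25+ on Linux this reflects the container's CPU quota by
	// default; on older versions it is simply the host's logical CPU
	// count — which is the over-scheduling problem the linked post covers.
	fmt.Printf("GOMAXPROCS=%d NumCPU=%d\n", maxProcs(), runtime.NumCPU())
}
```

Run this inside a CPU-limited container on both Go versions to see the difference.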
Stanislav Kozlovski (@bdkozlovski)'s Twitter Profile Photo

Most people either don't realize this (or have forgotten):

Kafka was created to solve a data integration problem.

Schemas were of prime importance.
🧵
Dr Milan Milanović (@milan_milanovic)'s Twitter Profile Photo

𝗪𝗵𝗮𝘁 𝗶𝘀 𝗘𝗧𝗟?

Extract, Transform, Load (ETL) is a data integration process that involves:

𝟭. 𝗘𝘅𝘁𝗿𝗮𝗰𝘁

This step involves extracting data from various heterogeneous sources. These sources include databases, flat files, APIs, or other data storage mechanisms.

𝟮.
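The extract → transform → load steps can be sketched end to end (a toy pipeline: the CSV string stands in for a database, flat file, or API source, and printing stands in for the real target store):

```go
package main

import (
	"fmt"
	"strconv"
	"strings"
)

// extract pulls raw rows from the source — here, lines of a CSV string.
func extract(csv string) []string {
	return strings.Split(strings.TrimSpace(csv), "\n")
}

// transform cleans and reshapes each row: normalize the key's case,
// parse the amount, and aggregate duplicates.
func transform(rows []string) map[string]int {
	out := map[string]int{}
	for _, row := range rows {
		parts := strings.Split(row, ",")
		n, _ := strconv.Atoi(strings.TrimSpace(parts[1]))
		out[strings.ToLower(strings.TrimSpace(parts[0]))] += n
	}
	return out
}

// load writes the transformed result to the target — printed for brevity.
func load(data map[string]int) {
	for k, v := range data {
		fmt.Printf("%s=%d\n", k, v)
	}
}

func main() {
	load(transform(extract("Alice, 3\nBob, 5\nalice, 2")))
}
```

Note how the heterogeneity lives entirely in `extract`: swapping the source for a database query or API call leaves `transform` and `load` unchanged.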