Otina Brian (@justotina)'s Twitter Profile
Otina Brian

@justotina

Data Engineer cooking

ID: 1264583350729289729

Link: https://github.com/otinabrayo
Joined: 24-05-2020 15:45:57

1.1K Tweets

815 Followers

828 Following

Otina Brian (@justotina):

Learning Never stops 🔂
I took a deep dive into Data Engineering fundamentals.
I explored Azure Cloud, covering Data Lakes, ETL pipelines, dimensional modeling and big data processing with Spark.
I'm excited to see how these technologies come together to build a scalable pipeline 🌍
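
The pattern described above can be sketched briefly. Below is a minimal, hypothetical PySpark example of reading raw data from an Azure Data Lake and shaping it into a simple star schema (one dimension, one fact table); the storage account, container names, and columns are assumptions for illustration, not the author's actual pipeline.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("dim_model_sketch").getOrCreate()

# Hypothetical ADLS Gen2 paths; the account and container names are made up.
raw = spark.read.parquet("abfss://raw@myaccount.dfs.core.windows.net/sales/")

# Dimension: distinct order dates with a surrogate integer key.
dim_date = (
    raw.select(F.to_date("order_ts").alias("order_date"))
       .distinct()
       .withColumn("date_key", F.date_format("order_date", "yyyyMMdd").cast("int"))
)

# Fact: measures keyed to the date dimension (star schema).
fact_sales = (
    raw.withColumn("date_key",
                   F.date_format(F.to_date("order_ts"), "yyyyMMdd").cast("int"))
       .select("date_key", "customer_id", "amount")
)

fact_sales.write.mode("overwrite").parquet(
    "abfss://curated@myaccount.dfs.core.windows.net/fact_sales/"
)
```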
Otina Brian (@justotina):

Delighted to share my PySpark journey so far! The past 3 days have been spent immersing myself in some of the important concepts, starting from ingestion to complex transformations and optimizations.
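
As a rough illustration of that ingestion-to-optimization flow, here is a hedged PySpark sketch; the file paths and column names are invented for the example, and the optimizations shown (broadcast join, caching, partitioned writes) are common techniques, not necessarily the exact ones the author used.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F
from pyspark.sql.functions import broadcast

spark = SparkSession.builder.appName("pyspark_journey_sketch").getOrCreate()

# Ingestion: read raw CSVs (paths are placeholders).
orders = spark.read.option("header", True).csv("/data/raw/orders.csv")
countries = spark.read.option("header", True).csv("/data/raw/countries.csv")

# Transformation: cast, filter, and aggregate daily revenue per country.
daily = (
    orders.withColumn("amount", F.col("amount").cast("double"))
          .filter(F.col("amount") > 0)
          .groupBy("country_code", F.to_date("order_ts").alias("day"))
          .agg(F.sum("amount").alias("revenue"))
)

# Optimizations: broadcast the small dimension table to skip a shuffle,
# and cache the result if it is reused downstream.
enriched = daily.join(broadcast(countries), "country_code").cache()

# Partitioned output lets later reads prune by day.
enriched.write.mode("overwrite").partitionBy("day").parquet("/data/curated/daily_revenue/")
```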
Raul Junco (@rauljuncov):

My Kafka is better than yours.

It doesn’t need Zookeeper.
It doesn’t hog disk space.
It doesn’t wake me up at 2AM.

Okay, full disclosure, it’s not Kafka.
It’s Bufstream: a Kafka-compatible queue that’s ~8x cheaper to operate.

Been testing Bufstream recently, and it flips the …
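
Since Bufstream is Kafka-compatible, a stock Kafka client should connect to it unchanged. A minimal Python sketch using confluent-kafka follows; the broker address and topic name are placeholder assumptions.

```python
from confluent_kafka import Producer

# A Kafka-compatible broker endpoint; "localhost:9092" is a made-up example.
producer = Producer({"bootstrap.servers": "localhost:9092"})

def on_delivery(err, msg):
    # Report per-message delivery success or failure.
    if err is not None:
        print(f"delivery failed: {err}")
    else:
        print(f"delivered to {msg.topic()}[{msg.partition()}] @ offset {msg.offset()}")

producer.produce("demo-topic", key="k1", value="hello from a Kafka client",
                 callback=on_delivery)
producer.flush()  # block until all queued messages are delivered
```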
Otina Brian (@justotina):

I just finished my data pipeline.
I automated live exchange rates from an API,
created custom Operators to fetch, transfer and load data to S3, and wrote SQL queries to transform the data into relational format in Snowflake.
I also created DAGs with email alerts
👇🏽
github.com/otinabrayo/Air…
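
For readers curious what such a pipeline can look like, here is a minimal, hypothetical sketch of an Airflow custom operator that fetches exchange rates and lands them in S3, wired into a DAG with email alerting on failure. It assumes Airflow 2.x with the Amazon provider package installed; the API URL, bucket name, and email address are placeholders, and the real project lives in the linked repository.

```python
import json
import urllib.request
from datetime import datetime

from airflow import DAG
from airflow.models.baseoperator import BaseOperator
from airflow.providers.amazon.aws.hooks.s3 import S3Hook


class ExchangeRatesToS3Operator(BaseOperator):
    """Fetch live exchange rates from an API and upload them to S3."""

    template_fields = ("key",)  # allow Jinja templating of the S3 key

    def __init__(self, api_url, bucket, key, aws_conn_id="aws_default", **kwargs):
        super().__init__(**kwargs)
        self.api_url = api_url
        self.bucket = bucket
        self.key = key
        self.aws_conn_id = aws_conn_id

    def execute(self, context):
        # Fetch the rates payload and write it to S3 as-is.
        with urllib.request.urlopen(self.api_url) as resp:
            rates = json.load(resp)
        S3Hook(aws_conn_id=self.aws_conn_id).load_string(
            json.dumps(rates), key=self.key, bucket_name=self.bucket, replace=True
        )


with DAG(
    dag_id="exchange_rates_pipeline",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
    # Email alerting on task failure, as the tweet describes.
    default_args={"email": ["me@example.com"], "email_on_failure": True},
) as dag:
    fetch = ExchangeRatesToS3Operator(
        task_id="fetch_rates_to_s3",
        api_url="https://api.example.com/latest",  # placeholder endpoint
        bucket="my-rates-bucket",                  # placeholder bucket
        key="raw/rates_{{ ds }}.json",             # templated per run date
    )
```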