Kal (@kalamiai)'s Twitter Profile
Kal

@kalamiai

Trying to build 10 AI apps in 2024. 1.5/10 done

ID: 1679313288645996548

http://kalami.ai · Joined 13-07-2023 02:14:33

138 Tweets

635 Followers

2.2K Following

UNSTREET (@unstreet_)'s Twitter Profile Photo

🎊 Round 1 of our physical-authentication launch celebration 🎊

👟 We're giving away a Nike Air Jordan 1 “Lost & Found”
to 3 winners in the size of your choice‼️

📱 How to enter
① Follow both the UNSTREET (@unstreet_) and CheckGoods (@CheckGoods) accounts on Twitter
② Retweet this tweet
③ Download the UNSTREET app and register an account

⏰ Entry period
Until 11/27, 23:59
John Cutler (@johncutlefish)'s Twitter Profile Photo

Your team is burnt out. They are not getting anything done. Work is "low quality". You can see and feel those things.

But what you are seeing is an output of something—the downstream effects of other things happening. 

In some companies this is a black box 

1/n
Crypto Adventure (@cryptoadventure)'s Twitter Profile Photo

🗞️ #PR: Neurahub Presents New Telegram App Powered by Generative AI Technology

🧠 Neurahub, a leading generative #AI startup, has recently announced the imminent launch of their revolutionary #Telegram app.

#crypto #defi #nfts #nft #metaverse
cryptoadventure.com/neurahub-prese…

Unify (@letsunifyai)'s Twitter Profile Photo

Paper Reading Announcement 📢

This week we are joined by Yuxian Gu (@gu_yuxian), who is presenting his work, Knowledge Distillation of Large Language Models.

This paper presents a knowledge distillation approach tailored for language models, utilizing reverse Kullback-Leibler divergence.
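For context on the reverse-KL objective the announcement mentions: unlike the forward KL used in standard distillation, reverse KL is mode-seeking, so the student concentrates on the teacher's high-probability outputs rather than spreading mass over the teacher's whole distribution. Below is a minimal sketch of just the divergence term, assuming PyTorch and per-position logits; the function name and tensor shapes are illustrative, and this is not the paper's full training procedure:

import torch
import torch.nn.functional as F

def reverse_kl(student_logits, teacher_logits):
    # Reverse KL per position, averaged: D_KL(q_student || p_teacher)
    # = E_q[log q - log p]. Mode-seeking, unlike the forward KL
    # E_p[log p - log q] used in classic distillation.
    log_q = F.log_softmax(student_logits, dim=-1)  # student log-probs
    log_p = F.log_softmax(teacher_logits, dim=-1)  # teacher log-probs
    return (log_q.exp() * (log_q - log_p)).sum(dim=-1).mean()

# Illustrative usage: 4 token positions over a 32k vocabulary.
student = torch.randn(4, 32000)
teacher = torch.randn(4, 32000)
loss = reverse_kl(student, teacher)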
Kal (@kalamiai)'s Twitter Profile Photo

It's a journey building products and features. It's hard to tell if what you're building is useful, needed, or even wanted, especially in a brand-new field. That doesn't mean you don't try to iterate or improve; it just means you get a little bit of feedback before you go all in.