Andrew Ng (@AndrewYNg)'s Twitter Profile
Andrew Ng

@AndrewYNg

Co-Founder of Coursera; Stanford CS adjunct faculty. Former head of Baidu AI Group/Google Brain. #ai #machinelearning #deeplearning #MOOCs

ID:216939636

Website: http://www.andrewng.org · Joined: 18-11-2010 03:39:11

1.6K Tweets

1.0M Followers

928 Following

Andrew Ng (@AndrewYNg):

A good way to get started in AI is to start with coursework, which gives a systematic way to gain knowledge, and then to work on projects. For many who hear this advice, “projects” may evoke a significant undertaking that delivers value to users. But I encourage you to set a

Andrew Ng (@AndrewYNg):

Learn to deploy AI models to edge devices in our new short course Introduction to On-Device AI, created with Qualcomm and taught by Senior Director of Engineering Krishna Sridhar.

I think on-device (edge) AI is an important technology trend that's enabling new low latency,

Andrew Ng (@AndrewYNg):

This week, Google announced a doubling of Gemini Pro 1.5's input context window from 1 million to 2 million tokens, and OpenAI released GPT-4o, which generates tokens 2x faster and 50% cheaper than GPT-4 Turbo and natively accepts and generates multimodal tokens. I view these

Andrew Ng (@AndrewYNg):

New agentic short course! Multi AI Agent Systems with crewAI, built with crewAI's founder and CEO João Moura. In this course, you'll learn how to break complex tasks down into subtasks to be executed by multiple AI agents, each playing a specialized role.

For example, to
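The role-per-agent decomposition described above can be sketched in plain Python. This is a hypothetical toy, not crewAI's actual API: each "agent" is a stub function standing in for an LLM call, so only the orchestration pattern is visible.

```python
# Toy multi-agent pipeline: a complex task is split into subtasks,
# each handled by an agent with a specialized role. Real frameworks
# like crewAI wrap LLM calls; stubs keep the control flow visible.

def researcher(topic):
    # Stand-in for an LLM call that gathers raw notes on a topic.
    return f"notes on {topic}"

def writer(notes):
    # Stand-in for an LLM call that drafts text from the notes.
    return f"draft based on: {notes}"

def editor(draft):
    # Stand-in for an LLM call that polishes the draft.
    return f"final: {draft}"

def run_crew(topic):
    """Run the agents in sequence, passing each output to the next role."""
    notes = researcher(topic)
    draft = writer(notes)
    return editor(draft)

result = run_crew("AI agents")
```

In a real agentic framework, each role would carry its own prompt, tools, and memory; the sequential hand-off shown here is the simplest possible topology.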

Andrew Ng (@AndrewYNg):

Congratulations to all my Google friends for the cool announcements at I/O!

I'm personally looking forward to Gemini with 2 million token input context window and better support for on-device AI -- should open up new opportunities for application builders!

Andrew Ng (@AndrewYNg):

Congrats to OpenAI for the release of GPT-4o! 2x faster and 50% cheaper tokens will be great for everyone using agentic AI workflows.

When an agentic job that used to take 10min now takes 5min just by switching APIs, that's great progress!

Andrew Ng (@AndrewYNg):

New short course: Building Multimodal Search and RAG, by Weaviate's Sebastian Witalec.

Contrastive learning is used to train models to map vectors into an embedding space by pulling similar concepts closer together and pushing dissimilar concepts away from each other. This
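The pulling-together / pushing-apart geometry described above can be checked with cosine similarity. The 3-d vectors below are made up for illustration; real contrastively trained embeddings have hundreds of dimensions.

```python
import math

def cosine(u, v):
    """Cosine similarity: 1.0 for identical directions, -1.0 for opposite."""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

# Toy embeddings: after contrastive training, similar concepts
# ("cat", "kitten") should point in similar directions, while a
# dissimilar concept ("car") points elsewhere.
cat    = [0.9, 0.1, 0.0]
kitten = [0.8, 0.2, 0.1]
car    = [0.0, 0.1, 0.95]

assert cosine(cat, kitten) > cosine(cat, car)
```

This is the property multimodal search relies on: a text query and a matching image land near each other in the shared embedding space, so nearest-neighbor lookup retrieves them together.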

Andrew Ng (@AndrewYNg):

Last week, I spoke about AI and regulations at an event at the U.S. Capitol attended by legislative and business leaders. I’m encouraged by the progress the open source community has made fending off regulations that would have stifled innovation. But opponents of open source are

Andrew Ng (@AndrewYNg):

I’m excited to kick off the first of our short courses focused on agents, starting with Building Agentic RAG with LlamaIndex, taught by Jerry Liu, CEO of LlamaIndex 🦙.

This covers an important shift in RAG (retrieval augmented generation), in which rather than having the

Andrew Ng (@AndrewYNg):

Have you used quantization with an open source machine learning library, and wondered how quantization works? How can you preserve model accuracy as you compress from 32 bits to 16, 8, or even 2 bits? In our new short course, Quantization in Depth, taught by Hugging Face's
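As a rough sketch of what such a library does under the hood (not any particular library's actual implementation), symmetric linear quantization maps each float weight to a small signed integer through a single scale factor:

```python
def quantize(xs, bits=8):
    """Symmetric linear quantization: map floats to signed integers."""
    qmax = 2 ** (bits - 1) - 1            # e.g. 127 for 8-bit
    scale = max(abs(x) for x in xs) / qmax
    q = [round(x / scale) for x in xs]
    return q, scale

def dequantize(q, scale):
    """Recover approximate floats from the integer codes."""
    return [v * scale for v in q]

weights = [0.5, -1.27, 0.03, 1.0]
q, scale = quantize(weights, bits=8)
recon = dequantize(q, scale)
# Reconstruction error is bounded by half a quantization step; at
# 2 bits the grid is so coarse that accuracy-preserving tricks
# (per-channel scales, outlier handling) become essential.
```

Per-tensor scaling like this is the simplest scheme; the course's deeper techniques exist precisely because one scale per tensor wastes precision when a few outlier weights dominate the range.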

Andrew Ng (@AndrewYNg):

Inexpensive token generation and agentic workflows for large language models (LLMs) open up intriguing new possibilities for training LLMs on synthetic data. Pretraining an LLM on its own directly generated responses to prompts doesn't help. But if an agentic workflow implemented

Andrew Ng (@AndrewYNg):

Chatting with @GroqInc’s CEO @JonathanRoss321. Groq has super-fast token generation capabilities now. I was also excited to hear about his plans to scale up capacity aggressively and to expand to models beyond LLMs! This is a good time to be building AI
Andrew Ng (@AndrewYNg):

In Prompt Engineering for Vision Models, taught by Abby Morgan, Jacques Verré, and Caleb Kaiser of Comet, you’ll learn how to prompt and fine-tune vision models for personalized image generation, image editing, object detection and segmentation. The prompts you'll use for

Andrew Ng (@AndrewYNg):

I've really enjoyed using @crewAIInc 's tools to build multiagent AI systems -- in addition to being productive, it's also fun to use! It was great hanging out with its creator @joaomdmoura to chat about best practices for building agentic workflows.
Andrew Ng (@AndrewYNg):

Much has been said about many companies’ desire for more compute (as well as data) to train larger foundation models. I think it’s under-appreciated that we also have nowhere near enough compute available for inference on foundation models.

Years ago, when I was leading teams

Andrew Ng (@AndrewYNg):

New short course with Mistral AI!

Mistral's open-source Mixtral 8x7B model uses a 'mixture of experts' (MoE) architecture. Unlike a standard transformer, an MoE model has multiple expert feed-forward networks (8 in this case), with a gating network selecting two experts at
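A toy sketch of the top-2 gating described above (not Mixtral's actual implementation): the gate scores every expert, keeps the two highest, renormalizes their scores with a softmax, and mixes only those two experts' outputs.

```python
import math

def softmax(xs):
    """Numerically stable softmax over a list of scores."""
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]

def moe_layer(x, experts, gate_scores, k=2):
    """Top-k mixture of experts: run only the k highest-scoring
    experts and combine their outputs by renormalized gate weights."""
    top = sorted(range(len(experts)), key=lambda i: gate_scores[i],
                 reverse=True)[:k]
    weights = softmax([gate_scores[i] for i in top])
    return sum(w * experts[i](x) for w, i in zip(weights, top))

# Eight toy "experts": each just scales its input (real experts are
# feed-forward networks). Gate scores are made up for illustration.
experts = [lambda x, s=s: s * x for s in range(1, 9)]
scores = [0.1, 2.0, 0.3, 1.5, 0.0, 0.2, 0.1, 0.4]
out = moe_layer(10.0, experts, scores, k=2)  # only experts at indices 1 and 3 run
```

The key efficiency point is that only 2 of the 8 expert networks execute per token, so the model's active parameter count per forward pass is a fraction of its total parameter count.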
