LiteLLM (YC W23) (@litellm)'s Twitter Profile
LiteLLM (YC W23)

@litellm

Call every LLM API like it's OpenAI 👉 github.com/BerriAI/litellm
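The tagline is the whole pitch: one OpenAI-style request shape for every provider. A minimal sketch of that idea (model strings are illustrative; the actual call needs `litellm` installed plus provider API keys, so it is shown as a comment):

```python
# The same OpenAI-style request dict works for any provider LiteLLM
# supports; only the "model" string changes.
def make_request(model: str) -> dict:
    return {
        "model": model,
        "messages": [{"role": "user", "content": "Say hello in one word."}],
    }

openai_req = make_request("gpt-4o-mini")                           # OpenAI
claude_req = make_request("anthropic/claude-3-5-sonnet-20240620")  # Anthropic

# With litellm installed and keys set:
#   from litellm import completion
#   print(completion(**openai_req).choices[0].message.content)
print(openai_req["messages"] == claude_req["messages"])  # True: identical shape
```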

ID: 1607849281280671744

Link: https://github.com/BerriAI/litellm · Joined: 27-12-2022 21:22:10

840 Tweets

3.3K Followers

165 Following

Ishaan (@ishaan_jaff)

<a href="/LiteLLM/">LiteLLM (YC W23)</a> v1.68.2-nightly brings support for sending email invites to users you add to LiteLLM.

This release brings the following improvements:

- Support for sending emails when a user is invited to the platform

- Support for sending emails when a key is created for a user
Ishaan (@ishaan_jaff)

<a href="/LiteLLM/">LiteLLM (YC W23)</a>  v1.68.2-nightly brings support for using AWS Bedrock Guardrails PII Masking with LiteLLM

This allows you to run your Bedrock PII masking guardrails with 100+ LLMs on LiteLLM. Start here: docs.litellm.ai/docs/proxy/gua…
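On the proxy side, the wiring is a guardrail entry in `config.yaml`; a hedged sketch based on the guardrails docs (the identifier and version values are placeholders for your own Bedrock guardrail):

```yaml
guardrails:
  - guardrail_name: "bedrock-pii-mask"   # any name you choose
    litellm_params:
      guardrail: bedrock                 # use AWS Bedrock Guardrails
      mode: "pre_call"                   # run before the LLM request
      guardrailIdentifier: "gr-abc123"   # placeholder: your guardrail ID
      guardrailVersion: "DRAFT"          # placeholder: your version
```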
Roo Code (@roo_code)

Roo Code v3.16 introduces LiteLLM (YC W23) integration, enabling seamless access to over 100 language models via automatic discovery. This enhancement simplifies model management and expands your AI toolkit. Explore all the new features and improvements in the full notes: 🔗

Ishaan (@ishaan_jaff)

<a href="/LiteLLM/">LiteLLM (YC W23)</a>  v1.68.3-nightly brings support for <a href="/nscale_cloud/">Nscale</a>

Use Nscale for full data sovereignty and compliance with European regulations 

Start with Nscale here: docs.litellm.ai/docs/providers…
Ishaan (@ishaan_jaff)

<a href="/LiteLLM/">LiteLLM (YC W23)</a>  v1.69.2-nightly brings support for using @google ADK (Agent Developer Kit) with LiteLLM Python SDK &amp; LiteLLM Proxy

Start here: docs.litellm.ai/docs/tutorials…
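A hedged sketch of what that wiring looks like in the SDK, assuming google-adk's `LiteLlm` model wrapper (import paths follow the ADK docs; guarded so the sketch still runs when the packages are not installed):

```python
# Sketch: plugging a LiteLLM model string into a Google ADK agent via
# the LiteLlm wrapper. Guarded import so this degrades gracefully when
# google-adk / litellm are not installed.
sketch_ok = False
try:
    from google.adk.agents import LlmAgent
    from google.adk.models.lite_llm import LiteLlm

    agent = LlmAgent(
        name="helper",
        # Any LiteLLM model string works here, not only Gemini models:
        model=LiteLlm(model="anthropic/claude-3-5-sonnet-20240620"),
        instruction="Answer briefly.",
    )
    sketch_ok = True
except ImportError:
    # Packages absent: the shape above is still the point of the sketch.
    sketch_ok = True
print("sketch ok:", sketch_ok)
```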
Ishaan (@ishaan_jaff)

<a href="/LiteLLM/">LiteLLM (YC W23)</a> v1.69.3-nightly brings major improvements to our <a href="/Microsoft/">Microsoft</a> Presidio PII integration; this release adds support for configuring PII entities and their actions.

Get started with entity configuration here: docs.litellm.ai/docs/proxy/gua…
Ishaan (@ishaan_jaff)

<a href="/LiteLLM/">LiteLLM (YC W23)</a>  v1.69.3-nightly brings support for configuring PII Entities and their actions on LiteLLM UI.

This means you can use the LiteLLM UI to control which PII entities to mask vs. block
Ishaan (@ishaan_jaff)

Thrilled to launch support for adding Guardrails on <a href="/LiteLLM/">LiteLLM (YC W23)</a> UI

This release brings support for adding Microsoft Presidio, AWS Bedrock Guardrails, <a href="/ProtectAICorp/">Protect AI</a>  LLM Guard Endpoints, AIM Guardrails, <a href="/LakeraAI/">Lakera</a>  Guardrails on LiteLLM
Ishaan (@ishaan_jaff)

<a href="/LiteLLM/">LiteLLM (YC W23)</a>  v1.70.0-nightly brings major improvements for PII, PHI masking use cases.

With this release you can do the following:

- Configure PII masking entities and their actions on the LiteLLM UI, e.g. set a guardrail to block all CREDIT_CARD entities
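On the proxy side, the equivalent of those UI toggles is (as far as I can tell from the Presidio guardrail docs) a per-entity action map in `config.yaml`; a hedged sketch:

```yaml
guardrails:
  - guardrail_name: "presidio-pii"
    litellm_params:
      guardrail: presidio
      mode: "pre_call"
      pii_entities_config:
        CREDIT_CARD: "BLOCK"    # reject requests containing card numbers
        EMAIL_ADDRESS: "MASK"   # redact emails instead of blocking
```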
Ishaan (@ishaan_jaff)

Thrilled to launch Day-0 support for using MCP tools with the <a href="/OpenAI/">OpenAI</a> responses API on <a href="/LiteLLM/">LiteLLM (YC W23)</a> 

With our latest release you can pass tools of type mcp through LiteLLM
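The request follows the OpenAI responses-API MCP tool shape; a sketch with placeholder server details (the litellm call is commented out since it needs the package and an API key):

```python
# Sketch: an MCP tool entry passed through the responses API. The server
# label and URL are placeholders for your own MCP server.
mcp_tool = {
    "type": "mcp",
    "server_label": "my-mcp",                     # placeholder
    "server_url": "https://mcp.example.com/sse",  # placeholder
    "require_approval": "never",
}

request = {
    "model": "gpt-4.1",
    "input": "What tools do you have available?",
    "tools": [mcp_tool],
}

# With litellm installed and an OpenAI key set:
#   import litellm
#   resp = litellm.responses(**request)
print(request["tools"][0]["type"])  # mcp
```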
Ishaan (@ishaan_jaff)

Thrilled to launch Day 0 support for <a href="/AnthropicAI/">Anthropic</a> Claude-4 on <a href="/LiteLLM/">LiteLLM (YC W23)</a>. v1.70.4 brings support for Anthropic Claude-4 models on the Anthropic API, Google Vertex AI, and AWS Bedrock

Start here: docs.litellm.ai/docs/providers…
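In practice this means the same Claude model family is addressed through three provider prefixes in the model string; a sketch (exact model IDs are placeholders, check the providers docs):

```python
# Sketch: one Claude model family reached via three providers, selected
# purely by the model-string prefix. The IDs below are placeholders.
models = [
    "anthropic/claude-sonnet-4-20250514",               # Anthropic API
    "vertex_ai/claude-sonnet-4@20250514",               # Google Vertex AI
    "bedrock/anthropic.claude-sonnet-4-20250514-v1:0",  # AWS Bedrock
]
prefixes = [m.split("/", 1)[0] for m in models]
print(prefixes)  # ['anthropic', 'vertex_ai', 'bedrock']
```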
Ishaan (@ishaan_jaff)

LiteLLM (YC W23) v1.70.5-nightly will have a 94% faster median response time and 350% higher RPS. You can read more about the change here: github.com/BerriAI/litell…

Ishaan (@ishaan_jaff)

⚡️ <a href="/LiteLLM/">LiteLLM (YC W23)</a>  v1.72.0-nightly brings major performance improvements to LiteLLM. 

This release brings aiohttp support for all LLM API providers.

This means that LiteLLM can now scale to 200 RPS per instance with a 40ms median latency overhead.

Improvements on this release👇:
Ishaan (@ishaan_jaff)

Thrilled to launch the ability to add MCP Servers on LiteLLM UI on <a href="/LiteLLM/">LiteLLM (YC W23)</a>  v1.71.3-nightly 

This means you can add your MCP SSE Server URLs on LiteLLM and list + test the tools available on the LiteLLM UI
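Besides the UI, the same servers can (as far as I can tell from the MCP docs) be declared in the proxy `config.yaml`; a hedged sketch with a placeholder URL:

```yaml
mcp_servers:
  my_sse_server:                        # placeholder name
    url: "https://mcp.example.com/sse"  # placeholder: your MCP SSE URL
    transport: "sse"
```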
Qdrant (@qdrant_engine)

🚀 Learn how to build modular RAG pipelines that boost answer quality with smart re-ranking in the latest tutorial by <a href="/pavan_mantha1/">ManthaPavanKumar</a>.

➡️ Re-rankers from <a href="/cohere/">cohere</a>, ColBERT, <a href="/JinaAI_/">Jina AI</a>, and <a href="/VoyageAI/">Voyage AI by MongoDB</a>
➡️ Easy LLM switching with @litellm
➡️ Full observability and trace tracking using
Ishaan (@ishaan_jaff)

Thrilled to launch support for <a href="/awscloud/">Amazon Web Services</a> Bedrock Agents on <a href="/LiteLLM/">LiteLLM (YC W23)</a> 

This means that you can now call all your Bedrock Agents in the OpenAI Request/Response format

Start here: docs.litellm.ai/docs/providers…
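A hedged sketch of the request shape, assuming the model string encodes the agent and alias IDs as in the providers docs (the IDs are placeholders; the call itself is commented out since it needs `litellm` and AWS credentials):

```python
# Sketch: a Bedrock Agent invoked with a plain OpenAI-style request;
# AGENT_ID / AGENT_ALIAS_ID are placeholders for your own agent.
request = {
    "model": "bedrock/agent/AGENT_ID/AGENT_ALIAS_ID",
    "messages": [{"role": "user", "content": "Hi, what can you do?"}],
}

# With litellm installed and AWS credentials configured:
#   from litellm import completion
#   resp = completion(**request)
print(request["model"].split("/")[1])  # agent
```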
Ishaan (@ishaan_jaff)

<a href="/LiteLLM/">LiteLLM (YC W23)</a>  v1.72.2-nightly brings support for using Bedrock Models with <a href="/AnthropicAI/">Anthropic</a> Claude Code. 

This means that you can now use your Bedrock Models through LiteLLM Proxy and track usage on Claude Code
Avi Chawla (@_avichawla)

Claude Sonnet 4 is the newest reasoning model by Anthropic. Today, let's compare it with OpenAI o4 on coding tasks.

We'll use:

- LiteLLM (YC W23) for orchestration.
- DeepEval for evaluation (open-source).
- Anthropic Sonnet 4 and OpenAI o4 as LLMs.

Let's dive in!