Suppose we're building a ChatGPT app and the user asks "How many ₺ is $25?". Now suppose our back-end has a function 'currency_rate(symbol: string)'. In other words, calling `currency_rate('USDTRY')` returns 23.60.
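A minimal sketch of how that could fit together. The schema is the kind of JSON Schema object you'd pass to the Chat Completions API in the `functions` list; `currency_rate` is a stand-in with a hardcoded rate, and the arguments string mimics what the model would return in its `function_call`:

```python
import json

# Hypothetical backend function; a real app would call an FX rate API.
def currency_rate(symbol: str) -> float:
    rates = {"USDTRY": 23.60}  # hardcoded for illustration
    return rates[symbol]

# Function schema we would pass to the API via `functions=[...]`.
CURRENCY_RATE_SCHEMA = {
    "name": "currency_rate",
    "description": "Get the current exchange rate for a currency pair.",
    "parameters": {
        "type": "object",
        "properties": {
            "symbol": {
                "type": "string",
                "description": "Currency pair, e.g. 'USDTRY'",
            }
        },
        "required": ["symbol"],
    },
}

# Given the JSON arguments the model produced for the function call,
# run our backend function and format the answer.
def answer(amount_usd: float, function_call_arguments: str) -> str:
    args = json.loads(function_call_arguments)
    rate = currency_rate(args["symbol"])
    return f"{amount_usd} USD = {amount_usd * rate:.2f} TRY"

print(answer(25, '{"symbol": "USDTRY"}'))  # 25 USD = 590.00 TRY
```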
Not to overhype the new OpenAI APIs too much, but it looks like they can still hallucinate invalid JSON and parameters.
I thought there would be some sort of built-in API-level guardrails to automatically enforce JSON shape and parameters. You'll still need to implement application-side validation.
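One way that application-side validation might look, assuming you know the required and allowed parameter names for each of your functions (the helper name here is made up for illustration):

```python
import json

# The model can return arguments that are not valid JSON, or JSON with
# missing or hallucinated fields, so validate before calling anything.
def parse_function_args(raw: str, required: set, allowed: set):
    """Return (args, None) on success or (None, error_message) on failure."""
    try:
        args = json.loads(raw)
    except json.JSONDecodeError:
        return None, "arguments were not valid JSON"
    if not isinstance(args, dict):
        return None, "arguments were not a JSON object"
    missing = required - args.keys()
    if missing:
        return None, f"missing required parameters: {sorted(missing)}"
    extra = args.keys() - allowed
    if extra:
        return None, f"hallucinated parameters: {sorted(extra)}"
    return args, None

# Well-formed arguments pass through.
args, err = parse_function_args('{"symbol": "USDTRY"}', {"symbol"}, {"symbol"})
print(args, err)  # {'symbol': 'USDTRY'} None

# An invented parameter gets rejected instead of reaching your backend.
args, err = parse_function_args('{"symbol": "USDTRY", "foo": 1}', {"symbol"}, {"symbol"})
print(err)  # hallucinated parameters: ['foo']
```

For anything beyond trivial schemas, a proper validator library against the same JSON Schema you sent the model is the less error-prone route.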
Oh wow, what an update, this is amazing. You can now define "functions" in the OpenAI API. It decides on its own, from the prompt, how to call the functions you've defined, and returns you a message of type function_call. Then in your own backend…
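The backend half of that loop might be sketched like this: when the assistant message carries a `function_call`, you execute the matching function locally and build the `role="function"` message you'd send back for the model to produce its final answer. The assistant message below is simulated, and `currency_rate` is a hypothetical local implementation:

```python
import json

# Simulated assistant response of the shape the API returns when it
# decides to call one of your declared functions.
assistant_message = {
    "role": "assistant",
    "content": None,
    "function_call": {
        "name": "currency_rate",
        "arguments": '{"symbol": "USDTRY"}',
    },
}

# Hypothetical registry of locally implemented functions.
def currency_rate(symbol: str) -> float:
    return {"USDTRY": 23.60}[symbol]

FUNCTIONS = {"currency_rate": currency_rate}

def handle(message: dict):
    """If the assistant asked for a function call, run it in our backend
    and return the role="function" message to append to the conversation."""
    call = message.get("function_call")
    if call is None:
        return None  # ordinary text reply, nothing to execute
    args = json.loads(call["arguments"])
    result = FUNCTIONS[call["name"]](**args)
    return {"role": "function", "name": call["name"], "content": json.dumps(result)}

print(handle(assistant_message))
# {'role': 'function', 'name': 'currency_rate', 'content': '23.6'}
```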
yep, it works. here is an example of extracting structured data from an LAPD newsroom article. seems like it needs fewer tokens to get the same quality of results. gist.github.com/kylemcdonald/d… left: input, right: output
This is a really clever way of using the new OpenAI Functions mechanism without actually executing any functions. Instead, Kyle is using it to provide a schema describing the data to scrape from an article, then letting a ChatGPT API call return that exact scraped data as JSON.
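The trick, roughly: declare a "function" whose parameters are the fields you want extracted, force the model to call it, and treat the arguments string as your structured output; the function itself is never run. The schema and field names below are hypothetical, loosely modeled on an incident article, and `raw_arguments` stands in for what the API would return in `message["function_call"]["arguments"]`:

```python
import json

# A "function" that exists only to describe the extraction schema.
EXTRACT_SCHEMA = {
    "name": "extract_incident",
    "description": "Extract structured data from a police newsroom article.",
    "parameters": {
        "type": "object",
        "properties": {
            "date": {"type": "string", "description": "Incident date, ISO 8601"},
            "location": {"type": "string"},
            "summary": {"type": "string"},
        },
        "required": ["date", "location", "summary"],
    },
}

# With function_call={"name": "extract_incident"} in the request, the model
# must "call" it, and its arguments are the scraped data as JSON.
raw_arguments = '{"date": "2023-06-13", "location": "Los Angeles", "summary": "..."}'
record = json.loads(raw_arguments)
print(record["location"])  # Los Angeles
```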
I used to write Java at my prev job, and I feel bad for Java software engineers. The environment is so heavy: WAR files? And the frameworks are so heavy, when it's really just HTTP, an endpoint that takes some input and returns some JSON. Everything is a subclass of a subclass, GC sucks, compiling is slow.
the thing you need to understand about people getting excited about AI: the ones who are building things are actually just excited that they can transform tweets into JSON