GPT-4 Turbo, the latest and most capable language model from OpenAI, is here.
Just eight months after the release of GPT-4, OpenAI dropped an updated model, GPT-4 Turbo, with a 128K-token context window (enough to fit the text of a 300-page book in a single prompt) and cheaper API access.
Here are the key features of GPT-4 Turbo:
GPT-4 Turbo is now the default model in ChatGPT for paid users. If your OpenAI account already has GPT-4 access, you can also try the new model by selecting gpt-4-1106-preview in the Playground.
GPT-4 Turbo is available for all paying developers to try by passing gpt-4-1106-preview as the model name in the API. Here’s a sample chat completion request in JavaScript:
import OpenAI from "openai";

const openai = new OpenAI(); // reads OPENAI_API_KEY from the environment

async function main() {
  const completion = await openai.chat.completions.create({
    model: "gpt-4-1106-preview",
    messages: [
      { role: "system", content: "You are a helpful assistant." },
      { role: "user", content: "Hello!" },
    ],
  });

  console.log(completion.choices[0].message);
}

main();
Here’s how to do it in Python:
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

completion = client.chat.completions.create(
    model="gpt-4-1106-preview",
    messages=[
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Hello!"},
    ],
)

print(completion.choices[0].message)
As a developer, the reduced pricing is the update I am most excited about. Compared to GPT-4, OpenAI has cut the price by 3X for input tokens and 2X for output tokens. This makes the new model much more accessible for smaller developers and startups.
GPT-4 Turbo API price: $0.01 per 1K input tokens, $0.03 per 1K output tokens.
Previous GPT-4 API price: $0.03 per 1K input tokens, $0.06 per 1K output tokens (for the 8K-context model).
If you’re curious what a token is: tokens are pieces of words used for natural language processing. For English text, one token is roughly four characters, or about 0.75 words.
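To make the pricing concrete, here is a minimal sketch of estimating a request’s cost by counting tokens with the tiktoken library, assuming the prices listed above; the prompt text and the expected output length are placeholders:
import tiktoken

# GPT-4 models use the cl100k_base tokenizer
encoding = tiktoken.get_encoding("cl100k_base")

prompt = "You are a helpful assistant."  # placeholder prompt
input_tokens = len(encoding.encode(prompt))
expected_output_tokens = 500  # rough guess at the reply length

# Assumed prices: $0.01 per 1K input tokens, $0.03 per 1K output tokens
estimated_cost = input_tokens / 1000 * 0.01 + expected_output_tokens / 1000 * 0.03
print(f"{input_tokens} input tokens, estimated cost: ${estimated_cost:.4f}")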
Also, ChatGPT API access is billed separately from the ChatGPT Plus subscription. You can monitor your consumption on the usage page of your OpenAI account.
In the latest ChatGPT user interface, you might notice that the dropdown for picking which tool to use is gone. The model picker now shows only three options: GPT-4, GPT-3.5, and Plugins.
GPT-4 Turbo now automatically picks the right tools for you.
“We heard your feedback. That model picker was extremely annoying” — Sam Altman
For example, if you ask the AI for an image, it is now smart enough to call DALL-E 3 on its own.
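On the API side, the closest analogue to this behavior is tool calling, where the model decides for itself whether to invoke a tool you describe. Here is a minimal sketch; the get_weather function and its parameters are hypothetical, and tool_choice="auto" leaves the decision to the model:
from openai import OpenAI

client = OpenAI()

# A hypothetical tool the model may choose to call
tools = [
    {
        "type": "function",
        "function": {
            "name": "get_weather",
            "description": "Get the current weather for a city",
            "parameters": {
                "type": "object",
                "properties": {"city": {"type": "string"}},
                "required": ["city"],
            },
        },
    }
]

completion = client.chat.completions.create(
    model="gpt-4-1106-preview",
    messages=[{"role": "user", "content": "What's the weather in Paris?"}],
    tools=tools,
    tool_choice="auto",  # let the model decide whether to call the tool
)

# If the model decided to call the tool, the call shows up here
print(completion.choices[0].message.tool_calls)
If the model chooses the tool, your code runs the function and sends the result back in a follow-up message; otherwise it simply replies with text.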
Overall, I am happy to see the rapid innovation OpenAI is doing with its language models. They are undoubtedly exciting, offering a range of possibilities for innovative GPT-based applications.
However, it’s interesting to reflect on OpenAI’s strategic approach. Initially, OpenAI released their API and let developers build and innovate, with those developers essentially taking on the risks of early adoption and user engagement. This proved to be a smart tactic for OpenAI, as it not only fostered a diverse ecosystem of applications but also provided insight into the most in-demand features.
Now, OpenAI seems to be selectively integrating these popular features directly into their platform, effectively cherry-picking the best products and services developed by the community.
OpenAI is squeezing out smaller startups and developers—like me.
So, we have no choice but to come up with new app ideas instead of competing with the big guys.