About Some Hyped ML Stuff

Honestly, I know most of us are pretty much fed up with reading about ChatGPT.

I myself tried not to jump on it too soon just because everyone says it's cool.

Also, there are dark sides.

But, to keep it technical: if you want to try it with textual prompts, of course you want a way to run them from the terminal.

Of course, there are dozens of Python implementations out there.

As it turns out, most of them are pretty awful: they either have brittle, impossible-to-satisfy dependencies, or end up being interactive-only.

I despise having to install a virtualenv for something as simple as an API call; that requirement alone sounds like a warning sign.

Half an hour spent trying to install some of this stuff, and I was back typing in the browser...

It does not need to be so difficult. If you ask GPT itself, you get an answer that seems correct; it just does not work.
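One common reason a seemingly correct curl one-liner fails is quoting: interpolating the prompt straight into a JSON string breaks as soon as the prompt contains a double quote. A minimal sketch of the safe approach, letting jq do the escaping (the variable names here are just for illustration):

```shell
# A prompt with embedded quotes would break naive string interpolation
# into a JSON body; jq --arg escapes it correctly.
prompt='say "hi" to me'
payload=$(jq -n --arg p "$prompt" \
  '{model: "gpt-3.5-turbo", messages: [{role: "user", content: $p}]}')

# Round-trip: extract the content back out to show it survived intact
echo "$payload" | jq -r '.messages[0].content'
```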

This is the way to do it in a simple shell script:

#!/bin/bash

# Read the API key (api_key=...) from the config file
source ~/.chagpt.cfg

workfile=$(mktemp /tmp/chatgpt-XXXXXXXX)

# Build the JSON payload with jq so quotes in the prompt are escaped properly
payload=$(jq -n --arg prompt "$*" \
  '{model: "gpt-3.5-turbo", messages: [{role: "user", content: $prompt}]}')

curl -s --location --request POST 'https://api.openai.com/v1/chat/completions' \
  --header "Authorization: Bearer ${api_key}" \
  --header 'Content-Type: application/json' \
  --data-raw "$payload" |
  jq -r '.choices[0].message.content' > "${workfile}"

response=$(cat "${workfile}")
echo "$response"
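To see what the jq filter at the end is doing, here is a response stub shaped like the API's chat completion output (structure assumed from the fields the script reads):

```shell
# Minimal stand-in for the API response, showing how the message text
# is pulled out of the choices array with jq
response='{"choices":[{"index":0,"message":{"role":"assistant","content":"Hello there!"}}]}'
echo "$response" | jq -r '.choices[0].message.content'
# prints: Hello there!
```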

The only dependencies are:

  • curl
  • jq

That does not seem like too much to ask.
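If you want the script to fail with a readable message instead of a cryptic one, a small guard like this (the function name is just a suggestion) checks both tools up front:

```shell
# Print a warning for every required command that is not on PATH
check_deps() {
  for dep in "$@"; do
    command -v "$dep" >/dev/null 2>&1 || echo "missing: $dep"
  done
}

check_deps curl jq
```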

Needless to say, in ~/.chagpt.cfg you will have a config in the format:

api_key="your-key-here"

In order to get your API key, apparently you need to go here.

[api] [python]