[AI] Llama 3 API Example with Python


Post by dev »

The first thing you need to do is install the groq Python package.

Step 1: Install the groq package

Code: Select all

pip install groq
or

Code: Select all

pip3 install groq

Sample output (if pip is not on your PATH, use pip3 instead):

Code: Select all

% pip install groq
zsh: command not found: pip
% pip3 install groq
Collecting groq
  Using cached groq-0.5.0-py3-none-any.whl (75 kB)
...
...
Collecting annotated-types>=0.4.0
  Using cached annotated_types-0.6.0-py3-none-any.whl (12 kB)
Installing collected packages: typing-extensions, sniffio, idna, h11, exceptiongroup, distro, certifi, annotated-types, pydantic-core, httpcore, anyio, pydantic, httpx, groq
Successfully installed
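You can optionally confirm that the package imports before moving on (the one-liner below is just a quick sanity check):

Code: Select all

% python3 -c "import groq; print(groq.__name__)"
groq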
Step 2: Generate your Groq API Key

Next, generate your API Key at https://console.groq.com/keys

Step 3: Add your Groq API key as an environment variable.

Code: Select all

export GROQ_API_KEY=gsk_6sVS2dqG8szzxxxxxxxxWGdyb3FYwPpuRTdJxxxxxxx5wrVSM
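The export only applies to the current shell session; if you open a new terminal, export it again (or add the line to your shell profile). If you want, you can confirm Python sees the key with a small check script (just a sketch, not required):

Code: Select all

import os

# The key exported in Step 3 should be visible to Python
api_key = os.environ.get("GROQ_API_KEY")
if not api_key:
    raise SystemExit("GROQ_API_KEY is not set - export it first")
print("GROQ_API_KEY is set")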
Step 4: Python Code to Call Llama 3 models.

Code: Select all

import os

from groq import Groq

# Create the client; it picks up the API key exported in Step 3
client = Groq(
    api_key=os.environ.get("GROQ_API_KEY"),
)

# Send a single-turn chat completion request to the Llama 3 8B model
response = client.chat.completions.create(
    messages=[
        {
            "role": "user",
            "content": "Name the capital of France",
        }
    ],
    model="llama3-8b-8192",
)

# The generated text is in the first choice's message
print(response.choices[0].message.content)
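As a side note, the same endpoint can also stream the reply piece by piece instead of returning it all at once. Below is a minimal sketch; it assumes the stream=True option and the chunk.choices[0].delta.content shape used by OpenAI-compatible clients, so double-check the Groq docs before relying on it:

Code: Select all

import os

from groq import Groq

client = Groq(api_key=os.environ.get("GROQ_API_KEY"))

# stream=True returns an iterator of chunks instead of a single response
stream = client.chat.completions.create(
    messages=[{"role": "user", "content": "Name the capital of France"}],
    model="llama3-8b-8192",
    stream=True,
)

for chunk in stream:
    # Each chunk carries a small piece of the reply; it can be None on the final chunk
    piece = chunk.choices[0].delta.content
    if piece:
        print(piece, end="")
print()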
Step 5: Test your example.

Code: Select all

% python3 example.py                                                          
The capital of France is Paris.
In this example, we used the Llama 3 8B model. If you want to use Llama 3 70B, set the model name to llama3-70b-8192, as in the sketch below.
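Here is the same call pointed at the 70B model, with a couple of optional sampling parameters added (temperature and max_tokens follow the OpenAI-compatible parameter names, so treat them as an assumption and check the model docs for limits):

Code: Select all

import os

from groq import Groq

client = Groq(api_key=os.environ.get("GROQ_API_KEY"))

response = client.chat.completions.create(
    messages=[{"role": "user", "content": "Name the capital of France"}],
    model="llama3-70b-8192",   # larger Llama 3 model
    temperature=0.2,           # lower = more deterministic output
    max_tokens=256,            # cap on generated tokens
)

print(response.choices[0].message.content)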

Read More: https://console.groq.com/docs/models