파일방 · 2026. 2. 4. 21:57

Somehow, by chance(?), I ran a search for the keyword

EXAONE-3.5-7.8B-Instruct-AWQ

and this came up.

 

Quickstart
We recommend using transformers>=4.43 and autoawq>=0.2.7.post3.

Here is the code snippet to run conversational inference with the model:

import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "LGAI-EXAONE/EXAONE-3.5-7.8B-Instruct-AWQ"

model = AutoModelForCausalLM.from_pretrained(
    model_name,
    torch_dtype=torch.bfloat16,
    trust_remote_code=True,
    device_map="auto"
)
tokenizer = AutoTokenizer.from_pretrained(model_name)

# Choose your prompt (only the last uncommented assignment takes effect)
prompt = "Explain how wonderful you are"  # English example
# prompt = "스스로를 자랑해 봐"             # Korean example

messages = [
    {"role": "system", 
     "content": "You are EXAONE model from LG AI Research, a helpful assistant."},
    {"role": "user", "content": prompt}
]
input_ids = tokenizer.apply_chat_template(
    messages,
    tokenize=True,
    add_generation_prompt=True,
    return_tensors="pt"
)

output = model.generate(
    input_ids.to("cuda"),
    eos_token_id=tokenizer.eos_token_id,
    max_new_tokens=128,
    do_sample=False,
)
print(tokenizer.decode(output[0]))
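Note that model.generate() returns the prompt tokens followed by the newly generated tokens, so the decode above prints the prompt along with the reply. A small sketch (assuming that standard generate behavior) of slicing the prompt off before decoding:

```python
# Sketch: generate() output = prompt token ids + new token ids, so slicing
# off the prompt length leaves only the model's reply. Shown with plain
# lists standing in for the tensors so the slicing logic is clear.

def strip_prompt(output_ids, prompt_len):
    """Return only the token ids generated after the prompt."""
    return output_ids[prompt_len:]

# With the quickstart variables this would be roughly:
#   new_ids = strip_prompt(output[0], input_ids.shape[1])
#   print(tokenizer.decode(new_ids, skip_special_tokens=True))

# Toy illustration with stand-in ids:
prompt_ids = [101, 102, 103]
generated = prompt_ids + [201, 202]
print(strip_prompt(generated, len(prompt_ids)))  # → [201, 202]
```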

[Link: https://huggingface.co/LGAI-EXAONE/EXAONE-3.5-7.8B-Instruct-AWQ]

[Link: https://github.com/LG-AI-EXAONE/EXAONE-3.5]

[Link: https://www.lgresearch.ai/blog/view?seq=506]

[Link: https://wikidocs.net/274703]

Posted by 구차니