
Obtained Unexpected Results When Running the Ollama-OpenVINO Docker Container with the Llama-2-7b-chat-hf Model

Content Type: Troubleshooting   |   Article ID: 000102497   |   Last Reviewed: 02/13/2026

Environment

Operating System

Ubuntu 22.04

Description

  • Downloaded and converted the Llama-2-7b-chat-hf model to FP16 via Optimum Intel:
    optimum-cli export openvino --model "meta-llama/Llama-2-7b-chat-hf" --weight-format fp16 --trust-remote-code "Llama-2-7b-chat-hf"
  • Ran the Ollama-OpenVINO Docker container with the converted Llama-2-7b-chat-hf model.
  • No tokens were generated.

Resolution

Download the Llama-2-7b-chat-hf-int4-asym-ov model (an INT4 asymmetric weight-compressed OpenVINO variant) and use it in place of the FP16 model.
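One possible way to obtain the INT4 model is sketched below. The Hugging Face repository ID (OpenVINO/Llama-2-7b-chat-hf-int4-asym-ov) and the output directory names are assumptions; verify the exact model ID on the Hugging Face Hub before downloading. Alternatively, Optimum Intel can produce an INT4 asymmetric export locally:

```shell
# Option 1 (assumed repo ID -- confirm on the Hugging Face Hub):
# download the pre-quantized INT4 asymmetric OpenVINO model
huggingface-cli download OpenVINO/Llama-2-7b-chat-hf-int4-asym-ov \
    --local-dir Llama-2-7b-chat-hf-int4-asym-ov

# Option 2: quantize locally with Optimum Intel.
# With --weight-format int4, asymmetric quantization is the default
# (pass --sym to request symmetric quantization instead).
optimum-cli export openvino --model "meta-llama/Llama-2-7b-chat-hf" \
    --weight-format int4 --trust-remote-code \
    "Llama-2-7b-chat-hf-int4-asym-ov"
```

Note that meta-llama/Llama-2-7b-chat-hf is a gated model, so both options require a Hugging Face account with access granted and an authenticated CLI session (huggingface-cli login).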

Related Products

This article applies to 1 product.