## MavenStyleProject using code reflection with a Java-based ONNX programming model

### ONNX Runtime running a convolutional neural network from Java source

Running the MNIST demo:
```
mvn process-test-classes exec:exec -Dexec.executable=<path to the Babylon JDK home>/bin/java -Dexec.mainClass=oracle.code.onnx.mnist.MNISTDemo
```
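
For reference, a concrete invocation might look like the sketch below. It assumes a locally built Babylon JDK whose image landed in the standard OpenJDK build output directory; adjust the path to your checkout and build configuration:

```
# example path to a locally built Babylon JDK image (adjust to your build)
BABYLON_JDK=$HOME/babylon/build/linux-x86_64-server-release/images/jdk
mvn process-test-classes exec:exec -Dexec.executable=$BABYLON_JDK/bin/java -Dexec.mainClass=oracle.code.onnx.mnist.MNISTDemo
```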

### ONNX GenAI running a large language model from Java source

Setup:
 - Download the [onnxruntime-genai](https://github.com/microsoft/onnxruntime-genai/releases) native library corresponding to your system/architecture, unzip it, and put it into the `cr-examples/onnx/lib` folder.
 - Download the `model.onnx.data`, `tokenizer.json` and `tokenizer_config.json` data files from [Llama-3.2-1B-Instruct-ONNX](https://huggingface.co/onnx-community/Llama-3.2-1B-Instruct-ONNX/tree/main/cpu_and_mobile/cpu-int4-rtn-block-32-acc-level-4) and put them into the `cr-examples/onnx/src/test/resources/oracle/code/onnx/llm` folder (see the sketch after this list).
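
The model and tokenizer files can also be fetched from the command line. The snippet below is a minimal sketch run from the `cr-examples/onnx` directory; it assumes the standard Hugging Face `resolve/main` download URL pattern, so verify the URLs against the model page. The onnxruntime-genai native library still has to be downloaded manually from the releases page, since its archive name depends on your platform and the release version:

```
# create the target folder for the model data files
mkdir -p src/test/resources/oracle/code/onnx/llm

# base URL of the int4 CPU variant of Llama-3.2-1B-Instruct-ONNX (assumed resolve/main pattern)
BASE=https://huggingface.co/onnx-community/Llama-3.2-1B-Instruct-ONNX/resolve/main/cpu_and_mobile/cpu-int4-rtn-block-32-acc-level-4

# download the model weights and tokenizer files into the expected location
curl -L -o src/test/resources/oracle/code/onnx/llm/model.onnx.data $BASE/model.onnx.data
curl -L -o src/test/resources/oracle/code/onnx/llm/tokenizer.json $BASE/tokenizer.json
curl -L -o src/test/resources/oracle/code/onnx/llm/tokenizer_config.json $BASE/tokenizer_config.json
```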

Running the Llama demo:
```
mvn process-test-classes exec:exec -Dexec.executable=<path to the Babylon JDK home>/bin/java -Dexec.mainClass=oracle.code.onnx.llm.LlamaDemo
```