## Maven-style project using code reflection with a Java-based ONNX programming model

### ONNX Runtime running a convolutional neural network from Java source

Running the MNIST demo:
```
JAVA_HOME=<path to the Babylon JDK home> \
mvn process-test-classes exec:java -Dexec.mainClass=oracle.code.onnx.mnist.MNISTDemo
```

### ONNX Runtime with CoreML running facial emotion recognition from Java source

Download the `.data` file from [emotion-ferplus-8.onnx.data](https://github.com/ammbra/fer-model-weights/raw/refs/heads/main/emotion-ferplus-8.onnx.data) and place it in the `cr-examples/onnx/src/test/resources/oracle/code/onnx/fer` folder.
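The download above can also be scripted; a minimal sketch, assuming `curl` is available and the command is run from the repository root:

```shell
# Target folder from the instructions above; created if missing.
DEST=cr-examples/onnx/src/test/resources/oracle/code/onnx/fer
mkdir -p "$DEST"
# Fetch the FER+ weights file from the URL given above.
curl -L --fail -o "$DEST/emotion-ferplus-8.onnx.data" \
  "https://github.com/ammbra/fer-model-weights/raw/refs/heads/main/emotion-ferplus-8.onnx.data"
```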

Running the FER demo:
```
JAVA_HOME=<path to the Babylon JDK home> \
mvn process-test-classes exec:java -Dexec.mainClass=oracle.code.onnx.fer.FERCoreMLDemo
```

#### How to (Re)Generate the CoreML Java Bindings

The following instructions are for macOS users only, as the CoreML Execution Provider (EP) requires an iOS device with iOS 13 or later, or a Mac with macOS 10.15 or later.

Build and install a custom ONNX Runtime with CoreML enabled:

```
git clone --recursive https://github.com/microsoft/onnxruntime.git
cd onnxruntime
./build.sh --config Release --build_shared_lib --use_coreml --parallel
# note this path; it is the path to the cloned onnxruntime passed to setup.sh below
pwd
```

Inside the `cr-examples/onnx/opgen` project you will find the `setup.sh` script, which takes the path to your cloned `onnxruntime` as an argument and uses `jextract` to regenerate the bindings.
Before running it, make sure `jextract` is on your system `$PATH`:

```shell
jextract --version
```
Provide the path to your cloned `onnxruntime`, and the script will regenerate the CoreML Java bindings inside the `oracle.code.onnx.foreign` package:

```
sh setup.sh path/to/cloned/onnxruntime
```

### ONNX GenAI running a large language model from Java source

Setup:
- Download the [onnxruntime-genai](https://github.com/microsoft/onnxruntime-genai/releases) native library corresponding to your system/architecture, unzip it, and put it into the `cr-examples/onnx/lib` folder.
- Download the `model.onnx.data`, `tokenizer.json` and `tokenizer_config.json` data files from [Llama-3.2-1B-Instruct-ONNX](https://huggingface.co/onnx-community/Llama-3.2-1B-Instruct-ONNX/tree/main/cpu_and_mobile/cpu-int4-rtn-block-32-acc-level-4) and put them into the `cr-examples/onnx/src/test/resources/oracle/code/onnx/llm` folder.
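The model downloads above can be scripted as well; a sketch assuming `curl` is available, the commands run from the repository root, and that Hugging Face serves raw files for the linked model page under the standard `resolve/main` URL scheme (an assumption; adjust if the repository layout differs):

```shell
# Target folder from the setup steps above.
DEST=cr-examples/onnx/src/test/resources/oracle/code/onnx/llm
mkdir -p "$DEST"
# Assumed raw-file base URL for the model page linked above.
BASE="https://huggingface.co/onnx-community/Llama-3.2-1B-Instruct-ONNX/resolve/main/cpu_and_mobile/cpu-int4-rtn-block-32-acc-level-4"
# Fetch each of the three files named in the setup steps.
for f in model.onnx.data tokenizer.json tokenizer_config.json; do
  curl -L --fail -o "$DEST/$f" "$BASE/$f"
done
```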

Running the Llama demo:
```
JAVA_HOME=<path to the Babylon JDK home> \
mvn process-test-classes exec:java -Dexec.mainClass=oracle.code.onnx.llm.LlamaDemo
```