@jaypeche, created November 2, 2025 20:07
ollama-bin-0.12.9 Gentoo compile log
These are the packages that would be merged, in order:
Calculating dependencies
* IMPORTANT: 1 news items need reading for repository 'gentoo'.
* Use eselect news read to view new items.
... done!
Dependency resolution took 3.51 s (backtrack: 0/20).
[ebuild R ~] sci-ml/ollama-bin-0.12.9::pingwho-overlay USE="cuda systemd -rocm" 0 KiB
Total: 1 package (1 reinstall), Size of downloads: 0 KiB
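For reference, a setup that produces the merge above looks roughly like the following. The overlay name, the ~amd64 keyword, and USE="cuda systemd -rocm" are taken from the log; the use of eselect-repository and the file names under /etc/portage are assumptions (the overlay can also be added by hand through a repos.conf entry if it is not listed in Gentoo's repositories.xml):
# eselect repository enable pingwho-overlay   # assumes the overlay is known to eselect-repository
# emaint sync -r pingwho-overlay
# echo "sci-ml/ollama-bin ~amd64" >> /etc/portage/package.accept_keywords/ollama-bin
# echo "sci-ml/ollama-bin cuda systemd -rocm" >> /etc/portage/package.use/ollama-bin
# emerge -av sci-ml/ollama-bin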
>>> Verifying ebuild manifests
>>> Running pre-merge checks for sci-ml/ollama-bin-0.12.9
>>> Emerging (1 of 1) sci-ml/ollama-bin-0.12.9::pingwho-overlay
* ollama-bin-amd64-0.12.9.tgz BLAKE2B SHA512 size ;-) ... [ ok ]
* Checking for at least 4 GiB disk space at "/var/tmp/notmpfs/portage/sci-ml/ollama-bin-0.12.9/temp" ... [ ok ]
>>> Unpacking source...
>>> Unpacking 'ollama-bin-amd64-0.12.9.tgz' to /var/tmp/notmpfs/portage/sci-ml/ollama-bin-0.12.9/work
>>> Source unpacked in /var/tmp/notmpfs/portage/sci-ml/ollama-bin-0.12.9/work
>>> Preparing source in /var/tmp/notmpfs/portage/sci-ml/ollama-bin-0.12.9/work ...
>>> Source prepared.
>>> Configuring source in /var/tmp/notmpfs/portage/sci-ml/ollama-bin-0.12.9/work ...
>>> Source configured.
>>> Compiling source in /var/tmp/notmpfs/portage/sci-ml/ollama-bin-0.12.9/work ...
>>> Source compiled.
>>> Test phase [not enabled]: sci-ml/ollama-bin-0.12.9
>>> Install sci-ml/ollama-bin-0.12.9 into /var/tmp/notmpfs/portage/sci-ml/ollama-bin-0.12.9/image
*
* INFO: Models and checksums saved into /opt/ollama-bin/.ollama are preserved...
*
>>> Completed installing sci-ml/ollama-bin-0.12.9 into /var/tmp/notmpfs/portage/sci-ml/ollama-bin-0.12.9/image
* Final size of build directory: 3175864 KiB (3.0 GiB)
* Final size of installed tree: 3175888 KiB (3.0 GiB)
strip: x86_64-pc-linux-gnu-strip --strip-unneeded -N __gentoo_check_ldflags__ -R .comment -R .GCC.command.line -R .note.gnu.gold-version
/opt/ollama-bin/lib/ollama/libggml-cpu-sandybridge.so
/opt/ollama-bin/lib/ollama/libggml-cpu-sse42.so
/opt/ollama-bin/lib/ollama/libggml-cpu-x64.so
/opt/ollama-bin/lib/ollama/libggml-base.so
/opt/ollama-bin/lib/ollama/libggml-cpu-haswell.so
/opt/ollama-bin/lib/ollama/libggml-cpu-icelake.so
/opt/ollama-bin/lib/ollama/libggml-cpu-skylakex.so
/opt/ollama-bin/lib/ollama/libggml-cpu-alderlake.so
/opt/ollama-bin/lib/ollama/cuda_v13/libcudart.so.13.0.96
/opt/ollama-bin/lib/ollama/cuda_v13/libcublasLt.so.13.1.0.3
/opt/ollama-bin/lib/ollama/cuda_v13/libggml-cuda.so
/opt/ollama-bin/lib/ollama/cuda_v13/libcublas.so.13.1.0.3
/opt/ollama-bin/lib/ollama/cuda_v12/libcublasLt.so.12.8.4.1
/opt/ollama-bin/lib/ollama/cuda_v12/libcublas.so.12.8.4.1
/opt/ollama-bin/lib/ollama/cuda_v12/libggml-cuda.so
/opt/ollama-bin/lib/ollama/cuda_v12/libcudart.so.12.8.90
/opt/ollama-bin/bin/ollama
>>> Installing (1 of 1) sci-ml/ollama-bin-0.12.9::pingwho-overlay
* checking 25 files for package collisions
>>> Merging sci-ml/ollama-bin-0.12.9 to /
--- /var/
--- /var/log/
--- /var/log/ollama/
=== /var/log/ollama/.keep_sci-ml_ollama-bin-0
--- /opt/
--- /opt/ollama-bin/
--- /opt/ollama-bin/bin/
=== /opt/ollama-bin/bin/ollama
--- /opt/ollama-bin/lib/
--- /opt/ollama-bin/lib/ollama/
=== /opt/ollama-bin/lib/ollama/libggml-cpu-alderlake.so
=== /opt/ollama-bin/lib/ollama/libggml-cpu-skylakex.so
=== /opt/ollama-bin/lib/ollama/libggml-cpu-icelake.so
=== /opt/ollama-bin/lib/ollama/libggml-cpu-haswell.so
=== /opt/ollama-bin/lib/ollama/libggml-base.so
=== /opt/ollama-bin/lib/ollama/libggml-cpu-sandybridge.so
=== /opt/ollama-bin/lib/ollama/libggml-cpu-x64.so
=== /opt/ollama-bin/lib/ollama/libggml-cpu-sse42.so
--- /opt/ollama-bin/lib/ollama/cuda_v12/
=== /opt/ollama-bin/lib/ollama/cuda_v12/libcudart.so.12.8.90
>>> /opt/ollama-bin/lib/ollama/cuda_v12/libcudart.so.12 -> libcudart.so.12.8.90
=== /opt/ollama-bin/lib/ollama/cuda_v12/libggml-cuda.so
>>> /opt/ollama-bin/lib/ollama/cuda_v12/libcublasLt.so.12 -> libcublasLt.so.12.8.4.1
=== /opt/ollama-bin/lib/ollama/cuda_v12/libcublasLt.so.12.8.4.1
>>> /opt/ollama-bin/lib/ollama/cuda_v12/libcublas.so.12 -> libcublas.so.12.8.4.1
=== /opt/ollama-bin/lib/ollama/cuda_v12/libcublas.so.12.8.4.1
--- /opt/ollama-bin/lib/ollama/cuda_v13/
=== /opt/ollama-bin/lib/ollama/cuda_v13/libcublas.so.13.1.0.3
>>> /opt/ollama-bin/lib/ollama/cuda_v13/libcudart.so.13 -> libcudart.so.13.0.96
=== /opt/ollama-bin/lib/ollama/cuda_v13/libggml-cuda.so
>>> /opt/ollama-bin/lib/ollama/cuda_v13/libcublasLt.so.13 -> libcublasLt.so.13.1.0.3
=== /opt/ollama-bin/lib/ollama/cuda_v13/libcudart.so.13.0.96
>>> /opt/ollama-bin/lib/ollama/cuda_v13/libcublas.so.13 -> libcublas.so.13.1.0.3
=== /opt/ollama-bin/lib/ollama/cuda_v13/libcublasLt.so.13.1.0.3
--- /usr/
--- /usr/bin/
>>> /usr/bin/ollama -> ../../opt/ollama-bin/bin/ollama
--- /usr/lib/
--- /usr/lib/systemd/
--- /usr/lib/systemd/system/
=== /usr/lib/systemd/system/ollama.service
>>> Safely unmerging already-installed instance...
--- replaced obj /var/log/ollama/.keep_sci-ml_ollama-bin-0
--- replaced dir /var/log/ollama
--- replaced dir /var/log
--- replaced dir /var
--- replaced obj /usr/lib/systemd/system/ollama.service
--- replaced dir /usr/lib/systemd/system
--- replaced dir /usr/lib/systemd
--- replaced dir /usr/lib
--- replaced sym /usr/bin/ollama
--- replaced dir /usr/bin
--- replaced dir /usr
--- replaced obj /opt/ollama-bin/lib/ollama/libggml-cpu-x64.so
--- replaced obj /opt/ollama-bin/lib/ollama/libggml-cpu-sse42.so
--- replaced obj /opt/ollama-bin/lib/ollama/libggml-cpu-skylakex.so
--- replaced obj /opt/ollama-bin/lib/ollama/libggml-cpu-sandybridge.so
--- replaced obj /opt/ollama-bin/lib/ollama/libggml-cpu-icelake.so
--- replaced obj /opt/ollama-bin/lib/ollama/libggml-cpu-haswell.so
--- replaced obj /opt/ollama-bin/lib/ollama/libggml-cpu-alderlake.so
--- replaced obj /opt/ollama-bin/lib/ollama/libggml-base.so
--- replaced obj /opt/ollama-bin/lib/ollama/cuda_v13/libggml-cuda.so
--- replaced obj /opt/ollama-bin/lib/ollama/cuda_v13/libcudart.so.13.0.96
--- replaced sym /opt/ollama-bin/lib/ollama/cuda_v13/libcudart.so.13
--- replaced obj /opt/ollama-bin/lib/ollama/cuda_v13/libcublasLt.so.13.1.0.3
--- replaced sym /opt/ollama-bin/lib/ollama/cuda_v13/libcublasLt.so.13
--- replaced obj /opt/ollama-bin/lib/ollama/cuda_v13/libcublas.so.13.1.0.3
--- replaced sym /opt/ollama-bin/lib/ollama/cuda_v13/libcublas.so.13
--- replaced dir /opt/ollama-bin/lib/ollama/cuda_v13
--- replaced obj /opt/ollama-bin/lib/ollama/cuda_v12/libggml-cuda.so
--- replaced obj /opt/ollama-bin/lib/ollama/cuda_v12/libcudart.so.12.8.90
--- replaced sym /opt/ollama-bin/lib/ollama/cuda_v12/libcudart.so.12
--- replaced obj /opt/ollama-bin/lib/ollama/cuda_v12/libcublasLt.so.12.8.4.1
--- replaced sym /opt/ollama-bin/lib/ollama/cuda_v12/libcublasLt.so.12
--- replaced obj /opt/ollama-bin/lib/ollama/cuda_v12/libcublas.so.12.8.4.1
--- replaced sym /opt/ollama-bin/lib/ollama/cuda_v12/libcublas.so.12
--- replaced dir /opt/ollama-bin/lib/ollama/cuda_v12
--- replaced dir /opt/ollama-bin/lib/ollama
--- replaced dir /opt/ollama-bin/lib
--- replaced obj /opt/ollama-bin/bin/ollama
--- replaced dir /opt/ollama-bin/bin
--- replaced dir /opt/ollama-bin
--- replaced dir /opt
>>> Original instance of package unmerged safely.
*
* Quick guide:
*
* Please add your_user to the ollama group:
* # usermod -a -G ollama your_user
*
* # ollama serve (standalone, systemd, openrc)
* $ ollama run llama3:3b (client)
*
* Browse available models at: https://ollama.com/library/
*
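Since USE="systemd" installs /usr/lib/systemd/system/ollama.service (merged above), the systemd variant of the quick guide amounts to roughly the following sketch (not ebuild output; the model tag is the one from the guide):
# systemctl daemon-reload
# systemctl enable --now ollama.service
$ ollama run llama3:3b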
>>> sci-ml/ollama-bin-0.12.9 merged.
>>> Completed (1 of 1) sci-ml/ollama-bin-0.12.9::pingwho-overlay
* Messages for package sci-ml/ollama-bin-0.12.9:
* INFO: Models and checksums saved into /opt/ollama-bin/.ollama are preserved...
>>> Auto-cleaning packages...
>>> No outdated packages were found on your system.
* GNU info directory index is up-to-date.
* IMPORTANT: 1 news items need reading for repository 'gentoo'.
* Use eselect news read to view new items.
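Beyond reading the pending news item, a rough post-install check (a sketch; the exact server log wording varies by ollama version) is to confirm the binary runs and that one of the bundled CUDA backends under /opt/ollama-bin/lib/ollama gets loaded, for example by grepping the service journal and watching nvidia-smi while a model is loaded:
# eselect news read
$ ollama --version
# journalctl -u ollama -b | grep -iE 'cuda|gpu'
$ nvidia-smi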