
Project Description

The goal is to have a language model that is able to answer technical questions about Uyuni. The Uyuni documentation is too large for in-context processing, so fine-tuning is the way to go.

Goal for this Hackweek

Finetune a model based on llama-2-7b.
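
One possible setup, sketched below, is LoRA fine-tuning on top of llama-2-7b with Hugging Face transformers, peft and datasets; the dataset file, hyperparameters and the LoRA approach itself are illustrative assumptions rather than project decisions.

```python
# Minimal LoRA fine-tuning sketch (assumptions: transformers, peft and datasets are installed,
# access to the gated meta-llama/Llama-2-7b-hf model has been granted, a bfloat16-capable GPU
# is available, and the Uyuni docs were converted to "uyuni_docs.jsonl" with a "text" field).
import torch
from datasets import load_dataset
from peft import LoraConfig, get_peft_model
from transformers import (AutoModelForCausalLM, AutoTokenizer,
                          DataCollatorForLanguageModeling, Trainer, TrainingArguments)

base = "meta-llama/Llama-2-7b-hf"
tok = AutoTokenizer.from_pretrained(base)
tok.pad_token = tok.eos_token                      # llama-2 ships without a pad token

model = AutoModelForCausalLM.from_pretrained(base, torch_dtype=torch.bfloat16, device_map="auto")
model = get_peft_model(model, LoraConfig(r=16, lora_alpha=32, lora_dropout=0.05,
                                         task_type="CAUSAL_LM"))

ds = load_dataset("json", data_files="uyuni_docs.jsonl", split="train")   # hypothetical file
ds = ds.map(lambda s: tok(s["text"], truncation=True, max_length=1024),
            remove_columns=ds.column_names)

trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="llama2-uyuni-lora", per_device_train_batch_size=1,
                           gradient_accumulation_steps=8, num_train_epochs=1,
                           learning_rate=2e-4, bf16=True),
    train_dataset=ds,
    data_collator=DataCollatorForLanguageModeling(tok, mlm=False),  # causal-LM labels
)
trainer.train()
model.save_pretrained("llama2-uyuni-lora")         # stores only the small adapter weights
```

The resulting adapter could then be loaded on top of (or merged into) the base model for answering Uyuni questions.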

Resources

github repo

Looking for hackers with the skills:

ai uyuni

This project is part of:

Hack Week 23

Activity

  • over 1 year ago: nadvornik added keyword "ai" to this project.
  • over 1 year ago: nadvornik added keyword "uyuni" to this project.
  • over 1 year ago: nadvornik originated this project.


    Similar Projects

    Use AI tools to convert legacy perl scripts to bash by nadvornik

    Description

    Use AI tools to convert legacy perl scripts to bash

    Goals

    The Uyuni project contains legacy Perl scripts used for setup. The Perl dependency could be removed to reduce the container size. The goal of this project is to research the use of AI tools for this task.

    Resources

    Aider

    Results:

    Aider is not the right tool for this. It works OK for small changes, but not for a complete rewrite from one language to another.

    I got better results with direct API use from a script.
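
    The write-up does not say which API was used; the snippet below is only a sketch of the "direct API use from a script" idea, assuming an OpenAI-compatible chat-completions endpoint. The model name is a placeholder.

    ```python
    # Sketch of driving a chat-completion API directly from a script to turn a Perl script
    # into Bash. Assumes an OpenAI-compatible endpoint reachable via the `openai` client;
    # the model name is a placeholder.
    import sys

    from openai import OpenAI

    client = OpenAI()  # reads OPENAI_API_KEY (and optionally OPENAI_BASE_URL) from the environment
    perl_source = open(sys.argv[1], encoding="utf-8").read()

    response = client.chat.completions.create(
        model="gpt-4o-mini",  # placeholder
        messages=[
            {"role": "system",
             "content": "Rewrite the following Perl script as an equivalent Bash script. "
                        "Return only the Bash code."},
            {"role": "user", "content": perl_source},
        ],
    )
    print(response.choices[0].message.content)
    ```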


    Learn how to integrate Elixir and Phoenix Liveview with LLMs by ninopaparo

    Description

    Learn how to integrate Elixir and Phoenix Liveview with LLMs by building an application that can provide answers to user queries based on a corpus of custom-trained data.

    Goals

    Develop an Elixir application via the Phoenix framework that:

    • Employs Retrieval Augmented Generation (RAG) techniques
    • Supports the integration and utilization of various Large Language Models (LLMs).
    • Is designed with extensibility and adaptability in mind to accommodate future enhancements and modifications.

    Resources

    • https://elixir-lang.org/
    • https://www.phoenixframework.org/
    • https://github.com/elixir-nx/bumblebee
    • https://ollama.com/


    ghostwrAIter - a local AI assisted tool for helping with support cases by paolodepa

    Description

    This project is meant to fight the loneliness of the support team members by providing them with an AI assistant (hopefully) capable of scraping supportconfigs in a RAG fashion and trying to answer specific questions.

    Goals

    • Set up an Ollama backend, spinning up one (or more?) code-focused LLMs, selected by license, performance and quality of the results, between:
    • Set up a Web UI for it, choosing an easily extensible and customizable option between:
    • Extend the solution in order to be able to:
      • Add ZIU/Concord shared folders to its RAG context
      • Add BZ cases, split into comments, to its RAG context
        • A plus would be to log in to ghostwrAIter itself using the IDP portal and use the same credentials to query BZ
      • Add specific packages, picking them from IBS repos
        • A plus would be to log in to ghostwrAIter itself using the IDP portal and use the same credentials to query IBS
        • A plus would be to infer the packages of interest and the right channel and version to pick from the added BZ cases
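
    As a rough illustration of the RAG idea over a supportconfig, the sketch below assumes a local Ollama server and its documented REST endpoints (/api/embeddings, /api/generate); the chunking strategy, model names and paths are made up for illustration and are not part of the project plan.

    ```python
    # Rough RAG sketch over a supportconfig directory, assuming a local Ollama server
    # (http://localhost:11434). Chunk size, models and paths are illustrative only.
    import glob

    import requests

    OLLAMA = "http://localhost:11434"

    def embed(text):
        r = requests.post(f"{OLLAMA}/api/embeddings",
                          json={"model": "nomic-embed-text", "prompt": text})
        return r.json()["embedding"]

    def cosine(a, b):
        dot = sum(x * y for x, y in zip(a, b))
        return dot / ((sum(x * x for x in a) ** 0.5) * (sum(y * y for y in b) ** 0.5))

    # 1. Index: split every supportconfig text file into fixed-size chunks and embed them.
    chunks = []
    for path in glob.glob("supportconfig/*.txt"):
        text = open(path, errors="ignore").read()
        for i in range(0, len(text), 2000):
            piece = text[i:i + 2000]
            chunks.append((piece, embed(piece)))

    # 2. Retrieve: pick the chunks most similar to the question and stuff them into the prompt.
    question = "Why does the salt minion fail to register?"
    q_emb = embed(question)
    top = sorted(chunks, key=lambda c: cosine(q_emb, c[1]), reverse=True)[:3]
    context = "\n---\n".join(c[0] for c in top)

    answer = requests.post(f"{OLLAMA}/api/generate",
                           json={"model": "llama3", "stream": False,
                                 "prompt": f"Context:\n{context}\n\nQuestion: {question}"})
    print(answer.json()["response"])
    ```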


    Automated Test Report reviewer by oscar-barrios

    Description

    In the SUMA/Uyuni team we spend a lot of time reviewing test reports: analyzing each failing test case, checking whether a test is flaky, checking logs, etc.

    Goals

    Speed up the review by automating parts of it with AI, so that the reviewer can consume a meaningful summary of the report.

    Resources

    No idea about the resources yet, but we will make use of:

    • HTML/JSON Report (text + screenshots)
    • The Test Suite Status GitHub board (via API)
    • The environment tested (via SSH)
    • The test framework code (via files)
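
    Since the resources are still open, the following is only a hypothetical sketch of the summarization step: collect the failing cases from a JSON report and ask a local LLM (here via Ollama's REST API) for a reviewer-oriented summary. The report fields are invented for illustration.

    ```python
    # Hypothetical sketch: read failing test cases from a JSON report and ask a local Ollama
    # model for a reviewer-oriented summary. The report structure ("results", "name",
    # "status", "error") is invented for illustration.
    import json

    import requests

    report = json.load(open("test_report.json"))
    failures = [t for t in report["results"] if t["status"] == "failed"]

    prompt = ("Summarize the following test failures for a reviewer. Group similar failures "
              "and flag likely flaky tests:\n\n" +
              "\n".join(f"- {t['name']}: {t['error']}" for t in failures))

    r = requests.post("http://localhost:11434/api/generate",
                      json={"model": "llama3", "prompt": prompt, "stream": False})
    print(r.json()["response"])
    ```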


    AI for product management by a_jaeger

    Description

    Learn about AI and how it can help me.

    What are the jobs that a PM does where AI can help - and how?

    Goals

    • Investigate how AI can help with different tasks
    • Check out different AI tools and which one is best for which job
    • Summarize learning

    Resources

    • Reading some blog posts by PMs that looked into it
    • Popular and less popular AI tools

    Work is done SUSE-internally at https://confluence.suse.com/display/~a_jaeger/Hackweek+25+-+AI+for+a+PM and subpages.


    Enable the containerized Uyuni server to run on different host OS by j_renner

    Description

    The Uyuni server is provided as a container, but we still require it to run on Leap Micro. This is not how people expect to use containerized applications, so it would be great if we tested other host OSs and enabled them by providing builds of the necessary tools for them (e.g. mgradm). Interesting candidates could be:

    • openSUSE Leap
    • CentOS 7
    • Ubuntu
    • ???

    Goals

    Make it really easy for anyone to run the Uyuni containerized server on whatever OS they want (with support for containers of course).


    Create SUSE Manager users from ldap/ad groups by mbrookhuis

    Description

    This tool creates users in SUSE Manager Server based on LDAP/AD groups. For each LDAP/AD group a role within SUSE Manager Server is defined. The tool will also check whether existing users still have the role they should have and correct it if not. Similarly, if a user has been disabled, it will be enabled again. If a user is no longer present in the LDAP/AD groups, it will be disabled or deleted, depending on the configuration.

    The code is written for Python 3.6 (the default with SLES 15.x), but will also work with newer versions. It works against SUSE Manager 4.3 and 5.x.

    Goals

    Create a Python and/or Golang utility that manages users in SUSE Manager based on LDAP/AD group membership. A configuration file defines which roles the members of a group will get.
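
    The repository linked below contains the actual tool; the snippet here is only a sketch of the core loop, assuming the ldap3 module and the SUSE Manager XML-RPC API (auth.login, user.listUsers, user.create, user.addRole). The configuration layout and LDAP attributes are illustrative.

    ```python
    # Sketch of the core loop: read a group-to-role mapping from configsm.yaml, fetch group
    # members via ldap3 and create missing users through the SUSE Manager XML-RPC API.
    # The YAML layout and LDAP attributes are illustrative; see the repository for the real tool.
    import xmlrpc.client

    import yaml
    from ldap3 import Connection, Server

    cfg = yaml.safe_load(open("configsm.yaml"))

    # LDAP connection used to look up group members
    ldap = Connection(Server(cfg["ldap"]["server"]),
                      user=cfg["ldap"]["bind_dn"], password=cfg["ldap"]["password"],
                      auto_bind=True)

    # SUSE Manager XML-RPC API session
    api = xmlrpc.client.ServerProxy(f"https://{cfg['suman']['server']}/rpc/api")
    key = api.auth.login(cfg["suman"]["user"], cfg["suman"]["password"])
    existing = {u["login"] for u in api.user.listUsers(key)}

    # e.g. {"cn=sm-admins,ou=groups,dc=example,dc=com": "org_admin"}
    for group_dn, role in cfg["groups"].items():
        ldap.search(group_dn, "(objectClass=*)", attributes=["member"])
        for member_dn in ldap.entries[0].member:
            login = member_dn.split(",")[0].split("=")[1]   # naive CN extraction, sketch only
            if login not in existing:
                api.user.create(key, login, "", login, login, f"{login}@example.com", 1)  # PAM auth
                api.user.addRole(key, login, role)

    api.auth.logout(key)
    ```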


    Installation

    To install this project, perform the following steps:

    • Be sure that Python 3.6 is installed, along with the python3-PyYAML module. The ldap3 module is also needed:

    ```bash
    zypper in python3 python3-PyYAML
    pip install yaml
    ```

    • On the server or PC where it should run, create a directory. On Linux, e.g. /opt/sm-ldap-users

    • Copy all the files to this directory.

    • Edit configsm.yaml. All parameters should be entered. Tip: for the LDAP information, it is best to use the same settings as for SSSD.

    • Be sure that the file sm-ldap-users.py is executable. It would be good to change the owner to root:root so that only root can read and execute it:

    ```bash
    chmod 600 *
    chmod 700 sm-ldap-users.py
    chown root:root *
    ```

    Usage

    This is very simple. Once configsm.yaml contains the correct information, executing the following will do the magic:

    ```bash
    ./sm-ldap-users.py
    ```

    repository link

    https://github.com/mbrookhuis/sm-ldap-users


    Saltboot ability to deploy OEM images by oholecek

    Description

    Saltboot is the system deployment part of Uyuni. It is the mechanism behind deploying Kiwi-built system images from a central Uyuni server location.

    A system image contains only a single partition rather than a whole disk, so the deployment system has to take care of partitioning and fstab setup on top of integrity validation.

    However, systems like Aeon, SUSE Linux Enterprise Micro and similar are distributed as disk images (so-called OEM images). Saltboot currently cannot deploy these systems.

    The main problem, however, is that Saltboot support is currently built into the image itself. This is not desired when using OEM images.

    Goals

    Saltboot needs to become standalone and be able to deploy OEM images. Its responsibility would then shrink to selecting the correct image, validating image integrity, deploying it and booting into the deployed system.
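
    As a purely hypothetical sketch of that reduced responsibility (not Saltboot internals), deploying a whole-disk OEM image boils down to fetching the image, verifying its integrity and writing it to the target disk:

    ```python
    # Hypothetical sketch only: fetch an OEM (whole-disk) image, verify its checksum and copy it
    # to the target disk. The URL, the checksum sidecar file and the device name are placeholders.
    import hashlib

    import requests

    IMAGE_URL = "https://uyuni.example.com/os-images/oem-image.raw"   # placeholder
    TARGET_DISK = "/dev/sda"                                          # placeholder

    # Assume the expected digest is published next to the image (hypothetical convention).
    expected = requests.get(IMAGE_URL + ".sha256").text.split()[0]

    digest = hashlib.sha256()
    with requests.get(IMAGE_URL, stream=True) as resp, open("/tmp/image.raw", "wb") as out:
        for block in resp.iter_content(chunk_size=1 << 20):
            digest.update(block)
            out.write(block)

    if digest.hexdigest() != expected:
        raise SystemExit("image integrity check failed")

    # Whole-disk deployment is then a raw copy to the disk, followed by a reboot into the
    # deployed system (partitioning and fstab handling are no longer needed).
    with open("/tmp/image.raw", "rb") as src, open(TARGET_DISK, "wb") as dst:
        while chunk := src.read(1 << 20):
            dst.write(chunk)
    ```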

    Resources

    • Saltboot - https://github.com/uyuni-project/retail/tree/master
    • Uyuni - https://github.com/uyuni-project/uyuni


    Run local LLMs with Ollama and explore possible integrations with Uyuni by PSuarezHernandez

    Description

    Using Ollama you can easily run different LLM models on your local computer. This project is about exploring Ollama, testing different LLMs and trying to fine-tune them, as well as exploring potential ways of integrating them with Uyuni.

    Goals

    • Explore Ollama
    • Test different models
    • Fine tuning
    • Explore possible integration in Uyuni
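
    A minimal sketch of the kind of exploration described above, talking to a local Ollama server over its REST API; the model names are just examples of models that can be pulled with Ollama.

    ```python
    # Minimal sketch for exploring a local Ollama server over its REST API:
    # list the locally available models, then send a prompt to one of them.
    import requests

    OLLAMA = "http://localhost:11434"

    # GET /api/tags returns the models pulled locally (e.g. via `ollama pull llama3`)
    models = requests.get(f"{OLLAMA}/api/tags").json()["models"]
    print("local models:", [m["name"] for m in models])

    # POST /api/generate runs a single prompt; stream=False returns one JSON object
    reply = requests.post(f"{OLLAMA}/api/generate",
                          json={"model": "llama3", "stream": False,
                                "prompt": "Explain what Uyuni is in two sentences."})
    print(reply.json()["response"])
    ```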

    Resources

    • https://ollama.com/
    • https://huggingface.co/
    • https://apeatling.com/articles/part-2-building-your-training-data-for-fine-tuning/


    Improve Development Environment on Uyuni by mbussolotto

    Description

    Currently, creating a dev environment for Uyuni can be complicated. The steps are:

    • add the correct repo
    • download packages
    • configure your IDE (checkstyle, format rules, sonarlint....)
    • setup debug environment
    • ...

    The current documentation can be improved: some information is hard to find, and some is missing entirely.

    A dev container might solve this.

    Goals

    Uyuni development in no time:

    • using VSCode:
      • settings.json should contain all settings (for all languages in Uyuni, with all checkstyle rules etc...)
      • the dev container should contain all dependencies
      • set up the debug environment
    • implement a GitHub Workspace solution
    • rewrite the documentation

    Lots of pieces are already implemented: we need to connect them into a consistent solution.

    Resources

    • https://github.com/uyuni-project/uyuni/wiki