The quantified self idea refers to “self-knowledge through numbers” and has the goal of improving physical and mental fitness through a better understanding of how environment and habits influence quality of life.

This project is about finding a good data model to represent self-quantified metrics (i.e. what is relevant to collect) and also about exploring integration with external wearable devices (e.g. Fitbit, Garmin).
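
As a rough illustration of the wearable integration, here is a minimal Python sketch that pulls a daily step-count series from the Fitbit Web API. The endpoint path and the "activities-steps" response key are assumptions based on Fitbit's public documentation and would need verifying, and the OAuth2 token handling is left out entirely.

```python
import requests

# Hypothetical sketch: pull a daily step-count time series from the Fitbit Web API.
# The endpoint and the "activities-steps" response key are assumptions based on the
# public docs; obtaining the OAuth2 access token is out of scope here.
ACCESS_TOKEN = "..."  # placeholder: obtained via Fitbit's OAuth2 flow

def fetch_daily_steps(period="7d"):
    url = f"https://api.fitbit.com/1/user/-/activities/steps/date/today/{period}.json"
    resp = requests.get(url, headers={"Authorization": f"Bearer {ACCESS_TOKEN}"}, timeout=10)
    resp.raise_for_status()
    # Expected shape (assumption): {"activities-steps": [{"dateTime": "...", "value": "..."}]}
    return [
        {"day": entry["dateTime"], "steps": int(entry["value"])}
        for entry in resp.json().get("activities-steps", [])
    ]

if __name__ == "__main__":
    for row in fetch_daily_steps():
        print(row)
```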

Initial tech stack could include a PostgreSQL database and Grafana as the UI.
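
A minimal sketch of what the PostgreSQL side could look like, assuming a narrow measurements table that Grafana's PostgreSQL data source can query directly (table and column names are only illustrative, not a decided data model):

```python
import psycopg2

# Illustrative schema only: one row per metric definition, one row per measurement.
# A narrow table like this keeps adding new metric types cheap and maps directly
# onto Grafana time-series queries (a time column plus a value column).
SCHEMA = """
CREATE TABLE IF NOT EXISTS metric (
    id   SERIAL PRIMARY KEY,
    name TEXT NOT NULL UNIQUE,   -- e.g. 'steps', 'sleep_minutes', 'resting_hr'
    unit TEXT NOT NULL           -- e.g. 'count', 'min', 'bpm'
);

CREATE TABLE IF NOT EXISTS measurement (
    metric_id   INTEGER NOT NULL REFERENCES metric(id),
    measured_at TIMESTAMPTZ NOT NULL,
    value       DOUBLE PRECISION NOT NULL,
    source      TEXT,            -- e.g. 'fitbit', 'garmin', 'manual'
    PRIMARY KEY (metric_id, measured_at)
);
"""

def init_db(dsn="dbname=quantified_self"):  # DSN is a placeholder
    with psycopg2.connect(dsn) as conn, conn.cursor() as cur:
        cur.execute(SCHEMA)

if __name__ == "__main__":
    init_db()
```

A Grafana panel then only needs a query returning a time column and a value column, e.g. joining measurement with metric and filtering on the metric name.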

Looking for hackers with the skills:

monitoring, grafana

This project is part of:

Hack Week 19

Activity

  • almost 5 years ago: MMoese liked this project.
  • almost 5 years ago: mlin7442 liked this project.
  • almost 5 years ago: jcavalheiro added keyword "monitoring" to this project.
  • almost 5 years ago: jcavalheiro added keyword "grafana" to this project.
  • almost 5 years ago: jcavalheiro originated this project.


    Similar Projects

    Saline (state deployment control and monitoring tool for SUSE Manager/Uyuni) by vizhestkov

    Project Description

    Saline is an addition to Salt used in SUSE Manager/Uyuni, aimed at providing better control of and visibility into state deployment in large-scale environments.

    In its current state the published version can be used only as a Prometheus exporter and is missing some of the key features implemented in the PoC (not published). It can currently provide metrics related to Salt events and the state apply process on the minions, but no control over that process is implemented yet.
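
    For readers unfamiliar with the exporter pattern, below is a generic prometheus_client sketch in the same spirit as the metrics described above. It is not Saline's actual code, and a stand-in generator takes the place of the real Salt event bus.

```python
import random
import time

from prometheus_client import Counter, start_http_server

# Illustrative exporter only, not Saline itself: counts "events" and state.apply
# results and serves them for Prometheus to scrape. A random generator stands in
# for the real Salt event bus.
SALT_EVENTS = Counter("salt_events_total", "Salt events seen on the event bus", ["tag"])
STATE_APPLIES = Counter("salt_state_apply_total", "state.apply results reported by minions", ["result"])

def fake_event_source():
    """Placeholder for the Salt event bus; yields (tag, success) pairs forever."""
    while True:
        yield random.choice(["salt/job/new", "salt/job/ret"]), random.random() > 0.1
        time.sleep(1)

if __name__ == "__main__":
    start_http_server(9100)  # metrics become available at http://localhost:9100/metrics
    for tag, ok in fake_event_source():
        SALT_EVENTS.labels(tag=tag).inc()
        if tag == "salt/job/ret":
            STATE_APPLIES.labels(result="success" if ok else "failure").inc()
```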

    Continue implementing the missing features and improve the existing implementation:

    • authentication (need to decide whether and how it should be related to Salt auth)

    • web service providing control of state deployment

    Goal for this Hackweek

    • Implement the missing key features

    • Implement the tool for state deployment control with a CLI

    Resources

    https://github.com/openSUSE/saline


    Update my own python audio and video time-lapse and motion capture apps and publish by dmair

    Project Description

    Many years ago, in my own time, I wrote a Qt Python application to periodically capture frames from a V4L2 video device (e.g. a webcam) and used it to create daily weather timelapse videos from windows at my home. I have maintained it at home in my own time, and this year I added motion detection, making it a functional (though unguaranteed) video security tool. I also wrote, in my own time, a Linux audio monitoring app in Python using Qt that captures live signal strength along with a 24-hour history of audio signal level/range and audio spectrum, and I recently added background noise filtering to it. In due course I aim to include voice detection, currently assumed to be via Google's public audio interface.

    Neither of these is a professional home security app, but between them they let a user freely monitor video and audio data from a home in a manageable way. Both projects are on GitHub but are out of date compared to my personal work, so I would like to organize and update the GitHub versions of these projects.

    Goal for this Hackweek

    It would probably help to migrate all the v4l2py-based video code to linuxpy.video-based code, which looks like a rewrite of large areas of the video code. It would also be good to remove a lot of Python lint that is several years old, the main goal being to push the recent changes to GitHub with better organized code. If there is enough time, I'd like to take the inline Qt QSettings persistent-state code used per app and write a Python class that encapsulates QSettings behind a value_of(name) / name=value interface for shared use across projects, so that persistent state can be read or written anywhere within the apps through a simple interface; a rough sketch is shown below.
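
    A rough sketch of that wrapper idea (class, organization, and application names are placeholders; PyQt's QSettings works the same way as PySide6's):

```python
from PySide6.QtCore import QSettings  # PyQt's QSettings has the same interface

# Rough sketch of the shared persistent-state helper described above: it hides
# QSettings behind a tiny value_of(name) / set(name, value) interface so any part
# of an app can read or write settings without touching QSettings directly.
class AppState:
    def __init__(self, org="example-org", app="example-app"):  # placeholder names
        self._settings = QSettings(org, app)

    def value_of(self, name, default=None):
        """Read a persisted value, falling back to the given default."""
        return self._settings.value(name, default)

    def set(self, name, value):
        """Persist a value and flush it so other components see it promptly."""
        self._settings.setValue(name, value)
        self._settings.sync()

# Usage:
#   state = AppState()
#   state.set("capture/interval_s", 30)
#   interval = int(state.value_of("capture/interval_s", 30))
```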

    Resources

    I'm not specifically looking for help but welcome other input.