Current state
I'm currently setting up a data science/ML environment for my research team. I use a Synology NAS as a reverse proxy that forwards requests to our workstations. These host several JupyterLab instances for our projects (the project files live on the NAS), so we can distribute our workload across several machines while working on the same data source. This lets everyone work on the same projects while choosing the machine that executes the code via a subdomain (routed through the reverse proxy).
Issue
My current issue is that my authentication options are quite limited: each lab is secured only by a single password (plus SSL via Let's Encrypt certificates).
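For context, each lab today is protected roughly like this: a minimal sketch of a per-server Jupyter config, assuming a Jupyter Server 1.x-style setup and a placeholder password hash (a real one comes from `jupyter server password`).

```python
# jupyter_server_config.py -- sketch of how each lab is secured today:
# one hashed password per server, shared by everyone who uses that lab.
c = get_config()  # injected by Jupyter when it loads this file

c.ServerApp.ip = "0.0.0.0"  # listen on all interfaces so the NAS proxy can reach it
c.ServerApp.port = 8888
# Placeholder hash; generate a real one with `jupyter server password`.
c.ServerApp.password = "argon2:$argon2id$..."
```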
My solution idea
I'm currently investigating JupyterHub and its support for OAuth logins. I also found that it can use systemd to spawn notebook/lab servers.
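To make that concrete, here is a minimal sketch of what I have in mind, assuming the `oauthenticator` and `jupyterhub-systemdspawner` packages are installed; the GitHub provider, callback URL, and credentials below are placeholders:

```python
# jupyterhub_config.py -- minimal sketch: OAuth login + systemd-spawned Labs.
from oauthenticator.github import GitHubOAuthenticator

c = get_config()  # injected by JupyterHub when it loads this file

# Central OAuth login (GitHub here; any OAuthenticator provider works alike).
c.JupyterHub.authenticator_class = GitHubOAuthenticator
c.GitHubOAuthenticator.oauth_callback_url = "https://hub.example.com/hub/oauth_callback"
c.GitHubOAuthenticator.client_id = "<oauth-app-client-id>"
c.GitHubOAuthenticator.client_secret = "<oauth-app-client-secret>"

# Spawn each user's Lab as a transient systemd unit on the Hub's own host.
c.JupyterHub.spawner_class = "systemdspawner.SystemdSpawner"
c.SystemdSpawner.mem_limit = "4G"  # optional per-user resource cap
c.Spawner.default_url = "/lab"     # open JupyterLab instead of the classic UI
```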
Can anyone tell me whether it's possible to integrate labs running on different machines into a Hub that runs on the NAS (in a Docker container)? Or suggest an alternative that could do that?
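For the multi-machine part, I imagine something like a remote spawner. A rough sketch, assuming a third-party SSH spawner such as NERSC's `sshspawner` (the `remote_hosts`/`ssh_keyfile` options are taken from that project; hostnames and paths are placeholders):

```python
# jupyterhub_config.py -- sketch: Hub on the NAS, Labs spawned over SSH
# onto the workstations (requires key-based SSH from the Hub container).
c = get_config()

c.JupyterHub.spawner_class = "sshspawner.SSHSpawner"
c.SSHSpawner.remote_hosts = ["ws1.example.lan", "ws2.example.lan"]  # workstations
c.SSHSpawner.remote_port = "22"
c.SSHSpawner.ssh_keyfile = "/srv/jupyterhub/.ssh/id_rsa"

# Because the Hub runs in Docker, the spawned Labs must be able to call
# back to the Hub API over the LAN, not via the container-internal address.
c.JupyterHub.hub_connect_ip = "nas.example.lan"
```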
My goal
My ideal setup would be:
- different users can log in on a central page with their own unique credentials (preferably via OAuth)
- they then select their preferred project on their preferred machine
- admins can create/authorize new users and grant them access to the running labs/machines (a small sketch follows this list)
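For the admin side, JupyterHub's built-in allow/admin lists seem to cover part of this already; a small sketch with placeholder usernames:

```python
# jupyterhub_config.py -- sketch of user management (usernames are placeholders).
c = get_config()

c.Authenticator.admin_users = {"alice"}            # gets the Hub admin panel
c.Authenticator.allowed_users = {"bob", "carol"}   # only listed users may log in
# Admins can also add or remove users at runtime through the admin UI
# or the Hub REST API, without editing this file.
```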
I'd be glad for any solutions, tips, ideas for improvement, or security concerns.
Cheers!