The original post: /r/linux by /u/mika314 on 2024-11-20 08:08:30.
TL;DR: I built a Linux-compatible screen-casting tool for the Oculus Quest over a single weekend using ChatGPT. It’s open-source, lightweight, and works directly in the browser. Check it out! 🎉
What It Does:
- Streams your desktop to your Quest using a browser.
- Works on Linux! 🙌
- Low latency, runs at ~60 FPS.
- No app installation required on the Quest—just open your browser and connect!
How I Built It:
With the help of ChatGPT, I managed to piece together the entire project in just a weekend.
- Tech Stack:
- C++ with Boost, FFmpeg, and OpenGL for capturing, encoding, and streaming (an encode-loop sketch follows the Challenges list below).
- WebSocket server for transmitting video and audio (also sketched below).
- A lightweight web client that decodes and displays the video.
- Challenges:
- RGB to YUV conversion: ChatGPT helped me optimize this with AVX2 (a SIMD sketch follows this list). It was quite challenging, since neither ChatGPT nor I really knew how to write SIMD code 😛
- Keeping CPU usage low: even with optimizations, the process saturates ~4 threads on my 8-core CPU with hyper-threading (16 logical threads).
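For the curious, here is roughly what the luma (Y) half of that conversion can look like with AVX2. This is a sketch, not the code from the repo: it assumes the R, G, and B channels are already split into separate planes (a real capture buffer is usually interleaved, which adds a shuffle step), and it uses the common BT.601 fixed-point coefficients.

```cpp
// AVX2 luma (Y) conversion, 16 pixels per iteration (sketch).
// Assumes R, G, B are already separate byte planes.
// BT.601: Y = 16 + ((66*R + 129*G + 25*B + 128) >> 8)
#include <immintrin.h>
#include <cstdint>
#include <cstddef>

void rgb_planes_to_y_avx2(const uint8_t* r, const uint8_t* g, const uint8_t* b,
                          uint8_t* y, size_t count)
{
    const __m256i cr  = _mm256_set1_epi16(66);
    const __m256i cg  = _mm256_set1_epi16(129);
    const __m256i cb  = _mm256_set1_epi16(25);
    const __m256i rnd = _mm256_set1_epi16(128);
    const __m256i off = _mm256_set1_epi16(16);

    size_t i = 0;
    for (; i + 16 <= count; i += 16) {
        // Widen 16 bytes of each channel to 16-bit lanes.
        __m256i R = _mm256_cvtepu8_epi16(_mm_loadu_si128((const __m128i*)(r + i)));
        __m256i G = _mm256_cvtepu8_epi16(_mm_loadu_si128((const __m128i*)(g + i)));
        __m256i B = _mm256_cvtepu8_epi16(_mm_loadu_si128((const __m128i*)(b + i)));

        // 66R + 129G + 25B + 128 stays below 65536, so 16-bit lanes plus a
        // logical shift are enough.
        __m256i acc = _mm256_mullo_epi16(R, cr);
        acc = _mm256_add_epi16(acc, _mm256_mullo_epi16(G, cg));
        acc = _mm256_add_epi16(acc, _mm256_mullo_epi16(B, cb));
        acc = _mm256_add_epi16(acc, rnd);
        acc = _mm256_srli_epi16(acc, 8);
        acc = _mm256_add_epi16(acc, off);

        // Narrow back to bytes; packus packs within 128-bit halves, so the
        // 64-bit quads have to be reordered before storing the low 16 bytes.
        __m256i packed = _mm256_packus_epi16(acc, _mm256_setzero_si256());
        packed = _mm256_permute4x64_epi64(packed, _MM_SHUFFLE(3, 1, 2, 0));
        _mm_storeu_si128((__m128i*)(y + i), _mm256_castsi256_si128(packed));
    }

    // Scalar tail for the last few pixels.
    for (; i < count; ++i)
        y[i] = (uint8_t)(16 + ((66 * r[i] + 129 * g[i] + 25 * b[i] + 128) >> 8));
}
```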
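The encode side would be plain libavcodec. Again, a minimal sketch rather than the project's actual code, assuming H.264 at ~60 FPS with YUV420P input; fill_next_frame and send_packet are hypothetical stand-ins for the capture and network layers, and flushing/error handling are omitted.

```cpp
// Minimal libavcodec encode loop (sketch).
extern "C" {
#include <libavcodec/avcodec.h>
}
#include <cstdint>
#include <functional>

void encode_loop(int width, int height,
                 std::function<bool(AVFrame*)> fill_next_frame,     // hypothetical: writes Y/U/V planes
                 std::function<void(const AVPacket*)> send_packet)  // hypothetical: hands bytes to the WebSocket layer
{
    const AVCodec* codec = avcodec_find_encoder(AV_CODEC_ID_H264);
    AVCodecContext* ctx = avcodec_alloc_context3(codec);
    ctx->width     = width;
    ctx->height    = height;
    ctx->time_base = {1, 60};            // ~60 FPS
    ctx->framerate = {60, 1};
    ctx->pix_fmt   = AV_PIX_FMT_YUV420P;
    ctx->gop_size  = 60;
    avcodec_open2(ctx, codec, nullptr);

    AVFrame* frame = av_frame_alloc();
    frame->format = ctx->pix_fmt;
    frame->width  = width;
    frame->height = height;
    av_frame_get_buffer(frame, 0);

    AVPacket* pkt = av_packet_alloc();
    int64_t pts = 0;
    while (fill_next_frame(frame)) {
        frame->pts = pts++;
        avcodec_send_frame(ctx, frame);
        // Drain whatever the encoder has ready for this frame.
        while (avcodec_receive_packet(ctx, pkt) == 0) {
            send_packet(pkt);
            av_packet_unref(pkt);
        }
    }

    av_packet_free(&pkt);
    av_frame_free(&frame);
    avcodec_free_context(&ctx);
}
```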
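The post doesn't say which Boost component drives the WebSocket side, so this sketch assumes Boost.Beast with a single synchronous client and video only (no audio); the port and the next_packets callback are made up for illustration.

```cpp
// Single-client WebSocket sender (sketch): accepts one browser connection
// and pushes encoded video packets as binary messages.
#include <boost/asio.hpp>
#include <boost/beast.hpp>
#include <functional>
#include <vector>

namespace asio  = boost::asio;
namespace beast = boost::beast;
using tcp = asio::ip::tcp;

void serve_one_client(std::function<std::vector<std::vector<unsigned char>>()> next_packets)
{
    asio::io_context ioc;
    tcp::acceptor acceptor{ioc, tcp::endpoint{tcp::v4(), 8080}};  // port is arbitrary
    tcp::socket socket{ioc};
    acceptor.accept(socket);

    beast::websocket::stream<tcp::socket> ws{std::move(socket)};
    ws.accept();       // complete the WebSocket handshake
    ws.binary(true);   // frames carry raw encoded video, not text

    for (;;) {
        // One encoded packet per WebSocket message; the browser client
        // decodes and displays them on the other end.
        for (auto& pkt : next_packets())
            ws.write(asio::buffer(pkt));
    }
}
```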
Demo & Source Code:
I owe a huge thanks to ChatGPT for speeding up the process. Let me know if you have any feedback, or feel free to try it out and share your thoughts! 😊