Tl;dr: They're adding opt-in alt text generation for blind people and an opt-in AI chat sidebar where you can choose the model used (including self-hosted ones).
A lot of people use LLMs a lot, so it's useful for them. It's also nice for summarizing long articles you don't have time to read; not as good as reading them, but better than skimming.
@Blisterexe @Xuderis It's true; as a researcher, these models have helped me a lot to speed up reading and finding specific information in scientific articles. As long as it's privacy-respecting, I view this implementation favorably.
It lets you use any model: ChatGPT works out of the box, but you can also point it at a self-hosted model if you edit about:config.
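For reference, here's a minimal sketch of what that about:config tweak could look like, written as a user.js snippet in your Firefox profile. The pref names (browser.ml.chat.*) and the localhost URL are assumptions and may differ by Firefox version, so check about:config for the actual prefs before relying on this:

```js
// Sketch: point Firefox's AI chat sidebar at a self-hosted model.
// Pref names below are assumptions and may vary by Firefox version.

// Enable the chatbot sidebar feature.
user_pref("browser.ml.chat.enabled", true);

// Allow localhost providers to appear in the provider list (assumption).
user_pref("browser.ml.chat.hideLocalhost", false);

// Use a locally hosted, OpenAI-compatible chat endpoint (e.g. a llama.cpp
// server) instead of a hosted provider. The URL here is hypothetical.
user_pref("browser.ml.chat.provider", "http://localhost:8080");
```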
But what does using that in my browser get me? If I’m running llama2, I can already copy and paste text into the terminal if I want. Is this just saving me that step?