A community focused on world news, with AI summaries that let you stay up to date without as high a time investment.
Check out [email protected]
I welcome all political opinions and viewpoints (just engage in good faith and don’t call for actionable violence). I try to avoid taking mod action against users unless absolutely necessary.
I’m seeing a lot of new users coming from Reddit. If you’re new and want a place to keep up to date and express your views without being censored, please join.
I use a fine-tuned T5 summarisation model that is relatively accurate. It has some minor issues with occasionally misassigning quotes, but it doesn’t hallucinate the way a traditional GPT-style model does. Its output is about 60% identical to a human summary and >95% accurate in terms of meaning. It is more accurate than traditional non-AI summarisation tools (I’m not sure how it compares to a human), but I believe it is as accurate and unbiased as possible.
Its biggest flaw is actually the traditional non-AI web scraper, which sometimes pulls the wrong content. It’s all FOSS, so if you want to open a pull request to improve it, that would be greatly appreciated.
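For anyone curious what the summarisation step roughly looks like, here’s a minimal sketch using the Hugging Face transformers library. The checkpoint name is just a placeholder and the generation settings are illustrative, not the bot’s actual config; the real code is in the repo.

```python
# Minimal sketch of a T5 summarisation step (placeholder checkpoint name).
from transformers import T5ForConditionalGeneration, T5Tokenizer

MODEL_NAME = "my-finetuned-t5-news"  # hypothetical; see the repo for the real one

tokenizer = T5Tokenizer.from_pretrained(MODEL_NAME)
model = T5ForConditionalGeneration.from_pretrained(MODEL_NAME)

def summarise(article_text: str, max_new_tokens: int = 200) -> str:
    # T5 summarisation checkpoints are usually prompted with a "summarize: " prefix.
    inputs = tokenizer(
        "summarize: " + article_text,
        return_tensors="pt",
        truncation=True,
        max_length=1024,
    )
    output_ids = model.generate(
        **inputs,
        max_new_tokens=max_new_tokens,
        num_beams=4,            # beam search keeps output close to the source text
        no_repeat_ngram_size=3,
    )
    return tokenizer.decode(output_ids[0], skip_special_tokens=True)
```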
EDIT: I’ve been experimenting with having a traditional GPT-style LLM look over the summary and the original article to catch these errors, but I’ve had little to no success without using large models, which I can’t run on my local hardware (and I unfortunately can’t afford to pay for inference at the scale my bot runs).
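The verification pass I’ve been trying looks roughly like the sketch below. The model name is a placeholder for whatever small local instruct model I can fit, and this is exactly the part that hasn’t worked reliably yet.

```python
# Rough sketch of the cross-check idea: ask a second, local model whether
# the summary contains anything unsupported by the article.
from transformers import pipeline

checker = pipeline("text-generation", model="some-small-instruct-model")  # placeholder

def check_summary(article: str, summary: str) -> str:
    prompt = (
        "Article:\n" + article + "\n\n"
        "Summary:\n" + summary + "\n\n"
        "Does the summary contain any claim or quote not supported by the article? "
        "Answer YES or NO, then list the unsupported claims.\n"
    )
    result = checker(prompt, max_new_tokens=150, do_sample=False)
    # The pipeline returns the prompt plus the completion; strip the prompt.
    return result[0]["generated_text"][len(prompt):]
```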
Thanks for the explanation. I think if you combined that with a method to retract or edit summaries based on human reports, you could probably fill in the remaining 5%. I’m unsure how feasible that would be, though. Good luck with the community!
Yeah, I’m not sure how that can be achieved in a way where a single report can catch errors without letting every single user mess with it. I could perhaps expose the section breakdown to users and let them regenerate specific sections, but that would require much more complex interaction. Thanks for the suggestion though, I’ll look into it.
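One simple middle ground might be only flagging a summary for re-checking once a few distinct users have reported it. A toy sketch (the threshold and storage are made up, nothing like this exists in the bot yet):

```python
# Toy sketch: require several independent reports before a summary gets
# flagged for regeneration/retraction, so one user can't rewrite things.
REPORT_THRESHOLD = 3  # arbitrary number for illustration

reports: dict[str, set[str]] = {}  # summary_id -> usernames who reported it

def report_summary(summary_id: str, username: str) -> bool:
    """Record a report; return True once the summary should be re-checked."""
    reports.setdefault(summary_id, set()).add(username)
    return len(reports[summary_id]) >= REPORT_THRESHOLD
```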