Gotta say I do find it funny that it’s hosted on GitHub. I’d think that to really go up against Copilot, it needs to move to another platform.
Yeah. This has a “Borders hosting their eCommerce on AWS” vibe.
I don’t see any indication of AI integration on Codeberg, and in fact one of the features of the project is that it is lightweight and simple to set up. I’m not sure integrating potentially resource-intensive or complex AI aligns with those properties.
I think an LLM integrated into the IDE would be better suited when it comes to projects that aren’t backed by a company like Microsoft who have a large amount of GPU compute to spare for their users.
Or it’d be part of a CI pipeline. AFAIK that is already possible in principle: you could configure the existing CI to feed the code through some form of AI code check.
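Something along these lines would probably do as a pipeline step (rough sketch only; it assumes a generic OpenAI-compatible endpoint and made-up env vars like `LLM_API_URL` — nothing Codeberg or Forgejo actually ship):

```python
# review.py — hypothetical CI step: send the diff of the latest commit to an
# LLM endpoint and print its review. Endpoint, model, and env var names are
# assumptions for illustration.
import os
import subprocess
import requests

# Diff of the latest commit; a real pipeline would more likely diff against
# the target branch of the pull request.
diff = subprocess.run(
    ["git", "diff", "HEAD~1", "HEAD"],
    capture_output=True, text=True, check=True,
).stdout

resp = requests.post(
    os.environ["LLM_API_URL"],  # e.g. any OpenAI-compatible /v1/chat/completions
    headers={"Authorization": f"Bearer {os.environ['LLM_API_KEY']}"},
    json={
        "model": os.environ.get("LLM_MODEL", "some-model"),
        "messages": [
            {"role": "system",
             "content": "You are a code reviewer. Point out bugs and risky changes."},
            {"role": "user", "content": diff},
        ],
    },
    timeout=120,
)
resp.raise_for_status()
print(resp.json()["choices"][0]["message"]["content"])
```

The nice part is the runner only needs network access to whatever model you point it at; the forge itself stays lightweight.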
I think Codeberg/Forgejo is trying to get federation working first (as they should). AI should hopefully come later (if at all).
Is there a progress tracker for Codeberg’s federation? I’d like to keep up with that.
I’m aiming to get a GitLab install running with the experimental ActivityPub support option, and I would love to have that work with Codeberg’s.
Does GitLab already have experimental ActivityPub? 😮
Here’s the ticket that tracks Forgejo’s federation development. Codeberg probably deploys updates fairly frequently.
Yeah, unfortunately it’s limited to self-hosted installs for now, but here are the details for it.
That was quick 😮 If they manage to deliver creating pull requests (meaning forking from another instance and sending a pull request back), that would be huge. I’d honestly consider self-hosting.
This is actually planned, which is what intrigued me initially
Turns out I linked the wrong page initially; here’s the page that fully describes the entire ActivityPub implementation they’re planning:
https://docs.gitlab.com/ee/architecture/blueprints/activity_pub/index.html
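If I’m reading the idea right, the wire format is just plain ActivityPub, so interop with a future federated Forgejo would boil down to fetching and posting actor documents. A minimal sketch of the “fetch a project actor” half, with a made-up actor URL:

```python
# Rough sketch: fetch a federated project's ActivityPub actor document and
# look at its inbox/outbox. The URL below is made up for illustration.
import requests

ACTOR_URL = "https://gitlab.example.com/some/project"  # hypothetical

resp = requests.get(
    ACTOR_URL,
    headers={"Accept": "application/activity+json"},  # standard ActivityPub media type
    timeout=30,
)
resp.raise_for_status()
actor = resp.json()

# A conforming actor exposes at least these endpoints:
print(actor.get("type"))    # the actor type the implementation chooses for a project
print(actor.get("inbox"))   # where other instances POST activities (e.g. an incoming PR)
print(actor.get("outbox"))  # where this project's own activities are published
```

The hard part isn’t the fetching, it’s the two forges agreeing on what a “pull request” activity looks like, which is presumably what the blueprint and the Forgejo ticket are hashing out.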
Seems inevitable, but could take a while.
I’m sorry that this in no way answers your question, but… does anyone know if the LSP (Language Server Protocol) is rich enough to support AI co-piloting? It seems like it should be, as it already supports extended autocompletes such as loop templating, but I wonder if a copilot would tax it.
That’s how I’d hope it (AI co-coding) would arrive: not custom-baked into each editor, but hooked into a standard usable by even simple (non-IDE) editors.
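For what it’s worth, on the wire an LSP completion request looks the same whether a static index or an LLM answers it, which is why the standards-based route seems plausible. A minimal sketch of what an editor sends, with a made-up file path and cursor position:

```python
# Minimal sketch of an LSP completion request sent to a language server over
# stdio (JSON-RPC 2.0 with Content-Length framing). Whether the server answers
# from a symbol index or an LLM is invisible to the editor. The file URI and
# position are made up for illustration.
import json
import sys

request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "textDocument/completion",
    "params": {
        "textDocument": {"uri": "file:///home/user/project/main.py"},  # hypothetical file
        "position": {"line": 41, "character": 12},  # zero-based cursor position
    },
}

body = json.dumps(request).encode("utf-8")
# LSP framing: a Content-Length header, a blank line, then the JSON payload.
sys.stdout.buffer.write(b"Content-Length: %d\r\n\r\n" % len(body))
sys.stdout.buffer.write(body)
sys.stdout.buffer.flush()
```

The protocol side looks rich enough; the “taxing it” part would mostly be latency, since the editor just waits for the completion list to come back from whatever is generating it.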