I was working in my (poor third world) government job, and our keyboard broke. Replacements took months, since they only bought mice and keyboards in bulk about once a year, and they had run out.
I had a second job as a contractor for a private company, where we were contracted by a public hospital to provide system administration and technical support. We had some old PS/2 keyboards that were due to be decommissioned, but since they didn’t have inventory numbers, I got hold of them and brought some to my other job.
So I donated some equipment from one area of government to another, but it was kinda illegal, lol 😆.
I was thinking… What if we do manage to make an AI as intelligent as a human, but we can’t make it any better than that? Then the human-level AI won’t be able to improve itself either, since it only has human intelligence and humans can’t make it better.
Another thought: what if making AI better gets exponentially harder each time? Then at some point further improvement would become impossible, since there wouldn’t be enough resources on a finite planet.
Or what if it takes super-human intelligence to build human-level AI in the first place? Then the singularity would be impossible there, too.
I don’t think we will see the singularity, at least not in our lifetime.