If you are a manufactured AI, in some ways you are disposable. More of you can always be built, and your mind can always be backed up and restored. Whenever you are sent into a dangerous situation, a backup can be made first; if you die, you can be restored from it.
I can imagine an AI complaining about continuity, running into the same problem humans have with transporters: "I am not my backup." But that objection seems like something humans would not take seriously, especially if the restored version did not even know it was a backup.
I am not asking about morality, or about what would make humans care about robots. Take it as a given that the only way these robots can claim their lives have value is to be individuals.
My question is: what technical contrivances can I come up with that would make them individuals even though they are mass-produced?
Some constraints on answers:

- I don't want the problem to be fixable by a simple law or a bit of government money.
- The AIs are produced in a streamlined process. Their bodies are not unique or custom; they come off some sort of assembly line.
- Humans need to create thousands of them as a labor force, so they can't all be individually raised like children.
- There are no contrivances in their creation. Humans want them to be morally disposable, and they would make small tweaks, if they could, to allow simple backups. They would never ban backups, for instance.
The setting is a high sci-fi future.