Lastly, as I said earlier, skill chips will still need to override all kinds of reflexive and involuntary movement originating in the brain that would conflict with the chip's input. It seems unlikely there's a way to do that which wouldn't also let the chip override voluntary movement unless it's specifically programmed not to, and programming can be changed.
So in the strictest sense, yes, a skill chip must override natural behavior, because, as you note, if you didn't have the skill chip you'd be doing something different.
However, it can be said that all trained reactions "override": they initiate before you have a chance to think, "my body just moved like I trained," etc. That's a feature the brain is arguably meant to have. And it's a feature with an escape hatch: once you do think, if you feel the reaction is inappropriate, the brain can stop it. Halt the strike, pull the punch, abort the jump. And if that happens often enough, the 'triggers' are revised to be less sensitive.
This is a behavior that skill chips must also have. Otherwise you'd constantly be crushing your coworkers' throats when they startle you.
More importantly, from a security perspective, it is a behavior that must be enforced. It would be implemented in the chip socket, not in the skill shard: the "operating system" versus the "app." At both the software and hardware level, the system would be designed, all the way down, to prevent the chip sockets from being used for doll-chip-style overrides. Just as phone OSes make every effort to prevent apps from doing certain things, even when the phone itself, or manufacturer-approved apps, can.
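To make the OS-versus-app idea concrete, here is a purely illustrative sketch of how a socket-level capability whitelist might look. Every name in it (the capability strings, `ChipSocket`, etc.) is hypothetical; the point is only that the gatekeeping lives in the socket layer, so a chip's own programming never gets a vote.

```python
# Illustrative sketch: a chip socket "OS" enforcing a capability whitelist,
# the way a phone OS gates app permissions. All names are hypothetical.

# Things a skill chip is allowed to ask for.
SKILL_CAPABILITIES = {"suggest_motion", "read_proprioception"}

class ChipSocket:
    """The 'operating system' layer: every request from a chip passes through here."""

    def __init__(self, allowed=SKILL_CAPABILITIES):
        # Fixed at the firmware/hardware level; not writable by the chip.
        self.allowed = frozenset(allowed)

    def request(self, capability):
        # Deny anything outside the whitelist, regardless of what the chip asks.
        if capability not in self.allowed:
            return ("denied", capability)
        return ("granted", capability)

socket = ChipSocket()
print(socket.request("suggest_motion"))   # → ('granted', 'suggest_motion')
print(socket.request("override_motion"))  # → ('denied', 'override_motion')
```

The design choice being illustrated is that the whitelist is a property of the socket, not of the chip, so swapping in a malicious or modified shard changes nothing about what requests get through.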
And then there are things doll chips need that skill chips never do: shutting off consciousness, suppressing memories, direct software control of the body that completely circumvents the brain, and so on. And since people don't even want skill chips to do those things, those abilities won't be built in, not only for security reasons but because overengineering is bad engineering.
Speaking of software: it cannot be assumed that skill chips even are software. After all, they are meant to be like memory, whether information or action, and real memories don't need executables to work. So the content of a skill chip could very easily be formatted data, meant to be run by programming already in the socket or neuralware, making doll-chip-like automation impossible for yet another reason.
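The data-not-code point can be sketched the same way. In this hypothetical, chip content is a declarative record format, and the socket's fixed firmware only understands a closed set of record types, so there is simply no slot where chip content could supply executable behavior. Every name below is invented for illustration.

```python
# Illustrative sketch: skill-chip content as formatted data, not an executable.
# The socket's fixed interpreter accepts only a closed set of record types,
# so chip content cannot introduce new behavior. All names are hypothetical.

import json

chip_content = json.loads("""
{
  "skill": "knife-sharpening",
  "records": [
    {"type": "motor_pattern", "id": "grip-01"},
    {"type": "reference_memory", "id": "angle-table"}
  ]
}
""")

KNOWN_RECORD_TYPES = {"motor_pattern", "reference_memory"}

def load_skill(content):
    """Fixed firmware routine: loads known, declarative record types and nothing else."""
    loaded = []
    for record in content["records"]:
        if record["type"] not in KNOWN_RECORD_TYPES:
            raise ValueError("unknown record type: " + record["type"])
        loaded.append(record["id"])
    return loaded

print(load_skill(chip_content))  # → ['grip-01', 'angle-table']
```

A record whose type isn't in the interpreter's vocabulary is rejected outright, which is the structural reason doll-chip automation can't be smuggled in as "just another skill."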
Of course, this stuff doesn't matter in isolation. It matters only when paired with the way the technologies are perceived. The people who would want to use skill chips are often people who would never tolerate doll chips, for both social and security reasons. Therefore, whether through differences in implementation or software, skill chips must be unable to puppet people the way doll chips can, or the most valuable customers would never use them.