A lot of the load is made easier when you combine traditional AI (the RL shit we have) with the Animal AI. A lot of the shortcomings of an animal are easily handled with normal computer stuff. Like, we don't NEED a Dog AI, we simply need an AI that is good at identifying smells. What those smells mean, the AI doesn't have to know or care. It doesn't know that that smell is gunpowder, and that as such that guy probably has a gun. All it knows is that, when it smells that smell, to mentally tick off this box (which happens to have "Gunpowder" written beside it).

I'll do my best to be on the ball when explaining things, but I also don't want to drown everyone in text (I do enough of that with every action vote).
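The "tick a labeled box" idea can be sketched as a tiny classifier whose output is just a label, with all the meaning living in ordinary code downstream. Everything here (the signature table, the function names, the thresholds) is invented for illustration, not a real system:

```python
from typing import Optional

# Animal-AI stand-in: maps a sensor reading to a label and nothing more.
# The meaning of the label ("gunpowder" -> probable gun) lives downstream.
SIGNATURES = {
    "gunpowder": {"sulfur": 0.8, "charcoal": 0.6},
    "coffee": {"roast": 0.9},
}

def classify_scent(reading: dict) -> Optional[str]:
    """Return the label of a matching scent signature, if any."""
    for label, signature in SIGNATURES.items():
        if all(reading.get(k, 0.0) >= v for k, v in signature.items()):
            return label
    return None

def handle_scent(reading: dict) -> str:
    label = classify_scent(reading)   # the animal AI's whole job ends here
    if label == "gunpowder":          # conventional logic attaches meaning
        return "alert: possible firearm"
    return "no action"
```

The split is the whole point: the classifier never "knows" what gunpowder implies, and swapping what a label triggers never touches the classifier.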
The issue with a lot of these things is that, while useful, animal psychological behavior is not enough to get it to work on its own. The AI cannot make long-term plans, thus it cannot truly succeed at the stock market. It can recognize and emulate parts of speech, but it is incapable of understanding the complexity of human speech (parrots and dogs can understand and repeat things like "sit," but they can't grasp "sit down after eating"), so chatbots and full-on speech recognition are not at all feasible. Likewise, self-driving cars (getting a car to head to a destination in a timely manner while not violating traffic laws is incredibly complex and would need too much stitching and cobbling to actually function with the AI you have now), medical diagnostics (how many diseases can an animal recognize?), customer support (language issues again), and even arguably emotion recognition (can animals understand sarcasm?) are too complicated for the AI to parse as it is.

Yes, some of the fields listed could work (aviation and agriculture would, for example), but oftentimes they still run into hardware issues. The general rule of thumb is that if an animal cannot do it, then the animal-like AI can't do it. Yes, the behavior is useful, but it isn't a complete enough package to truly get it functional. The write-in was rejected because it was way too broad and had massive holes in what you could actually do with the animal-like AI. A lot of the proposed fields have a complexity to the tasks that no animal could truly deal with, so the AI would be incapable of properly fulfilling the task.
Long-term planning might not be a thing it can do, but it might not NEED to be able to do that. Ants prepare for the winter without having a firm idea of what "The Winter" is; they just recognize what is happening here and now, and have a behavior they respond with. No actual planning, but the behavior just so happens to set them up in a beneficial situation later. This is the most iffy of all; the stock market is complex, and we are probably honestly better off using normal AI, but hey, never know what we might figure out.
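The ant example is basically a purely reactive agent: a fixed table of stimulus-to-response rules, checked in priority order, with no model of the future at all. A minimal sketch, with every rule invented for illustration:

```python
# Purely reactive agent: condition -> action rules checked in priority order.
# No plan, no concept of "winter"; the first matching rule fires.
RULES = [
    (lambda s: s["temperature"] < 10, "hoard food"),   # feels cold -> stockpile
    (lambda s: s["food_nearby"], "carry food to nest"),
    (lambda s: True, "wander"),                        # default behavior
]

def react(state: dict) -> str:
    for condition, action in RULES:
        if condition(state):
            return action
    return "idle"
```

The agent never decides to "prepare for winter"; hoarding simply wins priority whenever it gets cold, which happens to leave it well set up later.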
Speech recognition is less an issue of understanding the message and more of making out exactly what is being said. If the Animal AI can discern "sit down after eating," it can pass that off to other, normal AI to handle. Making out exactly what is being said is more problematic than the message behind it. A lot of bonus features can be added if we include the ability to know when IT is being talked to. So someone just saying the word "sit" in passing wouldn't get a response, but someone saying it, intentionally, to the AI would.
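That division of labor can be sketched as a two-stage pipeline. Both stages here are stand-ins with made-up names: the "animal AI" only produces a transcript and decides whether it was being addressed, and ordinary software interprets the command.

```python
from typing import Optional

def animal_ai_listen(audio: str, addressed_to_me: bool) -> Optional[str]:
    """Animal-AI stand-in: transcribe, but only when actually spoken to."""
    return audio if addressed_to_me else None

def conventional_parse(transcript: str) -> str:
    """Ordinary-software stand-in: attach meaning to the words."""
    if transcript == "sit down after eating":
        return "schedule: sit (after meal)"
    return "unknown command"

def assistant(audio: str, addressed_to_me: bool) -> str:
    transcript = animal_ai_listen(audio, addressed_to_me)
    if transcript is None:
        return "ignored"          # someone said "sit" in passing
    return conventional_parse(transcript)
```

Neither half needs the other's skill: the front end never understands the command, and the back end never touches raw audio.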
Chat bots: birds have an amazing ability to detect patterns in conversation and repeat them, at appropriate times at that. And this is not to mention the various accounts of a bird seeming to say new stuff (I remember one of a bird who was always greeted with "You're a bird!", later saying to itself "I'm a bird"), or the fact that there is a gorilla who can speak (in sign language).
Honestly, self-driving cars are pretty simple. The HARDEST part is object recognition: figuring out what is a car, where it is going, and the like. If the roads were empty and you didn't have to worry about other people? It'd be a breeze to make a car that could get wherever it needed to and never broke the law. We only need the animal AI for that part, since "recognizing other moving things and where they are going" is a VERY basic trait of most any animal that has ever eaten something or been eaten.
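The same split applies here: perception (the hard part, handed to the animal-like AI) feeding a plain rule-based controller. A toy sketch, with every name and rule invented for illustration rather than anything resembling a real driving stack:

```python
def perceive(scene: list) -> list:
    """Animal-AI stand-in: pick out the moving things and their headings."""
    return [obj for obj in scene if obj.get("moving")]

def plan(moving_objects: list) -> str:
    """Conventional controller: trivial rules once perception is solved."""
    if any(o["heading"] == "toward us" for o in moving_objects):
        return "brake"
    return "proceed"

def drive(scene: list) -> str:
    return plan(perceive(scene))
```

On an empty road `perceive` returns nothing and the rule-following half has an easy job, which is the point of the argument above.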
A surprising number of diseases can be detected by dogs. Like various types of cancer. Something about how people smell. Teach the AI to recognize various types of smells, and there you go. I'm sure you could get a lot else by mixing in keen hearing and UV/IR vision.
Customer support would mainly be about emotion recognition. While it may not be able to understand sarcasm, it honestly doesn't need to. It just needs to rate the person on the "Happy vs. Angry" scale and know when situations need to be escalated or not. You can be sarcastic all you want, but what matters isn't what you're saying but how you're saying it. An angry person has a different tone of voice from a happy sarcastic one.
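A toy version of that "Happy vs. Angry" scale: tone features in, escalation decision out, with the words themselves never consulted. The feature names, weights, and threshold are all made up for illustration:

```python
def anger_score(tone: dict) -> float:
    """Rate the caller from 0.0 (happy) to 1.0 (angry) by tone, not words."""
    score = 0.0
    score += 0.5 if tone.get("volume", 0.0) > 0.7 else 0.0
    score += 0.3 if tone.get("pitch_variance", 0.0) > 0.6 else 0.0
    score += 0.2 if tone.get("speech_rate", 0.0) > 0.8 else 0.0
    return score

def should_escalate(tone: dict, threshold: float = 0.6) -> bool:
    # Sarcasm never enters into it: a calm sarcastic caller scores low.
    return anger_score(tone) >= threshold
```

Since the input is tone features rather than a transcript, a happily sarcastic caller and a genuinely angry one land on opposite ends of the scale, which is all the escalation decision needs.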
And yes, there are hardware limitations, but that is true of literally anything. Pure software will get you nowhere.