Another interesting question I heard the other day: does the code produced by the automation guys have to be code reviewed and unit tested?
Again, some analysis before answering. Automation code runs against a product and automatically executes test cases that would otherwise have to be executed manually. There is an investment in building automated test suites, but it pays off greatly when the automation can be run repeatedly, at a fraction of the time and cost of doing the same work manually.
Automation can be used as part of a Build Verification Test (BVT) that quickly smoke tests new builds to check whether anything has been broken. Automation can also be extended to test product functionality, and here is where the problem starts: what if the automation is not catching errors?
Often errors are not caught because the original test case was poorly designed or is outdated, but even more often the automation fails because it has bugs of its own. In past projects I heard several times that errors that were easily reproduced manually still passed the automated tests. That was, of course, a huge alarm sign pointing in one direction: the code written for the automated suites had been poorly tested, if tested at all.
Thus, automation is undermined if good development practices are not applied to all the code the automation team produces. After all, the automation guys are still developers producing code that needs to be tested by a third party. Going back to the question, here is my two-fold answer: first, the automation guys should unit test and code review the code they produce; second, Quality Engineers should test automated test cases and suites the same way they test any other software.
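To make the first point concrete, here is a minimal sketch, with an invented helper name, of what unit testing automation code looks like. Suppose an automated suite relies on a small helper that parses a status line printed by the product under test; that helper deserves its own tests, because a silent parsing bug makes the whole suite "pass" while missing real failures:

```python
# Hypothetical helper inside an automated test suite: parses a
# "key=value;key=value" status line emitted by the product under test.
def parse_status_line(line):
    result = {}
    for pair in line.strip().split(";"):
        if not pair:
            continue  # tolerate trailing or doubled separators
        key, _, value = pair.partition("=")
        result[key.strip()] = value.strip()
    return result

# Unit tests for the helper itself, separate from the product tests.
def test_parses_pairs():
    assert parse_status_line("build=42;status=ok") == {
        "build": "42", "status": "ok"}

def test_ignores_trailing_separator():
    assert parse_status_line("status=fail;") == {"status": "fail"}

if __name__ == "__main__":
    test_parses_pairs()
    test_ignores_trailing_separator()
    print("all helper tests passed")
```

The second point, having Quality Engineers test the suites themselves, can then be as simple as deliberately feeding the suite a build with a known bug and checking that the automation actually reports it.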