A global survey of 1,775 IT and business executives published today finds 71% work for organizations that have integrated some form of artificial intelligence (AI) and generative AI capability into their operations, with just over a third (34%) specifically using AI to improve quality assurance.
Conducted by Sogeti, a unit of Capgemini, in collaboration with OpenText and the market research firm Coleman Parkes, the survey finds another 34% have developed roadmaps for improving quality engineering following successful AI pilots, while 19% are still conducting pilots.
Challenges hindering generative AI adoption include data breach concerns (58%), tool integrations (55%), the level of effort required (53%), hallucinations (47%) and unforeseen costs (43%). Issues that organizations are working through include a lack of clear AI strategy (56%), a lack of skills (53%) and a lack of a well-defined approach to testing (50%). In the area of test automation specifically, 29% of respondents said their organizations have successfully embraced generative AI, while another 42% have conducted a few experiments.
The top three benefits of applying AI to testing are faster automation (72%), easier integrations (68%) and reduced testing resources and effort (62%), while the top use cases cited are test reporting (56%), defect analysis (56%), knowledge management (54%), test data generation (52%) and test automation script conversion (50%).
Overall, the survey finds that, on average, 44% of executives work for organizations that have adopted test automation. The biggest obstacles to adopting test automation are legacy IT architectures (64%), complex tooling (62%), a lack of strategy (57%) and the lack of an orchestration framework (53%).
Tal Levi-Joseph, vice president of software engineering, research and development and product management for OpenText, said that while AI progress is being made, most DevOps teams are still a long way from fully realizing the benefits. Skeptical software engineering teams are still evaluating the levels of risk that AI might engender, she noted.
The DevOps ecosystem is highly fragmented, so organizations will continue to find it challenging to aggregate the type of data required to effectively train AI models, said Levi-Joseph.
Eventually, however, AI will drive more organizations to embrace integrated DevOps platforms that provide a better foundation for embedding AI within DevOps workflows, she added. That level of integration will be achieved mainly via a series of mergers and acquisitions that are already occurring, or via alliances that more loosely couple tools and platforms from multiple vendors, said Levi-Joseph.
That’s crucial, because for AI to be effectively implemented, there needs to be a feedback loop between the applications deployed in a production environment and the controls applied to improve quality, she noted.
Much like with DevOps itself, each organization will need to determine for itself to what degree using AI to automate workflows makes the most sense. Clearly, there is an opportunity to reduce the toil that, over time, has conspired to burn out members of DevOps teams.
Longer term, there may even come a day when the quality of code generated by AI tools is the same regardless of how much software engineering expertise an organization has accrued, noted Levi-Joseph.
In the meantime, the most important thing at this stage is simply to gain experience in how best to apply AI, both as it exists today and as it continues to evolve.