DSTC 2016 Day 1 – Test Automation Contest

DSTC organized a Test Automation Contest today at Radisson Blu, Dwarka, New Delhi. It was a well-conducted event, with teams from multiple cities flexing their automation muscles.

Out of 30+ teams, we at S&P Global Market Intelligence were amongst the top 7 selected for this contest. We were quite excited about the event (so much so that we were the ones who opened the gates of the ballroom!). With all guns blazing, we started our quest to show our mettle.

It began with the judges explaining the parameters on which we would be evaluated (code quality, tool usage, team collaboration, etc.), and we were given some scenarios around Google Search and Gmail to automate. What initially seemed a stroll in the park became a roller-coaster ride with a lot of brainstorming, collaboration, code breakages, integration woes, network issues and what not!

After 2 hours of coding and a sumptuous lunch, all teams were ready to showcase their work. Still not out of steam, we decided to go in first. Here are some of the highlights of what the teams presented (in order):-

  1. S&P Global Market Intelligence (Gurgaon) – We showcased end-to-end Java + Selenium code running, triggered from a local Jenkins setup. The given tasks were automated by parameterizing and data-driving our tests (a minimal sketch of that style follows this list), and we demonstrated framework features like multi-threading, cross-browser/platform compatibility, the Page Object Model and dependency management through Maven. We were quite surprised by the judges' eye for detail and got some great inputs from them that we would like to work on.
  2. Perficient (Nagpur) – This team showed a behaviour-driven approach to the given problem using Selenium, Java, Cucumber, Ant and Maven. I liked their overall presentation, as BDD is a quintessential part of Agile, and their framework seemed robust. Although the judges said the given problem was not a good candidate for Agile, I was still hopeful they might be a contender for the title, but the nail in the coffin was that their code failed to execute.
  3. Tech Mahindra (Hyderabad) – This team had experience on their side (most members had 15+ years of it). Despite the judges calling QTP a silly tool, the team was quite enthusiastic about it, and I think QTP was not that bad a tool choice for the problem at hand. They had a good UFT-based framework that also supported mobile apps. There were heated discussions on some of their approaches, and on those I'm with the team, not the judges 🙂
  4. Tavant Technologies (Noida) – The people from Tavant wooed the judges with their almost-too-good-to-be-true framework, FUSEM (Framework for Unified Scripting and Execution of MoWeb Apps). It exposed both a web app and a mobile app as entry points, from which users could create features, run scripts, view reports, etc. I think they stole the show as far as frameworks go, but I would have liked to see more technical depth from them rather than the framework being the centrepiece of their presentation.
  5. Aurigo Technologies (Bangalore) – Two guys casually walked into the room and showed a simple yet powerful C# + NUnit + Selenium framework. They seemed to have the checklist of automation best practices right, but I think being a team of just two was a disadvantage: they were not able to fully complete even one task.
  6. Info Edge (Noida) – This Noida team showcased a stellar Page Object Model-based framework that used Excel to store the test cases and TestNG to run them. They were good with coding conventions and demonstrated how the cases can be run from the command line. Overall, it was a good show by this team.
  7. MindFire Solutions (Bhubaneshwar) – This team had an Excel-centric framework that used Gradle for build automation and had some fancy features like video recording, Sauce Labs integration, Jenkins compatibility, etc. They advocated simplifying things so that manual QA folks can adopt automation, but their approach of having so many Excel files for different things was a let-down for me; I thought it actually made the process more complex.
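
As an aside, here is a minimal sketch (not our actual contest code, and the search terms are made up) of what a data-driven TestNG + Selenium test for the Google Search scenario might look like:

```java
import org.openqa.selenium.By;
import org.openqa.selenium.WebDriver;
import org.openqa.selenium.chrome.ChromeDriver;
import org.openqa.selenium.support.ui.ExpectedConditions;
import org.openqa.selenium.support.ui.WebDriverWait;
import org.testng.annotations.*;

public class GoogleSearchTest {

    private WebDriver driver;

    @BeforeMethod
    public void setUp() {
        driver = new ChromeDriver();   // assumes chromedriver is on the PATH
        driver.get("https://www.google.com");
    }

    // One row per scenario: TestNG invokes the test once for each entry.
    @DataProvider(name = "searchTerms")
    public Object[][] searchTerms() {
        return new Object[][] {
            { "selenium webdriver" },
            { "test automation" },
        };
    }

    @Test(dataProvider = "searchTerms")
    public void searchShowsResults(String term) {
        driver.findElement(By.name("q")).sendKeys(term + "\n");
        // Wait (up to 10s) for the results page title instead of sleeping.
        new WebDriverWait(driver, 10)
                .until(ExpectedConditions.titleContains(term));
    }

    @AfterMethod
    public void tearDown() {
        driver.quit();
    }
}
```

With the Maven Surefire plugin wired up for TestNG, Jenkins then only needs to invoke mvn test to trigger the whole suite.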

Some of the key takeaways from all the awesome things I heard throughout the day are:-

  1. Page Objects are a great way of keeping your code maintainable (a minimal sketch follows this list).
  2. Naming conventions are extremely important for code quality.
  3. Stay away from hard-coded waits wherever possible (see the wait sketch after this list).
  4. BDD is great, but it won't necessarily fit the requirements of every project.
  5. Compatibility with a CI server like Jenkins is a must-have for an automation framework.
  6. Try to move away from Excel-based frameworks, as they make Continuous Integration a bit difficult.
  7. Use implicit and explicit waits judiciously; mixing the two can make wait times unpredictable and harm the script.
  8. Keep your identifiers/locators in an external file so you don't need to rebuild your code when the application changes (see the locator sketch after this list).
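
On takeaway 1, here is a minimal, hypothetical page object for Google's search page, assuming the standard "q" search-box name; the class and method names are illustrative, not from any team's framework:

```java
import org.openqa.selenium.By;
import org.openqa.selenium.Keys;
import org.openqa.selenium.WebDriver;

// A page object hides one page (or widget) behind an intention-revealing API.
public class GoogleSearchPage {

    private final WebDriver driver;

    // Locators live in exactly one place; a UI change means one edit here,
    // not a hunt through every test that touches the search box.
    private static final By SEARCH_BOX = By.name("q");

    public GoogleSearchPage(WebDriver driver) {
        this.driver = driver;
    }

    public void open() {
        driver.get("https://www.google.com");
    }

    public void searchFor(String term) {
        driver.findElement(SEARCH_BOX).sendKeys(term, Keys.ENTER);
    }
}
```

A test then reads as new GoogleSearchPage(driver).searchFor("selenium"), with no locators leaking into the test body.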
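
On takeaways 3 and 7, this sketch contrasts a hard-coded wait with an explicit wait (the "compose" id is made up for illustration):

```java
import org.openqa.selenium.By;
import org.openqa.selenium.WebDriver;
import org.openqa.selenium.WebElement;
import org.openqa.selenium.support.ui.ExpectedConditions;
import org.openqa.selenium.support.ui.WebDriverWait;

public class WaitExamples {

    // Anti-pattern: sleeps the full 5 seconds even when the element is
    // ready in 1, and still fails when the app takes 6.
    static void hardCodedWait(WebDriver driver) throws InterruptedException {
        Thread.sleep(5000);
        driver.findElement(By.id("compose")).click();
    }

    // Better: polls the condition and returns as soon as it holds,
    // failing only after the 10-second ceiling is hit.
    static void explicitWait(WebDriver driver) {
        WebElement compose = new WebDriverWait(driver, 10)
                .until(ExpectedConditions.elementToBeClickable(By.id("compose")));
        compose.click();
    }
}
```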
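
And on takeaway 8, a rough sketch of what externalized locators can look like, assuming a hypothetical locators.properties file on the classpath:

```java
import java.io.IOException;
import java.io.InputStream;
import java.util.Properties;
import org.openqa.selenium.By;

public class LocatorRepository {

    private final Properties props = new Properties();

    // Loads a classpath resource, e.g. a hypothetical locators.properties:
    //   google.searchBox=q
    // (error handling elided for brevity)
    public LocatorRepository(String resource) throws IOException {
        try (InputStream in = getClass().getResourceAsStream(resource)) {
            props.load(in);
        }
    }

    // A renamed element now means editing the properties file and
    // re-running, not recompiling and redeploying the test code.
    public By name(String key) {
        return By.name(props.getProperty(key));
    }
}
```

When a locator changes, you edit the file and re-run; there is no build step in the loop.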

Edit: Info Edge won the competition.

Happy Automating,

Harshit Kohli