Your Optimized Web Testing Guide: Parallel Cypress with Docker, Jenkins, Report Portal, and Mochawesome

Mohammed Lareb Zafar

Optimize your web testing with a parallel Cypress test execution pipeline, orchestrated by Docker containers and managed by Jenkins. This guide presents a robust setup and workflow for concurrent Cypress test runs.

In addition, we will walk through integrating Report Portal for detailed reporting and show how to use Mochawesome for insightful test analysis. This meticulously designed pipeline operates across multiple virtual machines (Jenkins agents), substantially cutting down test suite execution time.

Decipher Your Jenkinsfile

Explore the core of your Jenkinsfile, where key components work together to facilitate parallel Cypress testing and seamless Report Portal integration.

Enable Parallel Test Stages

Use parallelism within your Jenkinsfile to execute Cypress tests across multiple virtual machines simultaneously. By assigning each machine a unique set of test specifications, you can significantly reduce the overall test suite execution time.

def parallelStages = [:]

void run_tests(specList, index) {
  // ... Your test execution logic here ...
}

pipeline {
  // ... Pipeline configuration ...

  stages {
    stage('Build Parallel Stage Map from TOTAL_SPECS') {
      steps {
        script {
          TOTAL_SPECS.eachWithIndex { specs, index ->
            parallelStages[specs] = {
              node(node_label) {
                stage(specs) {
                 // … Test execution and stashing logic …
                }
              }
            }
          }
          parallel parallelStages
        }
      }
    }

    // ... More stages ...
  }
}
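The pipeline above assumes `TOTAL_SPECS` already holds one group of spec files per agent. How that grouping is built is not shown in the Jenkinsfile; as a hedged illustration (the helper name `chunkSpecs` and the round-robin strategy are assumptions, not part of the pipeline), a flat spec list could be split into per-agent groups like this:

```javascript
// Hypothetical helper: split a flat list of Cypress spec files into
// `agentCount` round-robin groups, one group per Jenkins agent.
// The function name and strategy are illustrative, not from the pipeline.
function chunkSpecs(specFiles, agentCount) {
  const groups = Array.from({ length: agentCount }, () => []);
  specFiles.forEach((spec, i) => {
    // Distribute specs round-robin so groups stay roughly equal in size
    groups[i % agentCount].push(spec);
  });
  return groups;
}

// Example: 5 specs split across 2 agents
const groups = chunkSpecs(
  ["a.cy.js", "b.cy.js", "c.cy.js", "d.cy.js", "e.cy.js"],
  2
);
console.log(groups);
// [ [ 'a.cy.js', 'c.cy.js', 'e.cy.js' ], [ 'b.cy.js', 'd.cy.js' ] ]
```

Round-robin is a simple default; in practice you may want to balance groups by historical spec runtime instead, so the slowest agent finishes as early as possible.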

Efficient Test Execution Using Docker and Cypress

The run_tests function orchestrates Cypress test execution based on the specList parameter. Docker manages the testing environment and the dependencies, ensuring consistency. Test results and screenshots are “stashed” for subsequent analysis.

void run_tests(specList, index) {
  for (int i = 0; i < specList.size(); i++) {
    def specs = specList[i]
    sh '''
      #!/usr/bin/env bash
      set -exu
      
      # ... Test execution setup and Docker commands ...

      echo "Stage '''+index+''' . '''+i+''' Completed"
      '''
  }
}
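The Docker command itself is elided above. One detail worth knowing when reconstructing it: the Cypress CLI accepts a comma-separated list of paths via `--spec`, so an agent's whole spec group can be passed as a single argument. As a hedged sketch (the helper `buildSpecArg` is illustrative, not from the pipeline):

```javascript
// Illustrative helper (not from the pipeline): collapse one agent's spec
// group into a single Cypress CLI argument. Cypress documents --spec as
// accepting a comma-separated list of spec paths.
function buildSpecArg(specGroup) {
  return `--spec "${specGroup.join(",")}"`;
}

console.log(buildSpecArg(["cypress/e2e/login.cy.js", "cypress/e2e/cart.cy.js"]));
// --spec "cypress/e2e/login.cy.js,cypress/e2e/cart.cy.js"
```

The resulting string would typically be interpolated into the `docker run ... cypress/included` invocation that the elided setup performs.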

Unstash Reports and Screenshots Effortlessly

After parallel test execution, reports and screenshots are collected from each Jenkins Agent, readying the gathered data for comprehensive aggregation and analysis.

stage('Unstash Reports & Screenshots') {
  steps {
      script {
        dir("test_dir") {
          sh "echo 'Starting Unstash'"

          // ... Unstashing logic ...

          sh "echo 'Unstash Completed'"
          sh "pwd; ls -la cypress/results/"
          sh "mkdir -p cypress/screenshots"
        }
      }
  }
}

package.json

{
  "name": "cypress-test",
  "version": "1.0.0",
  "scripts": {
    "cypress:open": "cypress open",
    "cypress:merge": "mochawesome-merge 'cypress/results/*.json' > cypress/fixtures/mochawesome.json",
    "cypress:marge": "marge cypress/fixtures/mochawesome.json -f mochawesome.html -o mochawesome_reports",
  },
  "devDependencies": {
    "@types/node": "^18.8.2",
    "mochawesome": "^7.1.3",
    "mochawesome-merge": "^4.2.1",
    "mochawesome-report-generator": "^6.2.0",
  },
  "peerDependencies": {
  },
  "dependencies": {
    "@aws-sdk/client-sns": "^3.80.0",
    "@testing-library/cypress": "^8.0.2",
    "cypress-multi-reporters": "^1.6.1",
    "npx": "^10.2.2",
    "superagent": "^7.1.1",
  }
}

Unified Report Merging for Comprehensive Insights

Combine the test reports generated by parallel Jenkins agents into a unified report using npm commands like cypress:merge and cypress:marge.

stage('Merge Reports') {
  steps {
      script {
        sh '''
            #!/usr/bin/env bash
            set -exu

            RELEASE=''
            
            docker run \
              -t \
              --entrypoint="/bin/sh" \
              -v "$WORKSPACE:/e2e" \
              -w /e2e \
              cypress/included:12.14.0 \
              -c "npm install --silent && npm run cypress:merge && npm run cypress:marge"
          '''
      }
  }
}
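Conceptually, `mochawesome-merge` concatenates the per-agent `results` arrays and recomputes the top-level `stats`. This hedged sketch mimics that idea for a few key fields only (the real tool handles many more fields and edge cases; `mergeReports` is illustrative, not the actual implementation):

```javascript
// Hedged sketch of what the cypress:merge step conceptually produces:
// concatenate each agent's `results` and sum a few key `stats` counters.
function mergeReports(reports) {
  const merged = {
    stats: { tests: 0, passes: 0, failures: 0, skipped: 0, duration: 0 },
    results: [],
  };
  for (const report of reports) {
    merged.results.push(...report.results);
    for (const key of Object.keys(merged.stats)) {
      merged.stats[key] += report.stats[key] || 0;
    }
  }
  return merged;
}

// Example: two per-agent reports collapsed into one
const merged = mergeReports([
  { stats: { tests: 4, passes: 3, failures: 1, skipped: 0, duration: 9000 }, results: [{ file: "a.cy.js" }] },
  { stats: { tests: 2, passes: 2, failures: 0, skipped: 0, duration: 4000 }, results: [{ file: "b.cy.js" }] },
]);
console.log(merged.stats.tests, merged.stats.failures); // 6 1
```

The merged JSON is exactly what the later stages consume: `cypress:marge` renders it to HTML, the Report Portal script walks its `results`, and `check_test_status.js` reads its `stats`.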

Seamless Report Portal Integration

After merging the reports, seamlessly transmit test results to Report Portal using a Python script. This integration facilitates comprehensive reporting and analysis.

stage('Send Results - Report Portal') {
  steps {
      script {
        sh '''
            #!/usr/bin/env bash
            set -exu

            python3 -m venv env
            . env/bin/activate
            pip install reportportal-client==5.2.5
            python3 cypress/fixtures/push_to_report_portal.py --result_file_path cypress/fixtures/mochawesome.json --launch_name uitest_'''+LAUNCH_NAME+''' || true
          '''
      }
  }
}

push_to_report_portal.py:

import json
import os
import logging
import argparse
from time import time

from reportportal_client import ReportPortalService

REPORTPORTAL_API_TOKEN = os.environ.get("REPORTPORTAL_API_TOKEN")
BUILD = os.environ.get("BUILD")
BUILD_URL = os.environ.get("BUILD_URL")

REPORTPORTAL_BASE_URL = "<base_url>"
REPORTPORTAL_PROJECT_NAME = "<project_name>"

REPORTPORTAL_LAUNCHID_FILE_NAME = "reportPortalUUID.txt"
screenshot_name_combinations = ["", " -- after all hook", " -- before each hook"]


def timestamp():
    return str(int(time() * 1000))


def post_result_to_reportportal(args):
    if not REPORTPORTAL_API_TOKEN:
        logging.error(
            "No ReportPortal API token in the environment, can't post results"
        )
        return False

    if not BUILD:
        logging.error("BUILD not set, skipping post to ReportPortal")
        return False

    launch_name = args.launch_name
    launch_doc = f"BUILD - {BUILD}\n Jenkins - {BUILD_URL}"

    service = ReportPortalService(
        endpoint=REPORTPORTAL_BASE_URL,
        project=REPORTPORTAL_PROJECT_NAME,
        token=REPORTPORTAL_API_TOKEN,
    )

    service.start_launch(
        name=launch_name,
        start_time=timestamp(),
        attributes=[
            {"key": "BUILD", "value": BUILD},
            {"key": "jenkins", "value": os.environ.get("BUILD_URL")},
            {"key": "Suite", "value": os.environ.get("TAG")},
        ],
    )

    with open(args.result_file_path) as f:
        data = json.load(f)
        results = data["results"]
        for result in results:
            dir_name = result["file"].split("/").pop()
            spec = result["fullFile"]
            spec_file_item_id = service.start_test_item(
                name=spec, start_time=timestamp(), item_type="TEST"
            )
            spec_failed = False
            for suite in result["suites"]:
                suite_failed = False
                describe_name = suite["title"]
                describe_name = describe_name.strip()
                suite_item_id = service.start_test_item(
                    name=describe_name,
                    start_time=timestamp(),
                    item_type="SCENARIO",
                    parent_item_id=spec_file_item_id,
                )
                for test in suite["tests"]:
                    test_name = test["title"]
                    test_name = test_name.strip()
                    test_name = describe_name + " -- " + test_name
                    item_id = service.start_test_item(
                        name=test_name,
                        description=spec,
                        start_time=timestamp(),
                        item_type="STEP",
                        parent_item_id=suite_item_id,
                    )
                    if test["fail"]:
                        test_name = test_name.replace(":", "")
                        test_name = test_name.replace("/", "")
                        estack = test["err"]["estack"]

                        for n in screenshot_name_combinations:
                            file_name = test_name + n + " (failed).png"
                            path = os.path.join(
                                args.screenshot_path, dir_name, file_name
                            )
                            if os.path.isfile(path):
                                break

                        # If for any reason file is not found
                        ss = None
                        if os.path.isfile(path):
                            with open(path, "rb") as image_file:
                                file_data = image_file.read()
                            ss = {
                                "name": f"{test_name}.png",
                                "data": file_data,
                                "mime": "image/png",
                            }
                        else:
                            print(f"Screenshot {path} not found")

                        service.log(
                            time=timestamp(),
                            message=estack,
                            level="INFO",
                            attachment=ss,
                            item_id=item_id,
                        )

                        service.finish_test_item(
                            item_id=item_id, end_time=timestamp(), status="FAILED"
                        )

                        suite_failed = True
                        spec_failed = True
                    elif test["skipped"]:
                        service.finish_test_item(
                            item_id=item_id, end_time=timestamp(), status="SKIPPED"
                        )

                        # Marking Spec, Suite as FAILED for SKIPPED Tests also.
                        # As we don't expect Tests to Skip
                        suite_failed = True
                        spec_failed = True
                    else:
                        service.finish_test_item(
                            item_id=item_id, end_time=timestamp(), status="PASSED"
                        )

                service.finish_test_item(
                    item_id=suite_item_id,
                    end_time=timestamp(),
                    status="FAILED" if suite_failed else "PASSED",
                )
            service.finish_test_item(
                item_id=spec_file_item_id,
                end_time=timestamp(),
                status="FAILED" if spec_failed else "PASSED",
            )
    response = service.finish_launch(end_time=timestamp())
    launch_uuid = response["id"]
    with open(REPORTPORTAL_LAUNCHID_FILE_NAME, "w") as f:
        f.write(launch_uuid)
    service.terminate()


def get_argparser():
    parser = argparse.ArgumentParser()
    optional = parser._action_groups.pop()  # reorder groups so required arguments are listed first
    required = parser.add_argument_group("required arguments")
    required.add_argument(
        "--result_file_path",
        type=str,
        required=True,
        help="Mochawesome Result File JSON",
    )
    required.add_argument(
        "--launch_name",
        type=str,
        required=True,
        help="RP Launch Name",
    )
    required.add_argument(
        "--screenshot_path",
        type=str,
        default="cypress/screenshots",
        help="Screenshot Parent Directory Path",
    )

    parser._action_groups.append(optional)
    return parser.parse_args()


def main():
    args = get_argparser()
    post_result_to_reportportal(args)


if __name__ == "__main__":
    main()

Enhance Communication with Email Notifications

Notify stakeholders of test results via email using AWS SNS. This optional step ensures that detailed notifications are sent to the specified recipients.

stage('Mail Reports') {
      when {
        expression {
          params.NOTIFY
        }
      }
      steps {
          script {
            sh '''
                #!/usr/bin/env bash
                set -exu

                if [ -n "${BUILD}" ]; then
                  RELEASE="${BUILD}"
                fi

                docker run \
                  -t \
                  --entrypoint="/bin/sh" \
                  -e AWS_ACCESS_KEY_ID \
                  -e AWS_SECRET_ACCESS_KEY \
                  -v "$WORKSPACE:/e2e" \
                  -w /e2e \
                  cypress/included:12.14.0 \
                  -c "node cypress/fixtures/mail.js --SNS_ARN=${UI_AUTOMATION_SNS_ARN} --release=${RELEASE} --buildURL=${BUILD_URL} --portalURL=${BASE_URL}"
              '''
          }
      }
    }

mail.js

// Code example for sending email notifications using AWS SNS
const { SNSClient, PublishCommand } = require("@aws-sdk/client-sns");

const utils = require("./utils");
const jsonData = require("./mochawesome.json");

const args = utils.getArgs();

const SNS_ARN = args.SNS_ARN;
const release = args.release;
const host = args.portalURL;
const buildURL = args.buildURL;
const reportPortalUUIDFilePath = 'reportPortalUUID.txt';
const fs = require('fs');
const reportPortalLaunchUUID = fs.readFileSync(reportPortalUUIDFilePath,{encoding:'utf8', flag:'r'});
const reportPortalLink = "https://<url>/launches/all/" + reportPortalLaunchUUID;
const mochaAwesomeReportURL = buildURL + "artifact/mochawesome_reports/mochawesome.html";
const stats = jsonData["stats"];
const duration = stats["duration"];
// a client can be shared by different commands.
const client = new SNSClient({ region: "us-west-2" });

function convertDurationStringFormat(time) {
  // Convert a millisecond duration into an "HH hr MM min" string
  let minutes = Math.floor((time / (1000 * 60)) % 60),
    hours = Math.floor((time / (1000 * 60 * 60)) % 24);

  hours = (hours < 10) ? "0" + hours : hours;
  minutes = (minutes < 10) ? "0" + minutes : minutes;

  return hours + " hr " + minutes + " min";
}

let subject = `Define your mail subject here`;

let msg = `Define mail body here`;

let params = {
  Message: msg,
  Subject: subject,
  TopicArn: SNS_ARN,
};

client
  .send(new PublishCommand(params))
  .then((data) => {
    console.log(data);
  })
  .catch((error) => {
    console.log(error);
  })
  .finally(() => {
    // TODO: Check if any Cleanup operation is required
  });

Ensure Test Success with Status Check and Artifact Archiving

Conclude with a status check to ascertain test success. Archive artifacts, including the mochawesome.json and HTML report, for later analysis and reference.

stage('Tests Passing?') {
      steps {
          script {
            sh '''
                #!/usr/bin/env bash
                set -exu

                docker run \
                  -t \
                  --entrypoint="/bin/sh" \
                  -v "$WORKSPACE:/e2e" \
                  -w /e2e \
                  cypress/included:12.14.0 \
                  -c "node cypress/fixtures/check_test_status.js"
              '''
          }
        }
        post {
          always {
            archiveArtifacts artifacts: 'cypress/fixtures/mochawesome.json', fingerprint: true
            archiveArtifacts artifacts: 'mochawesome_reports/**/*.*'
            cleanWs()
          }
        }
      }

Pass or Fail the Jenkins Pipeline Based on Combined Results: check_test_status.js

Finally, a script analyzes the merged Mochawesome JSON report and determines whether any tests failed or were skipped. Based on the result, the Jenkins pipeline is marked as successful or failed.

const jsonData = require("./mochawesome.json");

const failures = jsonData["stats"]["failures"];
const skipped = jsonData["stats"]["skipped"];

if (failures !== 0 || skipped !== 0) {
  throw new Error(`${failures} Failed & ${skipped} Skipped Tests. Failing the BUILD!!!`);
}

Conclusion

Elevate your web testing with this streamlined pipeline: parallel Cypress execution, Docker-managed environments, comprehensive Report Portal reporting, and email notifications. These strategies accelerate testing cycles, improve communication, and help ensure software quality. Happy testing and reporting!
