In software testing, negative test cases evaluate how the system under test behaves when an end user performs a “wrong” or unexpected action, and how it responds when those actions occur. These tests are a crucial part of assessing any software product, but developers sometimes overlook them in their quest to meet initial requirements. Negative use cases will occasionally feature within requirements, but generally only the “happy path” is followed: the planned, and therefore expected, behavior of the end user in typical, orderly scenarios.

Negative Test Cases

Testers find negative test cases by considering what happens when the user ignores instructions or normal usage and, intentionally or unintentionally, veers off the standard path. When planning negative test cases, we need to think like a user trying to break something. What if I do this? What if I try that? No matter how wild, what options does the user have in each scenario? Anything you can think of, try it.

This article details real-world negative testing and provides examples to make the topic easier to understand, while the post “What Is Negative Testing?” offers a broader definition.

Examples of Negative Testing

Let’s go over some real-world examples of negative testing.

1. Enter characters that are not allowed in an input field

Entering disallowed characters in an input field is probably the most widely used negative test. An error message should be displayed when a user enters characters that are not permitted. For example, a username field might not allow an @ symbol.

When a user attempts to submit the registration form with an invalid character, an error should display that notifies the user of the requirement. By testing this scenario (a test sketch follows the list), we verify:

  1. The message about the field requirements is displayed.
  2. The registration is not processed.
  3. Any other errors are not displayed.
  4. The app doesn’t crash.
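
A minimal browser-level sketch of these checks, using Selenium and pytest. The URL, element IDs, selectors, and error text are hypothetical placeholders for whatever the real registration form uses.

```python
# Hypothetical sketch: reject an "@" in the username field (Selenium + pytest).
# The URL, element IDs, and error text below are placeholders, not a real app.
import pytest
from selenium import webdriver
from selenium.webdriver.common.by import By

REGISTER_URL = "https://example.test/register"  # placeholder

@pytest.fixture
def driver():
    d = webdriver.Chrome()
    yield d
    d.quit()

def test_username_rejects_at_symbol(driver):
    driver.get(REGISTER_URL)
    driver.find_element(By.ID, "username").send_keys("bad@name")
    driver.find_element(By.ID, "submit").click()

    # 1. The field-requirement message is displayed.
    error = driver.find_element(By.CSS_SELECTOR, ".field-error")
    assert "not allowed" in error.text.lower()

    # 2. The registration is not processed (we stay on the form).
    assert driver.current_url == REGISTER_URL

    # 3./4. No other errors, and the page is still responsive (no crash).
    assert not driver.find_elements(By.CSS_SELECTOR, ".unexpected-error")
    assert driver.find_element(By.ID, "username").is_enabled()
```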

2. Attempt to submit without any text in a required field

Another simple example is the absence of any text in a required field. You can run a negative test by simply leaving a required field blank and attempting to submit. We verify three of the same things in this case (a sketch follows the list):

  1. The message about the field requirements is displayed.
  2. The registration is not processed.
  3. Some other unexpected error is not displayed.
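
The same idea can also be exercised below the UI. Here is a hedged sketch at the HTTP level with requests and pytest; the endpoint, payload fields, and response wording are assumptions, not a documented API.

```python
# Hypothetical sketch: a required field left blank, checked at the API level.
# The endpoint and JSON shape are assumptions about the system under test.
import pytest
import requests

REGISTER_API = "https://example.test/api/register"  # placeholder

@pytest.mark.parametrize("username", ["", "   "])  # empty and whitespace-only
def test_blank_required_field_is_rejected(username):
    resp = requests.post(REGISTER_API,
                         json={"username": username, "email": "qa@example.test"})

    # 1. The field-requirement message is returned.
    assert "required" in resp.text.lower()

    # 2./3. The registration is rejected with a clean 4xx, not an unexpected 500.
    assert 400 <= resp.status_code < 500
    # A real suite would also confirm no account was created, e.g. via an
    # admin endpoint or a database check.
```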

3. Enter an invalid URL in the CMS

Let’s say you are testing a new feature that includes a button for taking the user to another site or page. When a valid URL is entered in the CMS (content management system) for the designated button, clicking the button opens the intended page. What if there is a typo in the URL? This is a clear example of a negative test case. To test, enter an invalid URL in the CMS for that button and save it. We can then find out a number of things:

  1. Does the CMS save the update containing a bad URL?
  2. Assuming the CMS saves the bad URL, what happens when clicking the button?
  3. Does the app crash?

There are a couple of points where we could see an error in this case: a validation error in the CMS when saving, or a 404 error message when clicking the button. The sketch below covers both possibilities.
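
This sketch assumes the CMS exposes an authenticated update endpoint and that the public page hosting the button is reachable over HTTP; every URL, header, and field name is a placeholder.

```python
# Hypothetical sketch: save a malformed URL in the CMS, then check the fallout.
# The endpoints, auth header, field names, and paths are all placeholders.
import pytest
import requests

CMS_BUTTON_API = "https://cms.example.test/api/buttons/42"  # placeholder
PUBLIC_PAGE = "https://www.example.test/landing"            # placeholder
BAD_URL = "htps:/example.test/ofers"                        # deliberate typos

@pytest.fixture
def cms_session():
    # Placeholder authenticated CMS session; the token scheme is an assumption.
    s = requests.Session()
    s.headers["Authorization"] = "Bearer <test-token>"
    return s

def test_bad_button_url_is_rejected_or_handled(cms_session):
    # 1. Does the CMS save the update containing a bad URL?
    save = cms_session.put(CMS_BUTTON_API, json={"target_url": BAD_URL})

    if save.status_code in (400, 422):
        # Best outcome: the CMS validates the URL and refuses to save it.
        assert "url" in save.text.lower()
        return

    # 2./3. If the bad URL was saved, the page hosting the button must still
    # render, and a browser-level check of the actual click should then show a
    # handled 404 rather than an unhandled server error or a crash.
    assert requests.get(PUBLIC_PAGE).status_code == 200
```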

4. Attempt to submit a comment without logging in

To validate the functionality of a comment field, we’ll test a lot of positive cases after logging in. One negative test is attempting to submit a comment before logging in. If a user enters a comment and then clicks submit before authenticating, they should receive an error message informing them of the situation. While testing this, we must also verify the comment has not been lost by confirming the following:

  1. A correct error message displays regarding authentication
  2. Comment text remains
  3. Any other unexpected error is not displayed
  4. The app does not crash
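
A browser-level sketch of this scenario with Selenium; the article URL, selectors, and message wording are placeholders, and the “comment text remains” check assumes the draft stays in the field rather than being restored some other way.

```python
# Hypothetical sketch: submit a comment while logged out (Selenium).
# The URL, selectors, and message text are placeholders.
from selenium import webdriver
from selenium.webdriver.common.by import By

ARTICLE_URL = "https://www.example.test/article/123"  # placeholder
COMMENT = "Great article, thanks!"

def test_comment_requires_login_and_text_is_kept():
    driver = webdriver.Chrome()
    try:
        driver.get(ARTICLE_URL)                      # not logged in
        driver.find_element(By.ID, "comment").send_keys(COMMENT)
        driver.find_element(By.ID, "submit-comment").click()

        # 1. A correct error message displays regarding authentication.
        error = driver.find_element(By.CSS_SELECTOR, ".auth-error")
        assert "log in" in error.text.lower()

        # 2. The comment text remains in the field.
        assert driver.find_element(By.ID, "comment").get_attribute("value") == COMMENT

        # 3./4. No other unexpected error, and the page did not crash.
        assert not driver.find_elements(By.CSS_SELECTOR, ".unexpected-error")
    finally:
        driver.quit()
```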

5. Attempt to submit after expired authentication

This scenario is a variation of the previous test case. In some instances, authentication automatically expires after a set period for security purposes. In this case, we would begin the form completion or review/comment entry and wait until after authentication expires to complete the submission. We would verify the following after attempting to submit:

  1. Correct error displays regarding authentication
  2. The text entered is not lost
  3. Some other unexpected error is not displayed
  4. After logging in, the user can successfully submit the data previously entered
  5. The app doesn’t crash
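
An API-level sketch of the expiry case. It assumes expiry can be simulated with an invalid or expired token and that a login endpoint issues fresh tokens; every endpoint, field, and credential is a placeholder.

```python
# Hypothetical sketch: submit after the session has expired, then recover.
# Endpoints, payloads, and the expired-token mechanism are assumptions.
import requests

COMMENTS_API = "https://www.example.test/api/comments"  # placeholder
LOGIN_API = "https://www.example.test/api/login"        # placeholder
COMMENT = {"article_id": 123, "text": "Posted after expiry"}

def test_expired_session_then_successful_resubmit():
    expired = requests.Session()
    expired.headers["Authorization"] = "Bearer <expired-token>"  # simulated expiry

    # 1./3./5. Correct authentication error, no 500-style crash.
    resp = expired.post(COMMENTS_API, json=COMMENT)
    assert resp.status_code == 401
    assert "expired" in resp.text.lower() or "log in" in resp.text.lower()

    # 2. The text entered is not lost -- in a browser test we would assert the
    # form still holds the draft; at the API level the payload is simply reused.

    # 4. After logging in again, the same data can be submitted successfully.
    fresh = requests.Session()
    token = fresh.post(LOGIN_API, json={"user": "qa", "password": "..."}).json()["token"]
    fresh.headers["Authorization"] = f"Bearer {token}"
    assert fresh.post(COMMENTS_API, json=COMMENT).status_code in (200, 201)
```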

6. Attempt to submit after permanent expiration

In some cases, there is a deadline for submission. Perhaps an entry into a competition requires submission by midnight on a particular day. Another example is a sporting event where the user must pick a winner before a race or match begins. In cases like these, we may alter the deadline in the test environment to validate this scenario. Verification of this negative case would include:

  1. The correct error message is displayed to the user to inform them of the situation
  2. The submission is not processed
  3. Repeated submission attempts continue to produce the same error
  4. The app doesn’t crash, and no other unexpected errors are present.
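
If the deadline logic can be exercised in-process, one way to simulate “after the deadline” is to freeze the clock with the freezegun library. The submit_entry() handler below is a stand-in for the real application logic; for a fully deployed system you would instead move the deadline in the test environment’s configuration, as described above.

```python
# Hypothetical sketch: verify submissions are refused after a hard deadline.
# submit_entry() stands in for the real handler; freezegun lets the test
# pretend "now" is after the cut-off without waiting for it.
from datetime import datetime, timezone
from freezegun import freeze_time

DEADLINE = datetime(2024, 6, 1, 0, 0, tzinfo=timezone.utc)  # placeholder deadline

def submit_entry() -> dict:
    # Stand-in for the real submission handler, which reads the system clock.
    if datetime.now(timezone.utc) >= DEADLINE:
        return {"ok": False, "error": "Submissions closed"}
    return {"ok": True}

@freeze_time("2024-06-02 12:00:00")
def test_submission_rejected_after_deadline():
    # 1./2. The correct error is returned and the submission is not processed.
    first = submit_entry()
    assert first == {"ok": False, "error": "Submissions closed"}

    # 3. Repeated attempts keep producing the same error.
    assert submit_entry() == first
```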

7. Verify the presence of a 404 message after page removal

After intentionally removing a web page, a defined behavior should occur when the user attempts to access it. In some cases, a redirect handles the missing page. Alternatively, the expected behavior may be that a 404 error message displays below the website’s navigation bar. We want to let the user know the page has been purposely removed, and we also want them to be able to use the navbar to reach other pages on the site. The user may have tried to access this missing page from an old link posted on social media or from a bookmark. In either case, we must validate the expected behavior:

  1. No unexpected errors are displayed
  2. The expected 404 error message with the standard page layout must be present, including the site menu
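
A short HTTP-level sketch of the removed-page check; the URL and the HTML markers it looks for are placeholders.

```python
# Hypothetical sketch: a removed page returns a friendly 404, not a crash.
# The URL and the HTML markers below are placeholders.
import requests

REMOVED_PAGE = "https://www.example.test/old-campaign"  # placeholder

def test_removed_page_shows_custom_404_with_navigation():
    resp = requests.get(REMOVED_PAGE)

    # The expected 404 status and message, within the standard page layout.
    assert resp.status_code == 404
    assert "page not found" in resp.text.lower()
    assert "<nav" in resp.text              # site menu is still rendered

    # No unexpected errors such as a bare server stack trace.
    assert "traceback" not in resp.text.lower()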

8. Attempt to access a page without permissions

A standard 403 (forbidden) error should display when an attempt is made to access a restricted page. This could be because a user no longer has access privileges or because of a malicious attempt to gain access. As well as the standard 403 error message, the site’s menu should display to offer users a way to access other content. For this case, we will validate:

  1. No unexpected errors are displayed
  2. The expected 403 error message with the standard page layout must be present, including the site menu
  3. Most importantly, the user does not gain access to the page.
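
The same style of check works for the 403 case; the restricted URL, the low-privilege session, and the sensitive-content marker are all assumptions.

```python
# Hypothetical sketch: a restricted page returns 403 and leaks nothing.
# The URL, token, and content marker are placeholders.
import requests

ADMIN_PAGE = "https://www.example.test/admin/reports"  # placeholder

def test_restricted_page_returns_403_without_content():
    # A session authenticated as a user who lacks access privileges.
    low_priv = requests.Session()
    low_priv.headers["Authorization"] = "Bearer <regular-user-token>"

    resp = low_priv.get(ADMIN_PAGE)

    # Standard 403 message within the normal layout, including the site menu.
    assert resp.status_code == 403
    assert "<nav" in resp.text

    # Most importantly, no restricted content is returned.
    assert "quarterly revenue" not in resp.text.lower()  # placeholder marker
```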

9. Manually refresh while waiting for a confirmation message

In a scenario where a user is submitting a form, making a purchase, or conducting a bank transfer, we can test some unusual behavior like quickly refreshing after the user submits the request. We could test refreshing quickly once or multiple times to see what happens. In this case, we could validate:

  1. Multiple submissions do not result in repeat purchases, transfers, comments, etc.
  2. No unexpected errors are displayed
  3. The confirmation message shows as expected
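
A duplicate-submission sketch at the HTTP level. It assumes the purchase endpoint accepts a client-generated idempotency key (or otherwise deduplicates rapid repeats) and that orders can be listed afterwards; the endpoints, payload, and header convention are hypothetical.

```python
# Hypothetical sketch: rapid duplicate submissions must not create duplicates.
# The endpoints, payload, and idempotency-key convention are assumptions.
import uuid
import requests

ORDERS_API = "https://shop.example.test/api/orders"  # placeholder
PAYLOAD = {"sku": "ABC-123", "quantity": 1}

def test_rapid_resubmission_creates_only_one_order():
    key = str(uuid.uuid4())
    headers = {"Idempotency-Key": key}

    # Simulate the user refreshing right after submitting: send the same
    # request several times in quick succession.
    responses = [requests.post(ORDERS_API, json=PAYLOAD, headers=headers)
                 for _ in range(3)]

    # No unexpected errors; each attempt is answered cleanly.
    assert all(r.status_code in (200, 201, 409) for r in responses)

    # Only one order exists for this idempotency key.
    orders = requests.get(ORDERS_API, params={"idempotency_key": key}).json()
    assert len(orders) == 1
```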

10. Press back or escape quickly after a submission

Messing around with the timing of user interactions is another excellent way to perform negative testing. Quickly hitting back (or escape) on a mobile device after moving forward in the workflow will sometimes cause issues. You can also try hitting back multiple times quickly to see what happens. This is a good test for simply navigating an app or submitting a form for comment or review. After quickly clicking or pressing back, we will verify:

  1. The correct page displays eventually
  2. The initial request is successful (if submitting a form)
  3. No unexpected errors occur
  4. The app (especially on mobile) does not crash
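
A browser-level sketch of the quick back-press, using Selenium; the URL, selectors, and success markers are placeholders, and the single-submission check is left as a comment because it depends on the application’s API or database.

```python
# Hypothetical sketch: press back immediately after submitting a review.
# The URL, selectors, and confirmation markers are placeholders.
from selenium import webdriver
from selenium.webdriver.common.by import By
from selenium.webdriver.support.ui import WebDriverWait
from selenium.webdriver.support import expected_conditions as EC

REVIEW_URL = "https://www.example.test/product/42/review"  # placeholder

def test_back_immediately_after_submit():
    driver = webdriver.Chrome()
    try:
        driver.get(REVIEW_URL)
        driver.find_element(By.ID, "review-text").send_keys("Five stars.")
        driver.find_element(By.ID, "submit-review").click()
        driver.back()   # interrupt the flow right after submitting

        # 1./4. A valid page displays eventually and nothing has crashed.
        WebDriverWait(driver, 10).until(
            EC.presence_of_element_located((By.TAG_NAME, "body")))
        assert not driver.find_elements(By.CSS_SELECTOR, ".unexpected-error")

        # 2./3. The initial request still went through exactly once -- in a
        # real suite this would be confirmed via the UI or an API/database check.
    finally:
        driver.quit()
```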

Conclusion

There is no compelling reason to separate positive and negative test cases when formulating a test plan. The negative cases are just as important as the positive ones, and perhaps more so, because developers will generally only exercise the positive “happy-path” scenarios before handing the code over to QA. Finding bugs in the negative testing scenarios is where QA can really shine and show big-time value.