What are Data Structures?

Anyone pursuing an information technology degree should understand the importance of data structures. From application programming to Web development, projects that require knowledge of arrays, lists, trees and other structures are common in information technology. When it comes to designing software-based solutions to problems faced by businesses, data structures are one of the key ingredients of a properly functioning code base. Choosing the correct data structure for a program requires knowledge of:

  • algorithms
  • time complexity
  • space complexity
  • data structure analysis

Time Complexity

The time complexity of a program is an upper bound on its running time, measured by how many steps the program must take, as a function of the input size, before it returns a solution. In plain language, it describes how the slowest part of the program scales as the input grows. One common data structure used in computer programming is the binary search tree, which efficiently stores and retrieves data from a sorted data set. The fastest a program can be is constant time: it returns a solution in a fixed number of steps no matter how large the input, denoted in Big O notation as O(1). In contrast, retrieving data from a balanced binary search tree has a time complexity of O(log n): each comparison discards half of the remaining tree, so the path from the root to the node containing the search key is at most about log(n) steps long.
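
To see where the log(n) comes from, here is a minimal sketch of a binary search tree lookup in Python. The Node class and bst_search function are illustrative names chosen for this example, not part of any particular library.

    class Node:
        """One node of a binary search tree."""
        def __init__(self, key, value):
            self.key = key
            self.value = value
            self.left = None   # subtree of smaller keys
            self.right = None  # subtree of larger keys

    def bst_search(root, key):
        """Walk from the root toward a leaf; on a balanced tree of n
        keys this takes O(log n) comparisons."""
        node = root
        while node is not None:
            if key == node.key:
                return node.value   # found the search key
            elif key < node.key:
                node = node.left    # discard the right half of the tree
            else:
                node = node.right   # discard the left half of the tree
        return None                 # key is not in the tree

Each comparison moves the search one level down and throws away the other half of the remaining tree, which is why a lookup over n keys finishes in roughly log(n) steps when the tree is balanced.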

Space Complexity

Space complexity refers to the amount of memory required to implement a data structure in a program. As with the Big O notation of time complexity, space complexity is defined in terms of the size of the program input, represented by the integer n. As n grows larger, the amount of memory required to implement the data structure grows according to a function f(n), where f is the space complexity written in Big O notation. A program with polynomial space complexity, for example, requires memory proportional to the input size n raised to some fixed power. The function can be visualized as a curve on a graph where the x axis denotes the size of the program input and the y axis denotes the memory requirements of the program.
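
As a small illustration in Python (the function names here are our own, chosen for this example), compare two routines that each read an input of size n. The first uses O(1) extra space, a single accumulator no matter how long the input; the second uses O(n) extra space, because its output list grows with the input.

    def total(values):
        """O(1) extra space: one accumulator, however long the input."""
        acc = 0
        for v in values:
            acc += v
        return acc

    def prefix_sums(values):
        """O(n) extra space: one stored value per input element."""
        sums = []
        acc = 0
        for v in values:
            acc += v
            sums.append(acc)  # the output list grows with n
        return sums

Plotted against n, the extra memory used by total stays flat, while the memory used by prefix_sums grows as a straight line, exactly the kind of curve described above.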

Algorithms

The proper choice of a data structure goes hand in hand with the choice of algorithm for a particular program. Efficient comparison-based sorting algorithms work over lists or arrays and sort datasets in O(n log n) time. Merge sort and quicksort are two of the most common sorting algorithms used for general-purpose applications. Quicksort is typically used on array structures, where it can partition elements in place, while merge sort is a natural fit for linked lists, since merging requires no random access. According to W3Schools, merge sort is based on the concept of “divide and conquer,” developed by John von Neumann in 1945. It works by repeatedly halving an array until the pieces can no longer be divided, which takes log(n) levels of splitting; the pieces are then merged back together in sorted order, which takes n steps per level. Altogether, the algorithm requires on the order of n log(n) steps to complete, as the sketch below illustrates.
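
Here is a minimal merge sort in Python, a sketch of the outline above rather than a production implementation:

    def merge_sort(arr):
        """Recursively halve the array, then merge the sorted halves;
        O(n log n) overall."""
        if len(arr) <= 1:
            return arr                # a 0- or 1-element array is sorted
        mid = len(arr) // 2
        left = merge_sort(arr[:mid])  # divide: sort each half
        right = merge_sort(arr[mid:])
        return merge(left, right)     # conquer: combine in sorted order

    def merge(left, right):
        """Merge two sorted lists in linear time."""
        out = []
        i = j = 0
        while i < len(left) and j < len(right):
            if left[i] <= right[j]:
                out.append(left[i])
                i += 1
            else:
                out.append(right[j])
                j += 1
        out.extend(left[i:])          # append whatever remains
        out.extend(right[j:])
        return out

For example, merge_sort([5, 2, 8, 1]) splits the list down to single elements and merges back up to [1, 2, 5, 8].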

A thorough understanding of data structures is an important part of studying to become an IT professional. For anyone earning an information technology degree, learning about data structures and algorithms should be a top priority.

Brenda Rufener
Author

Julie McCaulley
Expert

Carrie Sealey-Morris
Editor-in-Chief