Anyone who is pursuing an information technology degree should understand the importance of data structures. From application programming to Web development, projects that require knowledge of arrays, lists, trees and other structures are very common in the world of information technology. When it comes to designing software-based solutions to problems faced by businesses, data structures are one of the key ingredients of a properly functioning code base. Choosing the correct data structure for a program requires knowledge of algorithms, time complexity, space complexity, and data structure analysis.

### Time Complexity

The time complexity of a program is an upper bound on its running time, measured by how the number of steps grows as a function of the input size *n*. In plain language, it describes how much slower a program gets as its input gets larger. One common data structure used in computer programming is the binary search tree, which is used to efficiently store and retrieve data from a sorted data set. The fastest possible program in computer science is one that returns a solution in a constant number of steps regardless of input size; it has a time complexity denoted by the Big O notation *O(1)*. In contrast, retrieving data from a balanced binary search tree has a time complexity of *O(log n)*, because a lookup follows a single path from the tree root down to the node containing the search key, and in a balanced tree that path contains at most about *log(n)* nodes.
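The lookup described above can be sketched in a few lines of Python. This is a minimal illustration, not a production implementation: the `Node`, `insert`, and `search` names are chosen for this example, and the *O(log n)* bound assumes the tree stays balanced (a plain BST degrades to *O(n)* if keys arrive in sorted order).

```python
class Node:
    """A single node of a binary search tree."""
    def __init__(self, key):
        self.key = key
        self.left = None
        self.right = None

def insert(root, key):
    """Insert a key while preserving the BST ordering invariant:
    smaller keys go left, larger keys go right."""
    if root is None:
        return Node(key)
    if key < root.key:
        root.left = insert(root.left, key)
    elif key > root.key:
        root.right = insert(root.right, key)
    return root

def search(root, key):
    """Follow one root-to-leaf path, discarding half the remaining
    tree at each step -- O(log n) comparisons when balanced."""
    while root is not None and root.key != key:
        root = root.left if key < root.key else root.right
    return root
```

For example, inserting the keys 8, 3, 10, 1, 6 and then searching for 6 visits only the path 8 → 3 → 6 rather than every node.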

### Space Complexity

Space complexity refers to the amount of memory required to implement a data structure in a program. As with the Big O notation of time complexity, space complexity is defined in terms of the program input size, represented by the integer *n*. As *n* grows larger, the amount of memory required to implement the data structure increases according to the function *f(n)*, where *f* is the space complexity written in Big O notation. A program with polynomial space complexity, such as *O(n²)*, requires memory proportional to the input size *n* raised to a fixed power. The function can be visualized as a curve on a graph where the *x* axis denotes the input size and the *y* axis denotes the memory requirements of the program.
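One way to make the difference concrete is to compare two standard representations of the same graph. The function names below are illustrative: an adjacency matrix always allocates *O(n²)* cells, while an adjacency list allocates *O(n + e)* entries for *n* vertices and *e* edges, so the matrix's memory curve rises much more steeply as *n* grows.

```python
def adjacency_matrix(n, edges):
    """O(n^2) space: one boolean cell for every pair of vertices,
    whether or not an edge exists between them."""
    matrix = [[False] * n for _ in range(n)]
    for u, v in edges:
        matrix[u][v] = matrix[v][u] = True
    return matrix

def adjacency_list(n, edges):
    """O(n + e) space: one bucket per vertex plus one entry
    per edge endpoint."""
    adj = [[] for _ in range(n)]
    for u, v in edges:
        adj[u].append(v)
        adj[v].append(u)
    return adj
```

For a sparse graph with 1,000 vertices and 1,000 edges, the matrix holds a million cells while the list holds only a few thousand entries.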

### Algorithms

The proper choice of a data structure requires the appropriate choice of algorithm for a particular program. Efficient comparison-based sorting algorithms sort data sets in *O(n log n)* time, and merge sort and quicksort are two of the most common choices for general purpose applications. Quicksort is typically used when data is stored in arrays, which provide the fast random access its partitioning step relies on, while merge sort adapts well to linked lists because merging only requires sequential access. According to W3Schools, merge sort is based on the concept of "divide and conquer," developed by John von Neumann in 1945. It works by repeatedly halving an array until each piece can no longer be divided, which produces *log(n)* levels of division. The pieces are then merged back together in sorted order, which requires *n* steps of merging work at each level. Altogether, the algorithm requires roughly *n log(n)* steps to complete.
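The divide-and-merge steps described above can be sketched as follows. This is a textbook array-based version for clarity (a linked-list variant would merge nodes instead of building new lists); the function names are chosen for this example.

```python
def merge_sort(items):
    """Recursively halve the list (log n levels of division),
    then merge the sorted halves (n work per level)."""
    if len(items) <= 1:
        return items
    mid = len(items) // 2
    left = merge_sort(items[:mid])
    right = merge_sort(items[mid:])
    return merge(left, right)

def merge(left, right):
    """Combine two already-sorted lists into one sorted list
    in linear time, always taking the smaller front element."""
    merged = []
    i = j = 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    merged.extend(left[i:])
    merged.extend(right[j:])
    return merged
```

Because the element comparison in `merge` uses `<=`, equal elements keep their original order, making this sort stable.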

A thorough understanding of data structures is an important part of studying to become an IT professional. When it comes to earning an information technology degree, learning about data structures and algorithms should be a top priority.