{"id":251754,"date":"2017-02-28T05:59:10","date_gmt":"2017-02-28T12:59:10","guid":{"rendered":"http:\/\/css-tricks.com\/?p=251754"},"modified":"2017-02-28T06:17:46","modified_gmt":"2017-02-28T13:17:46","slug":"computer-science-distilled-chapter-2-complexity","status":"publish","type":"post","link":"https:\/\/css-tricks.com\/computer-science-distilled-chapter-2-complexity\/","title":{"rendered":"Computer Science Distilled, Chapter 2: Complexity"},"content":{"rendered":"
This is a full chapter excerpt from Wladston Viana Ferreira Filho’s brand new book *Computer Science Distilled*, which he has graciously allowed us to publish here.

> In almost every computation, a variety of arrangements for the processes is possible. It is essential to choose that arrangement which shall tend to minimize the time necessary for the calculation.
>
> —Ada Lovelace

How much time does it take to sort 26 shuffled cards? If instead you had 52 cards, would it take twice as long? How much longer would it take for a thousand decks of cards? The answer is intrinsic to the **method** used to sort the cards.

A method is a list of unambiguous instructions for achieving a goal. A method that always requires a finite series of operations is called an **algorithm**. For instance, a card-sorting algorithm is a method that will always specify some operations to sort a deck of 26 cards per suit and per rank.
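
The chapter doesn’t spell out a particular card-sorting method here, but as a rough sketch of what such an algorithm looks like, here is one possibility in Python: an insertion sort that orders a two-suit, 26-card deck by suit and then by rank. The suit and rank orderings and the `card_key` helper are assumptions made for this illustration, not something taken from the book.

```python
# A sketch of one possible card-sorting algorithm: insertion sort over (suit, rank).
# The suit/rank orderings below are assumptions made for illustration.
SUITS = {"hearts": 0, "spades": 1}  # a 26-card deck: two suits of 13 ranks
RANKS = {r: i for i, r in enumerate(
    ["A", "2", "3", "4", "5", "6", "7", "8", "9", "10", "J", "Q", "K"])}

def card_key(card):
    """Map a (suit, rank) pair to a tuple that sorts by suit, then rank."""
    suit, rank = card
    return (SUITS[suit], RANKS[rank])

def insertion_sort(cards):
    """Sort cards in place; every step is a small, unambiguous operation."""
    for i in range(1, len(cards)):
        current = cards[i]
        j = i - 1
        # Shift greater cards one slot right until `current` fits.
        while j >= 0 and card_key(cards[j]) > card_key(current):
            cards[j + 1] = cards[j]
            j -= 1
        cards[j + 1] = current
    return cards

# Example: insertion_sort([("spades", "K"), ("hearts", "2"), ("hearts", "A")])
# returns [("hearts", "A"), ("hearts", "2"), ("spades", "K")].
```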

Fewer operations need less computing power. We like fast solutions, so we monitor the number of operations in our algorithms. Many algorithms require a fast-growing number of operations when the input grows in size. For example, our card-sorting algorithm could take a few operations to sort 26 cards, but four times more operations to sort 52 cards!
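
To make that growth concrete, here is a small sketch (an illustration, not a measurement from the book): a variant of the insertion sort above, instrumented to count comparisons on reverse-sorted inputs, its worst case. Doubling the input from 26 to 52 items roughly quadruples the number of comparisons.

```python
# Count the comparisons insertion sort makes on worst-case (reverse-sorted) input,
# to see how the operation count grows with input size. Illustrative sketch only.
def insertion_sort_count(values):
    """Return the number of comparisons insertion sort makes while sorting `values`."""
    values = list(values)
    comparisons = 0
    for i in range(1, len(values)):
        current = values[i]
        j = i - 1
        while j >= 0:
            comparisons += 1
            if values[j] <= current:
                break
            values[j + 1] = values[j]
            j -= 1
        values[j + 1] = current
    return comparisons

for n in (26, 52):
    worst_case = list(range(n, 0, -1))  # a deck of size n in reverse order
    print(n, insertion_sort_count(worst_case))
# Prints 325 comparisons for 26 items and 1326 for 52:
# doubling the input roughly quadruples the work.
```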

To avoid bad surprises when our problem size grows, we find the algorithm’s **time complexity**. In this chapter, you’ll learn to: