Grokking Algorithms. Bhargava A.
Manning Publications Co., 2016. — 256 p.
This book is designed to be easy
to follow. I avoid big leaps of thought. Any time a new concept is introduced, I
explain it right away or tell you when I’ll explain it. Core concepts are
reinforced with exercises and multiple explanations so that you can check your
assumptions and make sure you’re following along. I lead with examples. Instead
of writing symbol soup, my goal is to make it easy for you to visualize these
concepts. I also think we learn best by being able to recall something we
already know, and examples make recall easier. So when you’re trying to remember
the difference between arrays and linked lists (explained in chapter 2), you can
just think about getting seated for a movie. Also, at the risk of stating the
obvious, I’m a visual learner. This book is chock-full of images. The contents of
the book are carefully curated. There’s no need to write a book that covers every
sorting algorithm—that’s why we have Wikipedia and Khan Academy. All the
algorithms I’ve included are practical. I’ve found them useful in my job as a
software engineer, and they provide a good foundation for more complex topics.
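To illustrate the array-versus-linked-list point from chapter 2: an array lets you jump straight to any position, while a linked list has to be walked node by node from the head. Below is a minimal Python sketch of that contrast (illustrative only, not code from the book; the Node class and linked_list_get helper are made up for the example).

# Illustrative sketch (not from the book): arrays allow O(1) access by index,
# linked lists require walking node by node from the head.

class Node:
    def __init__(self, value, next=None):
        self.value = value
        self.next = next

def linked_list_get(head, index):
    # Walk the list one node at a time until the index-th node is reached.
    node = head
    for _ in range(index):
        node = node.next
    return node.value

array = [10, 20, 30, 40]
head = Node(10, Node(20, Node(30, Node(40))))

print(array[2])                  # 30: direct jump by index, O(1)
print(linked_list_get(head, 2))  # 30: walk past two nodes first, O(n)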
Format: pdf
Size: 6.5 Mb
View, download: yandex.disk
Contents
preface xiii
acknowledgments xiv
about this book xv
1 Introduction to algorithms 1
Introduction 1
What you'll learn about performance 2
What you'll learn about solving problems 2
Binary search 3
A better way to search 5
Running time 10
Big O notation 10
Algorithm running times grow at different rates 11
Visualizing different Big O run times 13
Big O establishes a worst-case run time 15
Some common Big O run times 15
The traveling salesperson 17
Recap 19
2 Selection sort 21
How memory works 22
Arrays and linked lists 24
Linked lists 25
Arrays 26
Terminology 27
Inserting into the middle of a list 29
Deletions 30
Selection sort 32
Recap 36
3 Recursion 37
Recursion 38
Base case and recursive case 40
The stack 42
The call stack 43
The call stack with recursion 45
Recap 50
4 Quicksort 51
Divide & conquer 52
Quicksort 60
Big O notation revisited 66
Merge sort vs. quicksort 67
Average case vs. worst case 68
Recap 72
5 Hash tables 73
Hash functions 76
Use cases 79
Using hash tables for lookups 79
Preventing duplicate entries 81
Using hash tables as a cache 83
Recap 86
Collisions 86
Performance 88
Load factor 90
A good hash function 92
Recap 93
6 Breadth-first search 95
Introduction to graphs 96
What is a graph? 98
Breadth-first search 99
Finding the shortest path 102
Queues 103
Implementing the graph 105
Implementing the algorithm 107
Running time 111
Recap 114
7 Dijkstra's algorithm 115
Working with Dijkstra's algorithm 116
Terminology 120
Trading for a piano 122
Negative-weight edges 128
Implementation 131
Recap 140
8 Greedy algorithms 141
The classroom scheduling problem 142
The knapsack problem 144
The set-covering problem 146
Approximation algorithms 147
NP-complete problems 152
Traveling salesperson, step by step 153
How do you tell if a problem is NP-complete? 158
Recap 160
9 Dynamic programming 161
The knapsack problem 161
The simple solution 162
Dynamic programming 163
Knapsack problem FAQ 171
What happens if you add an item? 171
What happens if you change the order of the rows? 174
Can you fill in the grid column-wise instead of row-wise? 174
What happens if you add a smaller item? 174
Can you steal fractions of an item? 175
Optimizing your travel itinerary 175
Handling items that depend on each other 177
Is it possible that the solution will require more than two sub-knapsacks? 177
Is it possible that the best solution doesn't fill the knapsack completely? 178
Longest common substring 178
Making the grid 179
Filling in the grid 180
The solution 182
Longest common subsequence 183
Longest common subsequence—solution 184
Recap 186
10 K-nearest neighbors 187
Classifying oranges vs. grapefruit 187
Building a recommendations system 189
Feature extraction 191
Regression 195
Picking good features 198
Introduction to machine learning 199
OCR 199
Building a spam filter 200
Predicting the stock market 201
Recap 201
11 Where to go next 203
Trees 203
Inverted indexes 206
The Fourier transform 207
Parallel algorithms 208
MapReduce 209
Why are distributed algorithms useful? 209
The map function 209
The reduce function 210
Bloom filters and HyperLogLog 211
Bloom filters 212
HyperLogLog 213
The SHA algorithms 213
Comparing files 214
Checking passwords 215
Locality-sensitive hashing 216
Diffie-Hellman key exchange 217
Linear programming 218
Epilogue 219
answers to exercises 221
index 235