From eaed5dd016ebc0d2184207760e7f744e8a7f6320 Mon Sep 17 00:00:00 2001
From: Balki Kodarapu
Date: Sat, 18 Mar 2017 08:37:16 -0700
Subject: [PATCH] Delete duplicate does in 3 design examples (#24)

Delete duplicate word "does" in design exercises
---
 solutions/system_design/pastebin/README.md     | 2 +-
 solutions/system_design/social_graph/README.md | 2 +-
 solutions/system_design/web_crawler/README.md  | 2 +-
 3 files changed, 3 insertions(+), 3 deletions(-)

diff --git a/solutions/system_design/pastebin/README.md b/solutions/system_design/pastebin/README.md
index 7f95ab94..3cc242ce 100644
--- a/solutions/system_design/pastebin/README.md
+++ b/solutions/system_design/pastebin/README.md
@@ -95,7 +95,7 @@ An alternative to a relational database acting as a large hash table, we could u
 
 * The **Client** sends a create paste request to the **Web Server**, running as a [reverse proxy](https://github.com/donnemartin/system-design-primer#reverse-proxy-web-server)
 * The **Web Server** forwards the request to the **Write API** server
-* The **Write API** server does does the following:
+* The **Write API** server does the following:
     * Generates a unique url
         * Checks if the url is unique by looking at the **SQL Database** for a duplicate
         * If the url is not unique, it generates another url
diff --git a/solutions/system_design/social_graph/README.md b/solutions/system_design/social_graph/README.md
index fac4b3fd..947956f0 100644
--- a/solutions/system_design/social_graph/README.md
+++ b/solutions/system_design/social_graph/README.md
@@ -104,7 +104,7 @@ We won't be able to fit all users on the same machine, we'll need to [shard](htt
 * The **Client** sends a request to the **Web Server**, running as a [reverse proxy](https://github.com/donnemartin/system-design-primer#reverse-proxy-web-server)
 * The **Web Server** forwards the request to the **Search API** server
 * The **Search API** server forwards the request to the **User Graph Service**
-* The **User Graph Service** does does the following:
+* The **User Graph Service** does the following:
     * Uses the **Lookup Service** to find the **Person Server** where the current user's info is stored
     * Finds the appropriate **Person Server** to retrieve the current user's list of `friend_ids`
     * Runs a BFS search using the current user as the `source` and the current user's `friend_ids` as the ids for each `adjacent_node`
diff --git a/solutions/system_design/web_crawler/README.md b/solutions/system_design/web_crawler/README.md
index b84f07ee..7876b943 100644
--- a/solutions/system_design/web_crawler/README.md
+++ b/solutions/system_design/web_crawler/README.md
@@ -213,7 +213,7 @@ We might also choose to support a `Robots.txt` file that gives webmasters contro
 
 * The **Client** sends a request to the **Web Server**, running as a [reverse proxy](https://github.com/donnemartin/system-design-primer#reverse-proxy-web-server)
 * The **Web Server** forwards the request to the **Query API** server
-* The **Query API** server does does the following:
+* The **Query API** server does the following:
     * Parses the query
         * Removes markup
         * Breaks up the text into terms