HTTP Request

Created By vladimir

The HTTP request is the backbone of web development: it is how users interact with an application running on a web server. Read this article to understand how HTTP requests work.

What is an HTTP Request

When you open a website, the web browser sends an HTTP request to the web server hosting the application. The request is formatted according to the HTTP protocol. Upon receiving the request, the web application replies with an HTTP response, which usually contains the web page you asked for.

What does an HTTP request look like

Let's examine an HTTP request to a website:

GET / HTTP/1.1

The first line contains the request verb, the request URI and the protocol version.
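The three parts of the request line are separated by single spaces, so they can be pulled apart with a plain string split. The helper below is illustrative only (its name is not a standard API):

```python
def parse_request_line(line):
    # A request line has exactly three space-separated parts:
    # the verb, the request URI, and the protocol version.
    method, uri, version = line.split(" ", 2)
    return method, uri, version

print(parse_request_line("GET / HTTP/1.1"))
```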

The request verb can be one of the following:

  1. GET: The GET method retrieves data from the server, as its name implies. It is the most widely used method and has no request body. A GET request fires every time you open a website to retrieve its contents; it corresponds to the read operation.
  2. HEAD: The HEAD method is similar to GET, but it retrieves only the response headers rather than the response body. It is useful when you need to check something like a document's size (via the Content-Length header) without downloading it.
  3. POST: The POST method sends information to the server, carried in the request body, to add or update data. A common real-world use is submitting form data on a website.
  4. PUT: The PUT method also sends data, but it is used when you need to completely replace an existing resource. PUT is idempotent: calling it repeatedly has the same effect on the server as calling it once.
  5. PATCH: This method is similar to POST and PUT, but it is used when only a portion of the resource needs to change. Unlike POST and PUT, the request body of a PATCH carries only the fields that need to be updated.
  6. DELETE: The DELETE method, as its name implies, deletes the resource identified by the given URL. Like GET, it typically has no request body.
  7. OPTIONS: This method is less widely used than the others. It returns information about the methods and operations that the server supports at the given URL, including an Allow header listing the HTTP methods permitted for the resource.
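The GET/HEAD distinction above can be observed end to end with Python's standard library. The sketch below starts a toy local server (so no network access is needed), then issues a GET and a HEAD against it: both receive the same headers, but only GET receives the body.

```python
import threading
from http.client import HTTPConnection
from http.server import BaseHTTPRequestHandler, HTTPServer

# A toy local server so the GET/HEAD difference can be observed offline.
class DemoHandler(BaseHTTPRequestHandler):
    protocol_version = "HTTP/1.1"  # keep-alive: one connection serves both requests

    def _send_headers(self):
        self.send_response(200)
        self.send_header("Content-Type", "text/plain")
        self.send_header("Content-Length", "5")
        self.end_headers()

    def do_GET(self):              # GET: headers plus the body
        self._send_headers()
        self.wfile.write(b"hello")

    def do_HEAD(self):             # HEAD: the same headers, no body
        self._send_headers()

    def log_message(self, *args):  # keep the demo output quiet
        pass

server = HTTPServer(("127.0.0.1", 0), DemoHandler)  # port 0 = any free port
threading.Thread(target=server.serve_forever, daemon=True).start()

conn = HTTPConnection("127.0.0.1", server.server_port)

conn.request("GET", "/")
get_resp = conn.getresponse()
get_body = get_resp.read()       # the full body arrives with GET

conn.request("HEAD", "/")
head_resp = conn.getresponse()
head_body = head_resp.read()     # empty: HEAD returns headers only

print(get_body, head_resp.getheader("Content-Length"), head_body)
server.shutdown()
```

Note that the HEAD response still advertises Content-Length: 5, which is exactly what makes HEAD useful for checking a document's size without downloading it.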

The Request URI

The Request URI, or Uniform Resource Identifier, identifies the resource to which the request refers. A request URI generally follows the form scheme://host[:port]/path[?query][#fragment], for example https://example.com/articles/http?lang=en.
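Python's standard library can split a URI into these components, which is a quick way to see the structure in practice (the URL below is a made-up example):

```python
from urllib.parse import urlparse

# Break a request URI into its standard components (RFC 3986 style).
parts = urlparse("https://example.com:8080/articles/http?lang=en#intro")

print(parts.scheme)    # "https"
print(parts.netloc)    # "example.com:8080"  (host and port)
print(parts.path)      # "/articles/http"
print(parts.query)     # "lang=en"
print(parts.fragment)  # "intro"
```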

HTTP Version

The HTTP protocol version tells the receiver how the message is formatted and which protocol features the sender understands for subsequent communication. HTTP/1.1 is the most widely deployed standardized version.
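One practical consequence of the version is that HTTP/1.1 made the Host header mandatory. The helper below is an illustrative sketch (not a standard API) that assembles the raw bytes of a minimal HTTP/1.1 request, with the CRLF line endings and the blank line that terminates the headers:

```python
def build_request(method, path, host, version="HTTP/1.1"):
    # HTTP/1.1 requires a Host header, so include it by default.
    # Header lines end with CRLF, and a blank line ends the header block.
    lines = [f"{method} {path} {version}", f"Host: {host}", "", ""]
    return "\r\n".join(lines).encode("ascii")

print(build_request("GET", "/", "example.com"))
```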



Articles, tutorials, courses or books

Django: Request/Response Cycle

March 23, 2019 by Sarthak Kumar Article

To get a better understanding of the Django framework, we need to know how requests are made and how the end result is delivered to the user. This article goes over the stages a request passes through and how it is processed at each stage to form the final server response.

How the Internet Works for Developers - Overview & Frontend

Jan. 2, 2014 by LearnCode Video

A 15-minute video about how a browser fetches and displays a web page; a good overview of the whole process.

A Minimalist End-to-End Scrapy Tutorial (Part I)

Nov. 5, 2019 by Harry Wang Series

A 5-part tutorial series on building a complete web crawler with Scrapy, from starting the project with the scrapy startproject command to saving the scraped items into a database using SQLAlchemy via pipelines. Parts 4 and 5 go further, deploying the crawler to Scrapinghub and using Selenium to scrape dynamic pages.