Securing application webhooks in Elixir

Not long ago, I had a task that involved securing a webhook from an external API: making it possible to verify that a request really came from the allowed application (authenticity) and that the received payload matched the one sent by the application, by checking whether the hashes matched (integrity).

Using SHA256 HMAC payload verification, the validation flow was as follows:

  1. Receive an incoming request from the external API.
  2. Extract the text payload as an array of bytes. The entire body of the POST request is used.
  3. Compute a SHA256 HMAC digest for the array of bytes. If your external API implementation has multiple HMAC keys, compute one digest for each of the HMAC keys.
  4. Base64-encode each of the digests.
  5. Compare the Base64 digest(s) computed with your application’s key(s) to the values of the signature headers. At least one of the computed digests must exactly match its corresponding header value; a single match is enough. If there is no match, the notification may be compromised and should not be trusted.

Knowing this, I started coding the first iteration to check whether the request was valid…

def is_request_valid?(conn) do
  # Signature received from the request (Step 1)
  incoming_signature =
    conn
    |> get_req_header("signature")
    |> Enum.at(0)

  # Body payload of the request (Step 2)
  {:ok, payload, _conn} = Plug.Conn.read_body(conn)

  # Stored secret, generated from the external app
  stored_secret = Application.get_env(:example_app, :webhook_secret)

  # Hash generation function, using elixir crypto library (Step 3 and 4)
  generated_hash =
    :crypto.hmac(:sha256, stored_secret, payload)
    |> Base.encode64()

  # Either returns true if request is valid, or false if not (Step 5)
  generated_hash == incoming_signature
end
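A side note on the crypto API: :crypto.hmac/3, used above, was deprecated and then removed in Erlang/OTP 24 in favour of :crypto.mac/4. On a recent OTP release, the hash generation step (Steps 3 and 4) would look like this — a minimal sketch with a made-up secret and payload:

```elixir
# Compute the SHA256 HMAC digest of the payload and Base64-encode it,
# using :crypto.mac/4 (available since OTP 22, mandatory from OTP 24).
secret = "my-webhook-secret"
payload = ~s({"event":"user.created","id":42})

generated_hash =
  :crypto.mac(:hmac, :sha256, secret, payload)
  |> Base.encode64()
```

The rest of the validation stays the same; only the function name and argument order change.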

When testing this approach, the result was invariably false ☹️. When debugging the code, the expected JSON (already decoded into a map) was returned with no errors whatsoever. Everything looked well coded. What was happening? Time to investigate!

Problem-solving

As software developers, we are constantly faced with new problems and challenges to solve, and this one was no exception. Working at Coletiv, I always have my peers at my back (even in times of social isolation), so I brought the problem to them! Without taking the challenge away from me, they suggested I analyze the Plug dependency more carefully.

I started looking for clues 🕵🏽 in the Plug hexdocs to see if I could find “the murder weapon”. As it turns out, the guilty party was the payload! Plug provides a specification for web application components and adapters for web servers. When we receive a request, our Plug parses its content with Poison (our JSON parser of choice).

The representation of the parsed data in Elixir is different from the raw received data. Decoding a byte array and encoding it again will not necessarily yield the original byte array, even though it encodes the same objects: the in-memory representation may, for instance, order keys differently or drop whitespace. Because the hash was being computed over these re-encoded bytes, it could never match the signature in the request header. So, is there a way to access a raw, unparsed version of this endpoint’s payload? Removing the parsing function from the Plug would surely break all of our other endpoints that rely on it…
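You can see this for yourself in a few lines — a sketch using Poison, though the same behaviour applies to most JSON libraries:

```elixir
# A raw payload as it might arrive on the wire, with whitespace after the colons.
raw = ~s({"b": 1, "a": 2})

# Decode into an Elixir map, then encode back to JSON.
round_tripped =
  raw
  |> Poison.decode!()
  |> Poison.encode!()

# The whitespace is gone and the key order is no longer guaranteed,
# so the bytes differ even though the data is the same — and so would
# any HMAC computed over them.
round_tripped != raw
```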

Cache, I choose you!

Digging into the problem, I found out that the Parsers Plug supports a custom body reader, making it possible to cache the raw body in the connection and perform verifications on it later. Following the example in the hexdocs…

defmodule CacheBodyReader do
  def read_body(conn, opts) do
    {:ok, body, conn} = Plug.Conn.read_body(conn, opts)
    conn = update_in(conn.assigns[:raw_body], &[body | &1 || []])
    {:ok, body, conn}
  end
end

plug(
  Plug.Parsers,
  parsers: [:urlencoded, :multipart, :json],
  pass: ["*/*"],
  body_reader: {CacheBodyReader, :read_body, []},
  json_decoder: Poison
)

This way, our requests can access the parsed body as usual, but additionally have access to the raw body when needed. After this change, we had to alter some code in our request validation.

  # Raw body payload of the request, cached by CacheBodyReader (Step 2)
  # (this match assumes the body was read in a single chunk)
  [payload] = Map.get(conn.assigns, :raw_body)

Finally, when testing the webhook, everything was working as intended and the requests made by the external API were being accepted by our application!
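One last hardening note: comparing signatures with == can, in theory, leak timing information, since the comparison returns as soon as the first byte differs. If plug_crypto is available in your project, Plug.Crypto.secure_compare/2 performs a constant-time comparison, so the final check of the validation (Step 5) could be written as follows — a sketch assuming the same variable names as in the code above:

```elixir
# Constant-time comparison of the computed digest with the incoming
# signature header, to avoid leaking timing information to an attacker.
Plug.Crypto.secure_compare(generated_hash, incoming_signature)
```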

Thank you for reading!

Thank you so much for reading, it means a lot to us! Also don’t forget to follow Coletiv on Twitter and LinkedIn as we keep posting more and more interesting articles on multiple technologies.

In case you don’t know, Coletiv is a software development studio from Porto specialised in Elixir, Web, and App (iOS & Android) development. But we do all kinds of stuff. We take care of UX/UI design, software development, and even security for you.

So, let’s craft something together?