
Minimal Wasm extension

We'll create a minimal Wasm extension in this lab and run it locally using Envoy.

We'll start by creating a new folder for our extension and initializing the Go module (the SDK dependency will be downloaded later with go mod tidy):

mkdir wasm-extension && cd wasm-extension
go mod init wasm-extension

Next, let's create the main.go file where the code for our Wasm extension will live. We'll start with the minimal code:

main.go
package main

import (
    "github.com/tetratelabs/proxy-wasm-go-sdk/proxywasm"
    "github.com/tetratelabs/proxy-wasm-go-sdk/proxywasm/types"
)

func main() {
    proxywasm.SetVMContext(&vmContext{})
}

type vmContext struct {
    // Embed the default VM context here,
    // so that we don't need to reimplement all the methods.
    types.DefaultVMContext
}

// Override types.DefaultVMContext.
func (*vmContext) NewPluginContext(contextID uint32) types.PluginContext {
    return &pluginContext{}
}

type pluginContext struct {
    // Embed the default plugin context here,
    // so that we don't need to reimplement all the methods.
    types.DefaultPluginContext
}

// Override types.DefaultPluginContext.
func (*pluginContext) NewHttpContext(contextID uint32) types.HttpContext {
    proxywasm.LogInfo("NewHttpContext")
    return &httpContext{contextID: contextID}
}

type httpContext struct {
    // Embed the default http context here,
    // so that we don't need to reimplement all the methods.
    types.DefaultHttpContext
    contextID uint32
}

Save the above to main.go.

Let's download the dependencies and then we can build the extension to check everything is good:

# Download the dependencies
go mod tidy

# Build the wasm file
tinygo build -o main.wasm -scheduler=none -target=wasi main.go

The build command should run successfully and generate a file called main.wasm.

We'll use func-e to run a local Envoy instance and test the extension we just built.

First, we need an Envoy config that will configure the extension:

envoy.yaml
static_resources:
  listeners:
    - name: main
      address:
        socket_address:
          address: 0.0.0.0
          port_value: 18000
      filter_chains:
        - filters:
            - name: envoy.filters.network.http_connection_manager
              typed_config:
                "@type": type.googleapis.com/envoy.extensions.filters.network.http_connection_manager.v3.HttpConnectionManager
                stat_prefix: ingress_http
                codec_type: auto
                route_config:
                  name: local_route
                  virtual_hosts:
                    - name: local_service
                      domains:
                        - "*"
                      routes:
                        - match:
                            prefix: "/"
                          direct_response:
                            status: 200
                            body:
                              inline_string: "hello world\n"
                http_filters:
                  - name: envoy.filters.http.wasm
                    typed_config:
                      "@type": type.googleapis.com/udpa.type.v1.TypedStruct
                      type_url: type.googleapis.com/envoy.extensions.filters.http.wasm.v3.Wasm
                      value:
                        config:
                          vm_config:
                            vm_id: "my_vm"
                            runtime: "envoy.wasm.runtime.v8"
                            code:
                              local:
                                filename: "main.wasm"
                  - name: envoy.filters.http.router
                    typed_config:
                      "@type": type.googleapis.com/envoy.extensions.filters.http.router.v3.Router
admin:
  access_log_path: "/dev/null"
  address:
    socket_address:
      address: 0.0.0.0
      port_value: 8001

Save the above to envoy.yaml.

The Envoy configuration sets up a single listener on port 18000 that returns a direct response (HTTP 200) with the body hello world. In the http_filters section, we configure the envoy.filters.http.wasm filter and reference the local Wasm file (main.wasm) we built earlier.
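The config block under the Wasm filter is also where plugin configuration would go if the extension needed any. Our envoy.yaml doesn't pass one, but as a hedged sketch, assuming we added a configuration field under config, the plugin context could read it at startup using the SDK's GetPluginConfiguration call:

// Sketch only: reading optional plugin configuration in OnPluginStart.
// Assumes a `configuration` field was added under `config` in envoy.yaml;
// the lab's configuration above does not set one.
func (ctx *pluginContext) OnPluginStart(pluginConfigurationSize int) types.OnPluginStartStatus {
    data, err := proxywasm.GetPluginConfiguration()
    if err != nil {
        // No configuration was provided; that's fine for this sketch.
        proxywasm.LogWarnf("no plugin configuration: %v", err)
        return types.OnPluginStartStatusOK
    }
    proxywasm.LogInfof("plugin configuration: %s", string(data))
    return types.OnPluginStartStatusOK
}

Since this lab's extension doesn't need any configuration, we'll leave main.go as it is.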

Let's run Envoy with this configuration in the background:

func-e run -c envoy.yaml &

The Envoy instance should start without any issues. Once it has started, we can send a request to the port Envoy is listening on (18000):

curl localhost:18000
[2022-01-25 22:21:53.493][5217][info][wasm] [source/extensions/common/wasm/context.cc:1167] wasm log: NewHttpContext
hello world

The output shows a single log entry coming from the Envoy proxy. This is the LogInfo call we made in the NewHttpContext callback, which is invoked for each new HTTP stream. Similarly, a NewTcpContext method gets called for each new TCP connection.
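To give a sense of what these per-stream callbacks enable, here's a hedged sketch (not part of the minimal code above) that overrides OnHttpRequestHeaders on our httpContext to log the request path. GetHttpRequestHeader is the SDK call for reading a single request header, and ":path" is the pseudo-header Envoy uses for the request path:

// Sketch only: log the request path for every HTTP stream.
func (ctx *httpContext) OnHttpRequestHeaders(numHeaders int, endOfStream bool) types.Action {
    path, err := proxywasm.GetHttpRequestHeader(":path")
    if err != nil {
        proxywasm.LogErrorf("failed to read :path header: %v", err)
        return types.ActionContinue
    }
    proxywasm.LogInfof("context %d: request path is %s", ctx.contextID, path)
    // Continue processing the request; we could also pause or modify it here.
    return types.ActionContinue
}

Returning types.ActionContinue tells Envoy to keep processing the request as usual.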

We can also use the wasm-objdump tool from the WebAssembly Binary Toolkit (WABT) to see the exported functions and how they map to the Proxy-Wasm ABI:

wasm-objdump main.wasm --section=export -x 
wasm-objdump output
main.wasm:      file format wasm 0x1

Section Details:

Export[31]:
- memory[0] -> "memory"
- func[21] <malloc> -> "malloc"
- func[22] <free> -> "free"
- func[23] <calloc> -> "calloc"
- func[24] <realloc> -> "realloc"
- func[25] <posix_memalign> -> "posix_memalign"
- func[26] <aligned_alloc> -> "aligned_alloc"
- func[27] <malloc_usable_size> -> "malloc_usable_size"
- func[37] <_start> -> "_start"
- func[39] <proxy_on_memory_allocate> -> "proxy_on_memory_allocate"
- func[40] <proxy_on_vm_start> -> "proxy_on_vm_start"
- func[41] <proxy_on_configure> -> "proxy_on_configure"
- func[42] <proxy_on_new_connection> -> "proxy_on_new_connection"
- func[44] <proxy_on_downstream_data> -> "proxy_on_downstream_data"
- func[46] <proxy_on_downstream_connection_close> -> "proxy_on_downstream_connection_close"
- func[48] <proxy_on_upstream_data> -> "proxy_on_upstream_data"
- func[50] <proxy_on_upstream_connection_close> -> "proxy_on_upstream_connection_close"
- func[52] <proxy_on_request_headers> -> "proxy_on_request_headers"
- func[53] <proxy_on_request_body> -> "proxy_on_request_body"
- func[54] <proxy_on_request_trailers> -> "proxy_on_request_trailers"
- func[55] <proxy_on_response_headers> -> "proxy_on_response_headers"
- func[56] <proxy_on_response_body> -> "proxy_on_response_body"
- func[57] <proxy_on_response_trailers> -> "proxy_on_response_trailers"
- func[58] <proxy_on_http_call_response> -> "proxy_on_http_call_response"
- func[59] <proxy_on_context_create> -> "proxy_on_context_create"
- func[61] <proxy_on_log> -> "proxy_on_log"
- func[63] <proxy_on_done> -> "proxy_on_done"
- func[64] <proxy_on_delete> -> "proxy_on_delete"
- func[65] <proxy_on_queue_ready> -> "proxy_on_queue_ready"
- func[66] <proxy_on_tick> -> "proxy_on_tick"
- func[67] <proxy_abi_version_0_2_0> -> "proxy_abi_version_0_2_0"

This shows that we can produce a Proxy-Wasm ABI-compatible binary without knowing anything about the ABI itself; the SDK abstracts that complexity for us.
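For instance, proxy_on_request_headers in the dump is the export that ends up invoking OnHttpRequestHeaders on our HTTP context, and proxy_on_log is what the SDK uses to drive its end-of-stream callback. A small illustrative sketch (not required for this lab) of that last callback on our httpContext:

// Sketch only: OnHttpStreamDone is invoked once a stream has completed
// (behind the proxy_on_log export shown in the dump above).
func (ctx *httpContext) OnHttpStreamDone() {
    proxywasm.LogInfof("stream %d finished", ctx.contextID)
}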

You can stop the Envoy proxy by bringing the process to the foreground with fg and pressing Ctrl+C to stop it.