ztp/Cargo.lock


Zero-to-Production Rust, up to Chapter 3.7.

Since this book is about learning Rust, primarily in a microservices environment, this chapter focuses on installing Rust and describing the tools available to the developer. The easiest way to install Rust is with the [Rustup](https://rustup.rs/) tool. It is one of those blind-trust-in-the-safety-of-the-toolchain things. For Linux and Mac users, the command is a shell script that installs to the user's local account:

```
$ curl --proto '=https' --tlsv1.2 -sSf https://sh.rustup.rs | sh
```

Once Rustup is installed, you can install Rust itself:

```
$ rustup toolchain install stable
```

You should now have the Rust compiler and the Rust build and packaging tool, known as Cargo:

```
$ rustc --version
rustc 1.68.0 (2c8cc3432 2023-03-06)
$ cargo --version
cargo 1.68.0 (115f34552 2023-02-26)
```

I also installed the following tools, all of which are distributed as rustup components:

```
$ rustup component add clippy rustfmt rust-analyzer rust-src rust-docs
```

- clippy: A powerful linter that provides useful advice above and beyond the compiler's basic error checking.
- rustfmt: A formatting tool that provides a common format for most developers.
- rust-analyzer: For your IDE, rust-analyzer provides the LSP (Language Server Protocol) implementation for Rust, giving you code completion, on-the-fly error reporting, and other luxuries.

Zero-to-Production's project is a web service that signs people up for an email newsletter. The first task in the book is to set up a "Hello World!" application server. The book uses the [Actix-web](https://actix.rs/) web framework, but I've chosen to implement it with [Axum](https://github.com/tokio-rs/axum), the default web framework provided by the [Tokio](https://github.com/tokio-rs/tokio) asynchronous runtime project.

Although the book is only two years old, it is already out-of-date with respect to some commands: `cargo add` is now provided by default.
The following commands installed the crates I'll be using:

```
cargo add --features tokio/full --features hyper/full tokio hyper \
    axum tower tracing tracing-subscriber
```

- axum: The web server framework for Tokio.
- tokio: The Rust asynchronous runtime. Has single-threaded (select) and multi-threaded variants.
- [hyper](https://hyper.rs/): An HTTP request/response library, used here for testing.
- [tracing](https://crates.io/crates/tracing): A debugging library that works with Tokio.

We start by defining the core services. In the book, they're a greeter ("Hello, World"), a greeter with a parameter ("Hello, {name}"), and a health check (returns an HTTP 200 status code, but no body). Actix-web hands the handler a generic Request and expects a generic Response in return, but Axum is more straightforward, providing `IntoResponse` implementations for most of the basic Rust types, as well as some for structured formats via Serde, Rust's standard serializing/deserializing library for converting data from one format to another.

All of these go into `src/lib.rs`:

```
use axum::{extract::Path, http::StatusCode, response::IntoResponse};

async fn health_check() -> impl IntoResponse {
    (StatusCode::OK, ())
}

async fn anon_greet() -> &'static str {
    "Hello World!\n"
}

async fn greet(Path(name): Path<String>) -> impl IntoResponse {
    let greeting = format!("He's dead, {}!\n", name);
    (StatusCode::OK, greeting)
}
```

<aside>Axum's documentation says to [avoid using `impl IntoResponse`](https://docs.rs/axum/latest/axum/response/index.html#regarding-impl-intoresponse) until you understand how it really works, as it can result in confusing type errors when chaining response handlers, when a handler can return multiple types, or when a handler can return either a type or a [`Result<T, E>`](https://doc.rust-lang.org/std/result/), especially one with an error.</aside>

We then define the routes that our server will recognize.
This is straightforward and familiar territory:

```
fn app() -> Router {
    Router::new()
        .route("/", get(anon_greet))
        .route("/:name", get(greet))
        .route("/health_check", get(health_check))
}
```

We then define a function to *run* the core server:

```
pub async fn run() {
    let addr = SocketAddr::from(([127, 0, 0, 1], 3000));
    tracing::info!("listening on {}", addr);
    axum::Server::bind(&addr)
        .serve(app().into_make_service())
        .await
        .unwrap()
}
```

And finally, in a file named `src/main.rs`, we instantiate the server. Note the `#[tokio::main]` attribute: `main` cannot be `async` without a runtime to poll it.

```
use ztp::run;

#[tokio::main]
async fn main() {
    run().await
}
```

To make this work, we need to define what `ztp` means, and make a distinction between the library and the CLI program. In the project root's `Cargo.toml` file, the first three sections define these relationships:

```
[package]
name = "ztp"
version = "0.1.0"
edition = "2021"

[lib]
path = "src/lib.rs"

[[bin]]
path = "src/main.rs"
name = "ztp"
```

It is the `[package].name` key that defines how the `use` statement in `main.rs` will find the library. The `[[bin]]` clause defines the name of the binary when it is generated.

<aside>The double brackets around the `[[bin]]` clause tell the TOML parser that there can be more than one binary. There can be only one library per package, but a Rust project can contain more than one package; packages are also called "crates."</aside>

This project should now be runnable. In one window, type:

```
$ cargo run
```

And in another, type and see the replies:

```
$ curl http://localhost:3000/
Hello World!
$ curl http://localhost:3000/Jim
He's dead, Jim!
$ curl -v http://localhost:3000/health_check
> GET /health_check HTTP/1.1
> Host: localhost:3000
> User-Agent: curl/7.81.0
> Accept: */*
< HTTP/1.1 200 OK
< content-length: 0
< date: Tue, 21 Mar 2023 00:16:43 GMT
```

In the last command, the *verbose* flag shows us what we sent to the server, and what came back.
We expected a "200 OK" status and a zero-length body, and that's what we got.

To unit-test a web server, we must spawn a copy of it in order to exercise its handlers. We'll use Tokio's `spawn` function to create a new server, use hyper to request data from the server, and finally Rust's own native test asserts to check that we got what we expected.

```
#[cfg(test)]
mod tests {
    use super::*;
    use axum::{
        body::Body,
        http::{Request, StatusCode},
    };
    use std::net::{SocketAddr, TcpListener};

    #[tokio::test]
    async fn the_real_deal() {
        let listener = TcpListener::bind("127.0.0.1:0".parse::<SocketAddr>().unwrap()).unwrap();
        let addr = listener.local_addr().unwrap();

        tokio::spawn(async move {
            axum::Server::from_tcp(listener)
                .unwrap()
                .serve(app().into_make_service())
                .await
                .unwrap();
        });

        let response = hyper::Client::new()
            .request(
                Request::builder()
                    .uri(format!("http://{}/", addr))
                    .body(Body::empty())
                    .unwrap(),
            )
            .await
            .unwrap();

        let body = hyper::body::to_bytes(response.into_body()).await.unwrap();
        assert_eq!(&body[..], b"Hello World!\n");
    }
}
```

One interesting trick to observe in this test is the port number specified in the `TcpListener` call: it's zero. When the port is zero, the `TcpListener` asks the kernel for the first free port. Normally you'd want to know exactly what port to call the server on, but here both ends of the communication learn the port at runtime, so it is never hard-coded and can never collide with a port already in use by someone else.
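The ephemeral-port trick is plain `std` behavior, nothing Axum-specific; a minimal sketch:

```rust
use std::net::TcpListener;

fn main() {
    // Binding to port 0 asks the kernel for any free port.
    let listener = TcpListener::bind("127.0.0.1:0").unwrap();
    // local_addr() reveals the port the kernel actually assigned.
    let addr = listener.local_addr().unwrap();
    assert_ne!(addr.port(), 0);
    println!("bound to {}", addr);
}
```

Two listeners bound this way at the same time are guaranteed distinct ports, which is exactly what makes parallel test runs safe.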
2023-03-21 00:31:39 +00:00
# This file is automatically @generated by Cargo.
# It is not intended for manual editing.
version = 3
Implemented the forms reader, config, and database migrations.

This chapter introduces the Actix "Extractors" for retrieving form data. I've added tests to the `./tests` folder to attempt to interact with those extractors; as of this commit, a89cbe5b, they fail because the example code isn't there. What is there is a variant of the "Hello, World!" code from the previous exercises (section 3.5), which uses the Actix extractor:

``` rust
// Actix, *not* Axum. Does not work with the current framework.
async fn index(form: web::Form<FormData>) -> String {
    format!("Welcome {}!", form.username)
}
```

Translated into Axum, it becomes:

``` rust
pub async fn index(payload: Option<Form<FormData>>) -> impl IntoResponse {
    let username = payload.map_or("World".to_string(), move |index| -> String {
        String::from(&(index.username))
    });
    (StatusCode::OK, format!("Hello, {}!\n", &username))
}
```

The Axum version is a little smarter, providing a default "World" if you don't specify a name. That's what `.map_or` does, although the `or` part actually comes first in the argument list. So the result is:

``` sh
$ curl http://localhost:3000/
Hello, World!
$ curl 'http://localhost:3000/?username=Spock'
Hello, Spock!
```

Which is more or less the version we want.

**Section 3.7.3** then goes into some detail on the implementation of a Trait. A Trait in Rust is like an Interface in other languages; it describes a collection of functions for manipulating the values of a defined Type.

**Types**: A Type is just a description of a value: `u16` is a sixteen-bit unsigned integer; `char` is a Unicode scalar value, and its size is always 32 bits. A `String`, though, is a bunch of things: a pointer to an array of UTF-8 bytes (which are *not* an array of `char`s!), a length for that string, and a capacity. If the String is manipulated to exceed its capacity, a larger array is allocated and the old contents are copied into it.
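That reallocation is observable through `String::capacity`; a small std-only sketch:

```rust
fn main() {
    // Reserve room for 4 bytes up front.
    let mut s = String::with_capacity(4);
    assert_eq!(s.len(), 0); // empty, but space is pre-allocated

    // Five bytes exceeds the reserved capacity, forcing a reallocation.
    s.push_str("hello");

    // The backing array grew to hold the contents...
    assert!(s.capacity() >= 5);
    // ...while length tracks what's actually stored.
    assert_eq!(s.len(), 5);
    println!("len = {}, capacity = {}", s.len(), s.capacity());
}
```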
A `Vec<String>` is an array of Strings; as a Type, it is considered to have a single value: whatever is in it at the moment you use it.

**Trait**: A Trait defines a collection of one or more functions that can operate on a value. The truly nifty thing about Traits is that they can be implemented after the fact. By importing a Trait and an implementation of that trait specific to a type into a module containing that type, you can extend the behavior of the type in a deterministic way, without having to modify or inherit from its code as you would in an object-oriented language.

Axum has a valuable trait, `FromRequest`. For any structure you can imagine passing from the client to the server, you can implement `FromRequest` for that structure, and any content in the body of the message will be transformed into it.

We've seen a trait before: `IntoResponse`, written as `impl IntoResponse`, is the output (the return type) of many of the functions that produce responses for our application server. That return type instructs Rust to look in the current lexical scope and determine whether an `IntoResponse` implementation has been defined for the value returned by the function. If it has, the value can be returned, because Axum is now assured that there exists a function to convert that value into something streamable and usable as an HTTP response.

Fortunately for us, Axum has already implemented `FromRequest` for all the native data types, as well as some structures and arrays. Even better, it has integrated `FromRequest` with the Serde serialization/deserialization library. So in this example:

``` rust
#[derive(serde::Deserialize)]
pub struct FormData {
    username: String,
}

pub async fn index(payload: Option<Form<FormData>>) -> impl IntoResponse { ...
```

A `Form` (something using the `application/x-www-form-urlencoded` content type) of `FormData` will automatically be converted into a `payload` object of `{ username: "Spock" }`, in this case wrapped in a `Some()` variant.
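Extending a type after the fact looks like this; a toy sketch using a made-up `Shout` trait (not part of any library) implemented for the standard library's own `str`:

```rust
// A hypothetical extension trait: one function, shout().
trait Shout {
    fn shout(&self) -> String;
}

// Implemented for a type we don't own: str from the standard library.
impl Shout for str {
    fn shout(&self) -> String {
        format!("{}!", self.to_uppercase())
    }
}

fn main() {
    // With the trait in scope, every str gains the method.
    assert_eq!("hello".shout(), "HELLO!");
    println!("{}", "hello".shout());
}
```

Move the trait to another module and `"hello".shout()` stops compiling until you `use` it again; that import requirement is what keeps the extension deterministic.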
(Or `None`, if there was no form included.)

<aside>So far, there's not too much bloat in this product; with all the debugging symbols it's 60MB or so, but stripped to the bone it's only 3.1MB, tolerable for modern deployments.</aside>

First, though, we must adjust our `valid_subscription` test:

``` rust
let body = "name=le%20guin&email=ursula_le_guin%40gmail.com";
let response = hyper::Client::new()
    .request(
        Request::builder()
            .method("POST")
            .header("content-type", "application/x-www-form-urlencoded")
            .uri(format!("http://{}/subscriptions", addr))
            .body(Body::from(body))
            .unwrap(),
    )
    .await
    .expect("Failed to execute request.");
```

Two updates from the book: First, we're sending the data via POST instead of GET. This is the correct way to do things; a GET should never (and I mean *never*) cause a change of state on the back-end. To send something new that the server will process and store, you use a POST. (To update something, or to send something to a known *and unique* URI, PUT is better.) Second, since we're sending a generic form-data object, we need to set the content-type header on the client so that the server knows how to unpack the payload. The `%20` and `%40` markers in the `body` are the URL-encoded space and `@` characters, respectively.

For migrations, I completely ignored the advice in the book and went with [Dbmate](https://github.com/amacneil/dbmate) instead. Dbmate is a bit cranky: your SQL must be nestled right up against the 'up' and 'down' markers in the migration file, and it seems to be quite opinionated about everything being lowercase.
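Percent-encoding is simple enough to decode by hand. A hypothetical helper sketching the rule (each `%XX` escape is one byte, written as two hex digits); real code would lean on a crate such as `percent-encoding` or on the form extractor itself:

```rust
// Toy decoder for %XX escapes in an ASCII query string; not production code.
fn percent_decode(input: &str) -> String {
    let bytes = input.as_bytes();
    let mut out = Vec::with_capacity(bytes.len());
    let mut i = 0;
    while i < bytes.len() {
        // A '%' followed by two hex digits encodes one byte.
        if bytes[i] == b'%' && i + 2 < bytes.len() {
            if let Ok(b) = u8::from_str_radix(&input[i + 1..i + 3], 16) {
                out.push(b);
                i += 3;
                continue;
            }
        }
        out.push(bytes[i]);
        i += 1;
    }
    String::from_utf8(out).expect("valid UTF-8")
}

fn main() {
    assert_eq!(percent_decode("le%20guin"), "le guin");
    assert_eq!(percent_decode("ursula_le_guin%40gmail.com"), "ursula_le_guin@gmail.com");
}
```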
That said, it was trivial to create a database with it:

``` sh
$ dbmate new create_subscriptions_table
```

This will create the file `db/migrations/20230322174957_create_subscriptions_table.sql` (the timestamp will be different, obviously), and in this file you put the following, as specified in the book:

``` sql
-- migrate:up
create table subscriptions (
    id uuid not null,
    primary key (id),
    email text not null unique,
    name text not null,
    subscribed_at timestamptz not null
);

-- migrate:down
drop table subscriptions;
```

To use Dbmate, you have to specify how to connect. I'm using Postgres, so let's start with creating a database and a user for it:

``` sh
$ sudo -u postgres psql
[sudo] password for user: ...................
postgres=# create database newsletter;
CREATE DATABASE
postgres=# create user newsletter with encrypted password 'redacted';
CREATE USER
postgres=# grant all privileges on database newsletter to newsletter;
GRANT
postgres=# exit
```

In your project root, create a `.env` file to specify your connection:

``` sh
DATABASE_URL="postgres://newsletter:redacted@127.0.0.1:5432/newsletter?sslmode=disable"
```

The `sslmode` flag is necessary for localhost connections: Dbmate assumes an encrypted connection by default, but we're isolating ourselves to a local connection that is, usually, safe.

With the new entry in your `.env` file, you can now run a migration:

``` sh
$ dbmate up
Writing: ./db/schema.sql
```

Running `dbmate up` will automatically create the database for you if it doesn't already exist; `dbmate migrate` also performs migrations, but it will not create the database.
Now you can re-connect to Postgres as the newsletter user and see what you've got:

``` sh
$ psql --user newsletter -h localhost --password
Password:
psql (14.7 (Ubuntu 14.7-0ubuntu0.22.04.1), server 11.7 (Ubuntu 11.7-0ubuntu0.19.10.1))

newsletter=> \d
               List of relations
 Schema |       Name        | Type  |   Owner
--------+-------------------+-------+------------
 public | schema_migrations | table | newsletter
 public | subscriptions     | table | newsletter
(2 rows)

newsletter=> \d subscriptions
                     Table "public.subscriptions"
    Column     |           Type           | Collation | Nullable | Default
---------------+--------------------------+-----------+----------+---------
 id            | uuid                     |           | not null |
 email         | text                     |           | not null |
 name          | text                     |           | not null |
 subscribed_at | timestamp with time zone |           | not null |
Indexes:
    "subscriptions_pkey" PRIMARY KEY, btree (id)
    "subscriptions_email_key" UNIQUE CONSTRAINT, btree (email)
```

Note that Dbmate has allocated a table to itself, `schema_migrations`, for tracking what it's done to your system and when. Try not to conflict with it, okay?

Every complex app has a configuration, and there are plenty of ways a configuration can be specified: environment variables, internal defaults, and configuration files, the last of which comes in many different flavors. Rust has a well-known [config](https://docs.rs/config/latest/config/index.html) crate that supports all the most common formats: YAML, JSON, TOML; you can even add your own by writing something that implements the `config::Format` trait. Add it to Cargo.toml:

``` sh
$ cargo add config
```

For the time being, we're just going to create a new file, called `configuration.rs`, and put our configuration details in there. Right now we have a single configuration detail: the port. I'm going to go above and beyond Luca here and configure some internal defaults for my code. It will have expectations.
First, you have to tell Serde that there will be default values:

``` rust
use config::Config;
use serde::Deserialize;

#[derive(Deserialize)]
#[serde(default)]
pub struct Settings {
    pub port: u16,
}
```

Then you have to set those default values. Fortunately, Rust provides a "set default values" trait named, sensibly enough, Default:

``` rust
impl Default for Settings {
    fn default() -> Self {
        Settings { port: 3001 }
    }
}
```

Again exceeding the book's parameters, I'm going to say that if the configuration file is missing, the defaults should hold:

``` rust
pub(crate) fn get_configuration() -> Result<Settings, config::ConfigError> {
    Config::builder()
        .add_source(config::File::with_name("./ztd.config").required(false))
        .build()?
        .try_deserialize()
}
```

And since this is the first time I'm doing this, I'm going to write a test to assert that my understanding of how this all works is correct:

``` rust
#[cfg(test)]
mod tests {
    use super::*;

    #[test]
    fn test_for_defaults() {
        let maybe_config = get_configuration();
        assert!(maybe_config.is_ok());
        let config = maybe_config.unwrap();
        assert_eq!(config.port, 3001);
    }
}
```
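The `Default` machinery itself is plain std Rust; a minimal sketch of the same pattern without the config crate (the `Settings` struct here is a stand-in, redeclared so the example is self-contained):

```rust
#[derive(Debug)]
struct Settings {
    port: u16,
}

// Hand-written rather than #[derive(Default)], so we can pick 3001 instead of 0.
impl Default for Settings {
    fn default() -> Self {
        Settings { port: 3001 }
    }
}

fn main() {
    // Default::default() infers the implementation from the annotated type.
    let s: Settings = Default::default();
    assert_eq!(s.port, 3001);

    // Option::unwrap_or_default is one common consumer of the trait,
    // much like our "file missing, fall back to defaults" behavior.
    let missing: Option<Settings> = None;
    assert_eq!(missing.unwrap_or_default().port, 3001);
    println!("{:?}", s);
}
```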
2023-03-24 14:51:19 +00:00
[[package]]
name = "ahash"
version = "0.7.6"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "fcb51a0695d8f838b1ee009b3fbf66bda078cd64590202a864a8f3e8c4315c47"
dependencies = [
"getrandom",
"once_cell",
"version_check",
]
[[package]]
name = "android_system_properties"
version = "0.1.5"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "819e7219dbd41043ac279b19830f2efc897156490d7fd6ea916720117ee66311"
dependencies = [
"libc",
]
[[package]]
name = "async-trait"
version = "0.1.67"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "86ea188f25f0255d8f92797797c97ebf5631fa88178beb1a46fdf5622c9a00e4"
dependencies = [
"proc-macro2",
"quote",
"syn 2.0.4",
]
Implemented the forms reader, config, and database migrations. This chapter introduces the Actix "Extractors" for retrieving form data. I've added tests to the `./tests` folder to attempt to interact with those extractors; as of this commit, a89cbe5b, they fail because the example code isn't there. What is there is a variant of the "Hello, World!" code from the previous exercises (section 3.5), which uses the Actix extractor: ``` rust // Actix, *not* Axum. Does not work with the current framework. fn index(form: web::Form<FormData>) -> String { format!('Welcome {}!', form.username) } ``` Translated and polished into Axum, it translates to: ``` rust pub async fn index(payload: Option<Form<FormData>>) -> impl IntoResponse { let username = payload.map_or("World".to_string(), move |index| -> String { String::from(&(index.username)) }); (StatusCode::OK, format!("Hello, {}!\n", &username)) } ``` The Axum version is a little smarter, providing a default "World!" if you don't specify a name. That's what `.map_or` does, although the `or` part actually comes first in the function. So the result is: ``` sh $ curl http://localhost:3000/ Hello, World! $ curl 'http://localhost:3000/?username=Spock' Hello, Spock! ``` Which is more or less the version we want. **Section 3.7.3** then goes into some detail on the implementation of a Trait. A Trait in Rust is like an Interface in other languages; it describes a collection of functions for manipulating the values found in a defined Type. **Types**: A Type is just a description of the value: `u16` is a sixteen-bit unsigned integer; `char` is a Unicode character, and it's size is always 32bits, but a `String` is a bunch of things: it's an array of characters (which are not always `char`!), a length for that string and a capacity. If the String is manipulated to exceed the capacity, the array is re-allocated to a new capacity and the old array copied into it. 
A `Vec<String>` is an array of Strings; as a Type, it is considered to have a single value: whatever is in it at the moment you use it. **Trait**: A Trait defines a collection of one or more functions that can operate on a value. The truly nifty thing about Traits is that they can be implemented after the fact. By importing a Trait and an implementation of that trait specific to a type into a module containing that type, you can extend the behavior of a type in a deterministic way without having to modify or inherit the code, as you would in an object-oriented language. Axum has a valuable trait, `FromRequest`. For any structure you can imagine passing from the client to the server, you can implement `FromRequest` for that object and any content in the body of the message will be transformed into that structure. We've seen a trait before: `IntoResponse`, written as `impl IntoResponse`, and is the output (the return type) of many of the functions that produce return values for our application server. In this case the return type instructs Rust to look in the current lexical scope and, for the value returned by that function, determine if an `IntoResponse` trait has been defined for it. If it has, the value will be returned because Axum has now been assured that there exists a function to convert that value into something streamable and usable as an HTTP response. Fortunately for us, Axum has already implemented `FromRequest` for all the native data types, as well as some structures and arrays. Even better, it has implemented `FromRequest` for the Serde serialization/deserialization library. So in this example: ``` rust pub struct FormData { username: String, } pub async fn index(payload: Option<Form<FormData>>) -> impl IntoResponse { ... ``` A `Form` (something using the `application/x-www-form-urlencoded` protocol) of `FormData` will automatically be converted into a `payload` object of `{ username: "Spock" )`, and in this case wrapped in a `Some()` handler. 
(Or `None`, if there was no form included.) <aside>So far, there's not too much bloat in this product; with all the debugging symbols, it's 60MB or so, but stripped to the bone it's only 3.1MB, tolerable for modern deployments.</aside> First, though, we must adjust our `valid_subscription` test: ``` rust let body = "name=le%20guin&email=ursula_le_guin%40gmail.com"; let response = hyper::Client::new() .request( Request::builder() .method("POST") .header("content-type", "application/x-www-form-urlencoded") .uri(format!("http://{}/subscriptions", addr)) .body(Body::from(body)) .unwrap(), ) .await .expect("Failed to execute request."); ``` Two updates from the book: first, we're sending it via POST instead of GET. This is the correct way to do things; a GET should never (and I mean *never*) cause a change of state on the back-end. To send something new that the server will process and store, you use a POST. (To update something, or to send something to a known *and unique* URI, PUT is better.) Secondly, since we're using a generic form-data object, we need to set the content-type on the client so that the server is informed of how to unpack this payload. The '%20' and '%40' markers in the `body` are the space and the `@` respectively. I completely ignored the advice in the book and went instead with [Dbmate](https://github.com/amacneil/dbmate); Dbmate is a bit cranky; your SQL must be very much nestled against the 'up' and 'down' markers in the migration file, and it seems to be quite opinionated about everything being lowercase. 
That said, it was trivial to create a database with it: ``` sh $ dbmate new create_subscriptions_table ``` This will create the folder `db/migrations/20230322174957_create_subscriptions_table.sql`, (The timestamp will be different, obviously), and in this file you put the following, as specified in the book: ``` sql -- migrate:up create table subscriptions ( id uuid not null, primary key (id), email text not null unique, name text not null, subscribed_at timestamptz not null ); -- migrate:down drop table subscriptions; ``` To use Dbmate, you have to specify how to connect. I'm using Postgres, so let's start with creating a database and a user for it: ``` sh $ sudo -u postgres psql [sudo] possword for user: ................... postgres=# create database newsletter; CREATE DATABASE postgres=# create user newletter with encrypted password 'redacted'; CREATE USER postgres=# grant all privileges on database newsletter to newsletter; GRANT postgres=# exit ``` In your project root, create a `.env` file to specify your connection: ``` sh DATABASE_URL="postgres://newsletter:redacted@127.0.0.1:5432/newsletter?sslmode=disable" ``` The `sslmode` flag there is necessary for localhost connections, as Dbmate assumes an encrypted connection by default, but we're isolating to a local connection that is, usually, safe. With the new entry in your `.env` file, you can now run a migration: ``` sh $ dbmate up Writing: ./db/schema.sql ``` Running `dbmate up` will automatically create the database for you if it hasn't already; `dbmate migrate` also performs migrations, but it will not create the database. 
Now you can re-connect to Postgres as the newsletter user and see what you've got:

``` sh
$ psql --user newsletter -h localhost --password
Password:
psql (14.7 (Ubuntu 14.7-0ubuntu0.22.04.1), server 11.7 (Ubuntu 11.7-0ubuntu0.19.10.1))

newsletter=> \d
                List of relations
 Schema |       Name        | Type  |   Owner
--------+-------------------+-------+------------
 public | schema_migrations | table | newsletter
 public | subscriptions     | table | newsletter
(2 rows)

newsletter=> \d subscriptions
                    Table "public.subscriptions"
    Column     |           Type           | Collation | Nullable | Default
---------------+--------------------------+-----------+----------+---------
 id            | uuid                     |           | not null |
 email         | text                     |           | not null |
 name          | text                     |           | not null |
 subscribed_at | timestamp with time zone |           | not null |
Indexes:
    "subscriptions_pkey" PRIMARY KEY, btree (id)
    "subscriptions_email_key" UNIQUE CONSTRAINT, btree (email)
```

Note that Dbmate has allocated a table to itself, `schema_migrations`, for tracking what it's done to your system and when. Try not to conflict with it, okay?

Every complex app has a configuration, and there are plenty of ways to specify one: environment variables, internal defaults, and configuration files, the last of which come in many different flavors. Rust has a well-known [config](https://docs.rs/config/latest/config/index.html) crate that supports all the most common formats: YAML, JSON, TOML; you can even add your own by writing something that implements the `config::Format` trait. Add it to Cargo.toml:

``` sh
$ cargo add config
```

For the time being, we're just going to create a new file, called `configuration.rs`, and put our configuration details in there. Right now we have a single configuration detail: the port. I'm going to go above and beyond Lucas here and configure some internal defaults for my code. It will have expectations.
First, you have to tell Serde that there will be default values:

``` rust
use config::Config;
use serde::Deserialize;

#[derive(Deserialize)]
#[serde(default)]
pub struct Settings {
    pub port: u16,
}
```

Then, you have to set those default values. Fortunately, Rust provides a "set default values" trait named, sensibly enough, Default:

``` rust
impl Default for Settings {
    fn default() -> Self {
        Settings { port: 3001 }
    }
}
```

Again, exceeding the book's parameters, I'm going to say that if the file is missing, the default parameters should hold:

``` rust
pub(crate) fn get_configuration() -> Result<Settings, config::ConfigError> {
    Config::builder()
        .add_source(config::File::with_name("./ztd.config").required(false))
        .build()?
        .try_deserialize()
}
```

And since this is the first time I'm doing this, I'm going to write a test to assert that my understanding of how this all works is correct:

``` rust
#[cfg(test)]
mod tests {
    use super::*;

    #[test]
    fn test_for_defaults() {
        let maybe_config = get_configuration();
        assert!(maybe_config.is_ok());
        let config = maybe_config.unwrap();
        assert_eq!(config.port, 3001);
    }
}
```
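The fallback behaviour ("if the file is missing, the defaults hold") can be illustrated with nothing but the standard library. This is a sketch: `load_settings` and its `maybe_port` argument are hypothetical stand-ins for the `config` crate's builder and a value parsed from an optional file:

``` rust
// Sketch of the "defaults hold unless overridden" pattern, std only.
#[derive(Debug, PartialEq)]
pub struct Settings {
    pub port: u16,
}

impl Default for Settings {
    fn default() -> Self {
        Settings { port: 3001 }
    }
}

// Hypothetical loader: `maybe_port` stands in for a value that may or may
// not have been read from a configuration file.
fn load_settings(maybe_port: Option<u16>) -> Settings {
    let mut settings = Settings::default();
    if let Some(port) = maybe_port {
        settings.port = port; // the file, when present, wins
    }
    settings
}

fn main() {
    // No file found: the defaults hold.
    assert_eq!(load_settings(None), Settings { port: 3001 });
    // File present: the file's value overrides the default.
    assert_eq!(load_settings(Some(8080)).port, 8080);
}
```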
2023-03-24 14:51:19 +00:00
[[package]]
name = "atoi"
version = "1.0.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "d7c57d12312ff59c811c0643f4d80830505833c9ffaebd193d819392b265be8e"
dependencies = [
"num-traits",
]
This is straightforward and familiar territory:

```
fn app() -> Router {
    Router::new()
        .route("/", get(anon_greet))
        .route("/:name", get(greet))
        .route("/health_check", get(health_check))
}
```

We then define a function to *run* the core server:

```
pub async fn run() {
    let addr = SocketAddr::from(([127, 0, 0, 1], 3000));
    tracing::info!("listening on {}", addr);
    axum::Server::bind(&addr)
        .serve(app().into_make_service())
        .await
        .unwrap()
}
```

And finally, in a file named `src/main.rs`, we instantiate the server:

```
use ztp::run;

#[tokio::main]
async fn main() {
    run().await
}
```

To make this "work," we need to define what `ztp` means, and make a distinction between the library and the CLI program. In the project root's `Cargo.toml` file, the first three sections define these relationships:

```
[package]
name = "ztp"
version = "0.1.0"
edition = "2021"

[lib]
path = "src/lib.rs"

[[bin]]
path = "src/main.rs"
name = "ztp"
```

It is the `name` field under `[package]` that defines how the `use` statement in `main.rs` will find the library. The `[[bin]]` clause defines the name of the binary when it is generated.

<aside>The double brackets around the `[[bin]]` clause are there to tell the TOML parser that there can be more than one binary. There can be only one library crate per package, but a package may contain more than one binary crate, and a project may contain more than one package.</aside>

This project should now be runnable. In one window, type:

```
$ cargo run
```

And in another, type and see the replies:

```
$ curl http://localhost:3000/
Hello World!
$ curl http://localhost:3000/Jim
He's dead, Jim!
$ curl -v http://localhost:3000/health_check
> GET /health_check HTTP/1.1
> Host: localhost:3000
> User-Agent: curl/7.81.0
> Accept: */*
< HTTP/1.1 200 OK
< content-length: 0
< date: Tue, 21 Mar 2023 00:16:43 GMT
```

In the last command, the *verbose* flag shows us what we sent to the server and what came back.
We expected a "200 OK" status and a zero-length body, and that's what we got.

To unit-test a web server, we must spawn a copy of it so we can exercise its functions. We'll use Tokio's `spawn` function to create a new server, use hyper to request data from the server, and finally use Rust's own native test asserts to check that we got what we expected.

```
mod tests {
    use super::*;
    use axum::{
        body::Body,
        http::{Request, StatusCode},
    };
    use std::net::{SocketAddr, TcpListener};

    #[tokio::test]
    async fn the_real_deal() {
        let listener = TcpListener::bind("127.0.0.1:0".parse::<SocketAddr>().unwrap()).unwrap();
        let addr = listener.local_addr().unwrap();

        tokio::spawn(async move {
            axum::Server::from_tcp(listener)
                .unwrap()
                .serve(app().into_make_service())
                .await
                .unwrap();
        });

        let response = hyper::Client::new()
            .request(
                Request::builder()
                    .uri(format!("http://{}/", addr))
                    .body(Body::empty())
                    .unwrap(),
            )
            .await
            .unwrap();

        let body = hyper::body::to_bytes(response.into_body()).await.unwrap();
        assert_eq!(&body[..], b"Hello World!\n");
    }
}
```

One interesting trick to observe in this test is the port number specified in the `TcpListener` call: it's zero. When the port is zero, the `TcpListener` will ask the kernel for the first free port. Normally you'd want to know exactly what port to call the server on, but in this case both ends of the communication learn the port at runtime, and we want to ensure the port isn't hard-coded and inconveniently already in use by someone else.
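The port-zero trick is plain `std::net`, nothing Tokio-specific, so it's easy to verify in isolation:

``` rust
use std::net::TcpListener;

fn main() {
    // Binding to port 0 asks the kernel for any free port...
    let listener = TcpListener::bind("127.0.0.1:0").unwrap();
    // ...and local_addr() reports the port we were actually given.
    let port = listener.local_addr().unwrap().port();
    assert_ne!(port, 0);

    // A second port-0 bind while the first is still open cannot collide,
    // which is exactly why parallel test runs use this trick.
    let other = TcpListener::bind("127.0.0.1:0").unwrap();
    assert_ne!(port, other.local_addr().unwrap().port());

    println!("bound to port {}", port);
}
```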
2023-03-21 00:31:39 +00:00
[[package]]
name = "autocfg"
version = "1.1.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "d468802bab17cbc0cc575e9b053f41e72aa36bfa6b7f55e3529ffa43161b97fa"
[[package]]
name = "axum"
version = "0.6.11"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "13d8068b6ccb8b34db9de397c7043f91db8b4c66414952c6db944f238c4d3db3"
dependencies = [
"async-trait",
"axum-core",
"bitflags",
"bytes",
"futures-util",
"http",
"http-body",
"hyper",
"itoa",
"matchit",
"memchr",
"mime",
"percent-encoding",
"pin-project-lite",
"rustversion",
"serde",
"serde_json",
"serde_path_to_error",
"serde_urlencoded",
"sync_wrapper",
"tokio",
"tower",
"tower-layer",
"tower-service",
]
[[package]]
name = "axum-core"
version = "0.3.3"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "b2f958c80c248b34b9a877a643811be8dbca03ca5ba827f2b63baf3a81e5fc4e"
dependencies = [
"async-trait",
"bytes",
"futures-util",
"http",
"http-body",
"mime",
"rustversion",
"tower-layer",
"tower-service",
]
Implemented the forms reader, config, and database migrations.

This chapter introduces the Actix "extractors" for retrieving form data. I've added tests to the `./tests` folder to attempt to interact with those extractors; as of this commit, a89cbe5b, they fail because the example code isn't there. What is there is a variant of the "Hello, World!" code from the previous exercises (section 3.5), which uses the Actix extractor:

``` rust
// Actix, *not* Axum. Does not work with the current framework.
fn index(form: web::Form<FormData>) -> String {
    format!("Welcome {}!", form.username)
}
```

Translated and polished into Axum, it becomes:

``` rust
pub async fn index(payload: Option<Form<FormData>>) -> impl IntoResponse {
    let username = payload.map_or("World".to_string(), move |index| -> String {
        String::from(&(index.username))
    });
    (StatusCode::OK, format!("Hello, {}!\n", &username))
}
```

The Axum version is a little smarter, providing a default "World" if you don't specify a name. That's what `.map_or` does, although the `or` part actually comes first in the argument list. So the result is:

``` sh
$ curl http://localhost:3000/
Hello, World!
$ curl 'http://localhost:3000/?username=Spock'
Hello, Spock!
```

Which is more or less the version we want.

**Section 3.7.3** then goes into some detail on the implementation of a Trait. A Trait in Rust is like an Interface in other languages; it describes a collection of functions for manipulating the values found in a defined Type.

**Types**: A Type is just a description of a value: `u16` is a sixteen-bit unsigned integer; `char` is a Unicode character, and its size is always 32 bits; but a `String` is a bunch of things: a buffer of UTF-8 bytes (which are not always one byte per character!), a length for that string, and a capacity. If the String is manipulated to exceed the capacity, the buffer is re-allocated with a new capacity and the old contents are copied into it.
A `Vec<String>` is an array of Strings; as a Type, it is considered to have a single value: whatever is in it at the moment you use it.

**Trait**: A Trait defines a collection of one or more functions that can operate on a value. The truly nifty thing about Traits is that they can be implemented after the fact. By importing a Trait and an implementation of that trait specific to a type into a module containing that type, you can extend the behavior of the type in a deterministic way without having to modify or inherit its code, as you would in an object-oriented language.

Axum has a valuable trait, `FromRequest`. For any structure you can imagine passing from the client to the server, you can implement `FromRequest` for that structure, and any content in the body of the message will be transformed into it.

We've seen a trait before: `IntoResponse`, written as `impl IntoResponse`, is the output (the return type) of many of the functions that produce return values for our application server. In this case the return type instructs Rust to look in the current lexical scope and determine whether an `IntoResponse` implementation has been defined for the value returned by the function. If it has, the value can be returned, because Axum is now assured that there exists a function to convert that value into something streamable and usable as an HTTP response.

Fortunately for us, Axum has already implemented `FromRequest` for all the native data types, as well as for some structures and arrays. Even better, it hooks into the Serde serialization/deserialization library, so anything deserializable by Serde can be extracted. So in this example:

``` rust
#[derive(serde::Deserialize)]
pub struct FormData {
    username: String,
}

pub async fn index(payload: Option<Form<FormData>>) -> impl IntoResponse { ...
```

A `Form` (something using the `application/x-www-form-urlencoded` protocol) of `FormData` will automatically be converted into a `payload` object of `{ username: "Spock" }`, and in this case wrapped in a `Some()` handler.
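The `Option` dance the extractor sets up, `Some(form)` when a body parsed and `None` when it didn't, is the same `map_or` pattern used in `index` above. It can be checked with plain `Option`s and no web framework at all (the `greeting` helper here is my own stand-in for the handler):

``` rust
// Sketch of the Option handling in `index`: map_or takes the fallback
// value first, then the closure applied to the Some case.
fn greeting(payload: Option<&str>) -> String {
    let username = payload.map_or("World".to_string(), |name| name.to_string());
    format!("Hello, {}!\n", username)
}

fn main() {
    // No form submitted: the fallback holds.
    assert_eq!(greeting(None), "Hello, World!\n");
    // Form submitted: the closure runs on the extracted value.
    assert_eq!(greeting(Some("Spock")), "Hello, Spock!\n");
}
```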
[[package]]
name = "base64"
version = "0.13.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "9e1b586273c5702936fe7b7d6896644d8be71e6314cfe09d3167c95f712589e8"
[[package]]
name = "bitflags"
version = "1.3.2"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "bef38d45163c2f1dde094a7dfd33ccf595c92905c8f8f4fdc18d06fb1037718a"
Implemented the forms reader, config, and database migrations. This chapter introduces the Actix "Extractors" for retrieving form data. I've added tests to the `./tests` folder to attempt to interact with those extractors; as of this commit, a89cbe5b, they fail because the example code isn't there. What is there is a variant of the "Hello, World!" code from the previous exercises (section 3.5), which uses the Actix extractor: ``` rust // Actix, *not* Axum. Does not work with the current framework. fn index(form: web::Form<FormData>) -> String { format!('Welcome {}!', form.username) } ``` Translated and polished into Axum, it translates to: ``` rust pub async fn index(payload: Option<Form<FormData>>) -> impl IntoResponse { let username = payload.map_or("World".to_string(), move |index| -> String { String::from(&(index.username)) }); (StatusCode::OK, format!("Hello, {}!\n", &username)) } ``` The Axum version is a little smarter, providing a default "World!" if you don't specify a name. That's what `.map_or` does, although the `or` part actually comes first in the function. So the result is: ``` sh $ curl http://localhost:3000/ Hello, World! $ curl 'http://localhost:3000/?username=Spock' Hello, Spock! ``` Which is more or less the version we want. **Section 3.7.3** then goes into some detail on the implementation of a Trait. A Trait in Rust is like an Interface in other languages; it describes a collection of functions for manipulating the values found in a defined Type. **Types**: A Type is just a description of the value: `u16` is a sixteen-bit unsigned integer; `char` is a Unicode character, and it's size is always 32bits, but a `String` is a bunch of things: it's an array of characters (which are not always `char`!), a length for that string and a capacity. If the String is manipulated to exceed the capacity, the array is re-allocated to a new capacity and the old array copied into it. 
A `Vec<String>` is an array of Strings; as a Type, it is considered to have a single value: whatever is in it at the moment you use it. **Trait**: A Trait defines a collection of one or more functions that can operate on a value. The truly nifty thing about Traits is that they can be implemented after the fact. By importing a Trait and an implementation of that trait specific to a type into a module containing that type, you can extend the behavior of a type in a deterministic way without having to modify or inherit the code, as you would in an object-oriented language. Axum has a valuable trait, `FromRequest`. For any structure you can imagine passing from the client to the server, you can implement `FromRequest` for that object and any content in the body of the message will be transformed into that structure. We've seen a trait before: `IntoResponse`, written as `impl IntoResponse`, and is the output (the return type) of many of the functions that produce return values for our application server. In this case the return type instructs Rust to look in the current lexical scope and, for the value returned by that function, determine if an `IntoResponse` trait has been defined for it. If it has, the value will be returned because Axum has now been assured that there exists a function to convert that value into something streamable and usable as an HTTP response. Fortunately for us, Axum has already implemented `FromRequest` for all the native data types, as well as some structures and arrays. Even better, it has implemented `FromRequest` for the Serde serialization/deserialization library. So in this example: ``` rust pub struct FormData { username: String, } pub async fn index(payload: Option<Form<FormData>>) -> impl IntoResponse { ... ``` A `Form` (something using the `application/x-www-form-urlencoded` protocol) of `FormData` will automatically be converted into a `payload` object of `{ username: "Spock" )`, and in this case wrapped in a `Some()` handler. 
(Or `None`, if there was no form included.)

<aside>So far, there's not too much bloat in this product; with all the debugging symbols it's 60MB or so, but stripped to the bone it's only 3.1MB, tolerable for modern deployments.</aside>

First, though, we must adjust our `valid_subscription` test:

``` rust
let body = "name=le%20guin&email=ursula_le_guin%40gmail.com";
let response = hyper::Client::new()
    .request(
        Request::builder()
            .method("POST")
            .header("content-type", "application/x-www-form-urlencoded")
            .uri(format!("http://{}/subscriptions", addr))
            .body(Body::from(body))
            .unwrap(),
    )
    .await
    .expect("Failed to execute request.");
```

Two updates from the book: first, we're sending the request via POST instead of GET. This is the correct way to do things; a GET should never (and I mean *never*) cause a change of state on the back end. To send something new that the server will process and store, you use a POST. (To update something, or to send something to a known *and unique* URI, PUT is better.) Secondly, since we're using a generic form-data object, we need to set the content-type on the client so that the server knows how to unpack this payload. The '%20' and '%40' markers in the `body` are the URL-encoded space and `@` characters, respectively.

I completely ignored the advice in the book and went instead with [Dbmate](https://github.com/amacneil/dbmate). Dbmate is a bit cranky: your SQL must be nestled right up against the 'up' and 'down' markers in the migration file, and it seems to be quite opinionated about everything being lowercase.
That said, it was trivial to create a database with it:

``` sh
$ dbmate new create_subscriptions_table
```

This will create the file `db/migrations/20230322174957_create_subscriptions_table.sql` (the timestamp will be different, obviously), and in this file you put the following, as specified in the book:

``` sql
-- migrate:up
create table subscriptions (
    id uuid not null,
    primary key (id),
    email text not null unique,
    name text not null,
    subscribed_at timestamptz not null
);

-- migrate:down
drop table subscriptions;
```

To use Dbmate, you have to specify how to connect. I'm using Postgres, so let's start with creating a database and a user for it:

``` sh
$ sudo -u postgres psql
[sudo] password for user: ...................
postgres=# create database newsletter;
CREATE DATABASE
postgres=# create user newsletter with encrypted password 'redacted';
CREATE USER
postgres=# grant all privileges on database newsletter to newsletter;
GRANT
postgres=# exit
```

In your project root, create a `.env` file to specify your connection:

``` sh
DATABASE_URL="postgres://newsletter:redacted@127.0.0.1:5432/newsletter?sslmode=disable"
```

The `sslmode` flag there is necessary for localhost connections, as Dbmate assumes an encrypted connection by default, but we're isolating to a local connection that is, usually, safe. With the new entry in your `.env` file, you can now run a migration:

``` sh
$ dbmate up
Writing: ./db/schema.sql
```

Running `dbmate up` will automatically create the database for you if it hasn't been created already; `dbmate migrate` also performs migrations, but it will not create the database.
Now you can re-connect to Postgres as the newsletter user and see what you've got:

``` sh
$ psql --user newsletter -h localhost --password
Password:
psql (14.7 (Ubuntu 14.7-0ubuntu0.22.04.1), server 11.7 (Ubuntu 11.7-0ubuntu0.19.10.1))

newsletter=> \d
              List of relations
 Schema |       Name        | Type  |   Owner
--------+-------------------+-------+------------
 public | schema_migrations | table | newsletter
 public | subscriptions     | table | newsletter
(2 rows)

newsletter=> \d subscriptions
                     Table "public.subscriptions"
    Column     |           Type           | Collation | Nullable | Default
---------------+--------------------------+-----------+----------+---------
 id            | uuid                     |           | not null |
 email         | text                     |           | not null |
 name          | text                     |           | not null |
 subscribed_at | timestamp with time zone |           | not null |
Indexes:
    "subscriptions_pkey" PRIMARY KEY, btree (id)
    "subscriptions_email_key" UNIQUE CONSTRAINT, btree (email)
```

Note that Dbmate has allocated a table to itself, `schema_migrations`, for tracking what it's done to your system and when. Try not to conflict with it, okay?

Every complex app has a configuration, and there are plenty of different ways the configuration can be specified: environment variables, internal defaults, and configuration files, the last of which come in many different flavors. Rust has a well-known [config](https://docs.rs/config/latest/config/index.html) crate that supports all the most common formats: YAML, JSON, TOML; you can even add your own by writing something that implements the `config::Format` trait. Add it to Cargo.toml:

``` sh
$ cargo add config
```

For the time being, we're just going to create a new file, called `configuration.rs`, and put our configuration details in there. Right now we have a single configuration detail: the port. I'm going to go above and beyond Lucas here and configure some internal defaults for my code. It will have expectations.
First, you have to tell Serde that the struct is deserializable, and that missing fields should fall back to default values:

``` rust
use config::Config;

#[derive(serde::Deserialize)]
#[serde(default)]
pub struct Settings {
    pub port: u16,
}
```

Then, you have to set those default values. Fortunately, Rust provides a "set default values" trait named, sensibly enough, `Default`:

``` rust
impl Default for Settings {
    fn default() -> Self {
        Settings { port: 3001 }
    }
}
```

Again, exceeding the book's parameters, I'm going to say that if the file is missing, the default parameters should hold:

``` rust
pub(crate) fn get_configuration() -> Result<Settings, config::ConfigError> {
    Config::builder()
        .add_source(config::File::with_name("./ztd.config").required(false))
        .build()?
        .try_deserialize()
}
```

And since this is the first time I'm doing this, I'm going to write a test to assert that my understanding of how this all works is correct:

``` rust
#[cfg(test)]
mod tests {
    use super::*;

    #[test]
    fn test_for_defaults() {
        let maybe_config = get_configuration();
        assert!(maybe_config.is_ok());
        let config = maybe_config.unwrap();
        assert_eq!(config.port, 3001);
    }
}
```
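Since `Default` is doing the heavy lifting here, a quick standalone sketch of the trait may help. This uses a stand-in struct with the same shape as the `Settings` above, plain std only, no serde or config involved:

``` rust
// A stand-in for the Settings struct above, free of crate dependencies.
#[derive(Debug, PartialEq)]
struct Settings {
    port: u16,
}

// Implementing Default by hand lets us pick a meaningful fallback;
// #[derive(Default)] would have given us port 0 instead.
impl Default for Settings {
    fn default() -> Self {
        Settings { port: 3001 }
    }
}

fn main() {
    // Default::default() is the value used in any "give me a default"
    // context, e.g. #[serde(default)] or unwrap_or_default().
    assert_eq!(Settings::default().port, 3001);

    let missing: Option<Settings> = None;
    assert_eq!(missing.unwrap_or_default(), Settings { port: 3001 });

    println!("default settings: {:?}", Settings::default());
}
```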
2023-03-24 14:51:19 +00:00
[[package]]
name = "block-buffer"
version = "0.10.4"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "3078c7629b62d3f0439517fa394996acacc5cbc91c5a20d8c658e77abd503a71"
dependencies = [
"generic-array",
]
[[package]]
name = "bumpalo"
version = "3.12.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "0d261e256854913907f67ed06efbc3338dfe6179796deefc1ff763fc1aee5535"
[[package]]
name = "byteorder"
version = "1.4.3"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "14c189c53d098945499cdfa7ecc63567cf3886b3332b312a5b4585d8d3a6a610"
This is straightforward and familiar territory:

```
fn app() -> Router {
    Router::new()
        .route("/", get(anon_greet))
        .route("/:name", get(greet))
        .route("/health_check", get(health_check))
}
```

We then define a function to *run* the core server:

```
pub async fn run() {
    let addr = SocketAddr::from(([127, 0, 0, 1], 3000));
    tracing::info!("listening on {}", addr);
    axum::Server::bind(&addr)
        .serve(app().into_make_service())
        .await
        .unwrap()
}
```

And finally, in a file named `src/main.rs`, we instantiate the server (the `#[tokio::main]` attribute is what makes an async `main` possible):

```
use ztp::run;

#[tokio::main]
async fn main() {
    run().await
}
```

To make this work, we need to define what `ztp` means, and make a distinction between the library and the CLI program. In the project root's `Cargo.toml` file, the first three sections are needed to define these relationships:

```
[package]
name = "ztp"
version = "0.1.0"
edition = "2021"

[lib]
path = "src/lib.rs"

[[bin]]
path = "src/main.rs"
name = "ztp"
```

It is the `name` field of the `[package]` section that defines how the `use` statement in `main.rs` will find the library. The `[[bin]]` clause defines the name of the binary when it is generated.

<aside>The double brackets around the `[[bin]]` clause are there to emphasize to the TOML parser that there can be more than one binary. There can be only one library crate per package, but a package may contain several binary crates, and a workspace may contain more than one package.</aside>

This project should now be runnable. In one window, type:

```
$ cargo run
```

And in another, type and see the replies:

```
$ curl http://localhost:3000/
Hello, World!

$ curl http://localhost:3000/Jim
He's dead, Jim!

$ curl -v http://localhost:3000/health_check
> GET /health_check HTTP/1.1
> Host: localhost:3000
> User-Agent: curl/7.81.0
> Accept: */*

< HTTP/1.1 200 OK
< content-length: 0
< date: Tue, 21 Mar 2023 00:16:43 GMT
```

In the last command, the *verbose* flag shows us what we sent to the server, and what came back.
We expected a "200 OK" status and a zero-length body, and that's what we got.

In order to unit-test a web server, we must spawn a copy of it in order to exercise its functions. We'll use Tokio's `spawn` function to create a new server, use hyper to request data from the server, and finally Rust's own native test asserts to check that we got what we expected.

```
mod tests {
    use super::*;
    use axum::{
        body::Body,
        http::{Request, StatusCode},
    };
    use std::net::{SocketAddr, TcpListener};

    #[tokio::test]
    async fn the_real_deal() {
        let listener = TcpListener::bind("127.0.0.1:0".parse::<SocketAddr>()
            .unwrap()).unwrap();
        let addr = listener.local_addr().unwrap();
        tokio::spawn(async move {
            axum::Server::from_tcp(listener)
                .unwrap()
                .serve(app().into_make_service())
                .await
                .unwrap();
        });
        let response = hyper::Client::new()
            .request(
                Request::builder().uri(format!("http://{}/", addr))
                    .body(Body::empty()).unwrap(),
            )
            .await
            .unwrap();
        let body = hyper::body::to_bytes(response.into_body()).await.unwrap();
        assert_eq!(&body[..], b"Hello World!\n");
    }
}
```

One interesting trick to observe in this test is the port number specified in the `TcpListener` call: it's zero. When the port is zero, the `TcpListener` will request the first free port from the kernel. Normally you'd want to know exactly what port to call the server on, but in this case both ends of the communication are aware of the port to use, and we want to ensure that the port isn't hard-coded and inconveniently already in use by someone else.
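The port-zero behavior is plain std, so it can be checked in isolation; a small sketch:

```
use std::net::TcpListener;

fn main() {
    // Port 0 tells the kernel: pick any free ephemeral port for me.
    let listener = TcpListener::bind("127.0.0.1:0").expect("bind failed");
    let port = listener.local_addr().expect("no local addr").port();

    // The kernel reports back the real port it chose; it is never 0.
    assert_ne!(port, 0);

    // While the first listener is alive, a second port-0 bind gets a
    // different port, so concurrently spawned test servers can't collide.
    let second = TcpListener::bind("127.0.0.1:0").expect("bind failed");
    let second_port = second.local_addr().unwrap().port();
    assert_ne!(second_port, port);

    println!("kernel assigned ports {} and {}", port, second_port);
}
```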
2023-03-21 00:31:39 +00:00
[[package]]
name = "bytes"
version = "1.4.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "89b2fd2a0dcf38d7971e2194b6b6eebab45ae01067456a7fd93d5547a61b70be"
[[package]]
name = "cc"
version = "1.0.79"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "50d30906286121d95be3d479533b458f87493b30a4b5f79a607db8f5d11aa91f"
[[package]]
name = "cfg-if"
version = "1.0.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "baf1de4339761588bc0619e3cbc0120ee582ebb74b53b4efbf79117bd2da40fd"
[[package]]
name = "chrono"
version = "0.4.24"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "4e3c5919066adf22df73762e50cffcde3a758f2a848b113b586d1f86728b673b"
dependencies = [
"iana-time-zone",
"js-sys",
"num-integer",
"num-traits",
"serde",
"time",
"wasm-bindgen",
"winapi",
]
[[package]]
name = "codespan-reporting"
version = "0.11.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "3538270d33cc669650c4b093848450d380def10c331d38c768e34cac80576e6e"
dependencies = [
"termcolor",
"unicode-width",
]
[[package]]
name = "config"
version = "0.13.3"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "d379af7f68bfc21714c6c7dea883544201741d2ce8274bb12fa54f89507f52a7"
dependencies = [
"async-trait",
"json5",
"lazy_static",
"nom",
"pathdiff",
"ron",
"rust-ini",
"serde",
"serde_json",
"toml",
"yaml-rust",
]
[[package]]
name = "core-foundation"
version = "0.9.3"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "194a7a9e6de53fa55116934067c844d9d749312f75c6f6d0980e8c252f8c2146"
dependencies = [
"core-foundation-sys",
"libc",
]
[[package]]
name = "core-foundation-sys"
version = "0.8.3"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "5827cebf4670468b8772dd191856768aedcb1b0278a04f989f7766351917b9dc"
[[package]]
name = "cpufeatures"
version = "0.2.5"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "28d997bd5e24a5928dd43e46dc529867e207907fe0b239c3477d924f7f2ca320"
dependencies = [
"libc",
]
[[package]]
name = "crc"
version = "3.0.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "86ec7a15cbe22e59248fc7eadb1907dab5ba09372595da4d73dd805ed4417dfe"
dependencies = [
"crc-catalog",
]
[[package]]
name = "crc-catalog"
version = "2.2.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "9cace84e55f07e7301bae1c519df89cdad8cc3cd868413d3fdbdeca9ff3db484"
[[package]]
name = "crossbeam-queue"
version = "0.3.8"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "d1cfb3ea8a53f37c40dea2c7bedcbd88bdfae54f5e2175d6ecaff1c988353add"
dependencies = [
"cfg-if",
"crossbeam-utils",
]
[[package]]
name = "crossbeam-utils"
version = "0.8.15"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "3c063cd8cc95f5c377ed0d4b49a4b21f632396ff690e8470c29b3359b346984b"
dependencies = [
"cfg-if",
]
[[package]]
name = "crypto-common"
version = "0.1.6"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "1bfb12502f3fc46cca1bb51ac28df9d618d813cdc3d2f25b9fe775a34af26bb3"
dependencies = [
"generic-array",
"typenum",
]
[[package]]
name = "cxx"
version = "1.0.93"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "a9c00419335c41018365ddf7e4d5f1c12ee3659ddcf3e01974650ba1de73d038"
dependencies = [
"cc",
"cxxbridge-flags",
"cxxbridge-macro",
"link-cplusplus",
]
[[package]]
name = "cxx-build"
version = "1.0.93"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "fb8307ad413a98fff033c8545ecf133e3257747b3bae935e7602aab8aa92d4ca"
dependencies = [
"cc",
"codespan-reporting",
"once_cell",
"proc-macro2",
"quote",
"scratch",
"syn 2.0.4",
]
[[package]]
name = "cxxbridge-flags"
version = "1.0.93"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "edc52e2eb08915cb12596d29d55f0b5384f00d697a646dbd269b6ecb0fbd9d31"
[[package]]
name = "cxxbridge-macro"
version = "1.0.93"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "631569015d0d8d54e6c241733f944042623ab6df7bc3be7466874b05fcdb1c5f"
dependencies = [
"proc-macro2",
"quote",
"syn 2.0.4",
]
[[package]]
name = "digest"
version = "0.10.6"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "8168378f4e5023e7218c89c891c0fd8ecdb5e5e4f18cb78f38cf245dd021e76f"
dependencies = [
"block-buffer",
"crypto-common",
"subtle",
]
[[package]]
name = "dirs"
version = "4.0.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "ca3aa72a6f96ea37bbc5aa912f6788242832f75369bdfdadcb0e38423f100059"
dependencies = [
"dirs-sys",
]
[[package]]
name = "dirs-sys"
version = "0.3.7"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "1b1d1d91c932ef41c0f2663aa8b0ca0342d444d842c06914aa0a7e352d0bada6"
dependencies = [
"libc",
"redox_users",
"winapi",
]
[[package]]
name = "dlv-list"
version = "0.3.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "0688c2a7f92e427f44895cd63841bff7b29f8d7a1648b9e7e07a4a365b2e1257"
[[package]]
name = "dotenvy"
version = "0.15.7"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "1aaf95b3e5c8f23aa320147307562d361db0ae0d51242340f558153b4eb2439b"
[[package]]
name = "either"
version = "1.8.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "7fcaabb2fef8c910e7f4c7ce9f67a1283a1715879a7c230ca9d6d1ae31f16d91"
[[package]]
name = "errno"
version = "0.2.8"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "f639046355ee4f37944e44f60642c6f3a7efa3cf6b78c78a0d989a8ce6c396a1"
dependencies = [
"errno-dragonfly",
"libc",
"winapi",
]
[[package]]
name = "errno-dragonfly"
version = "0.1.2"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "aa68f1b12764fab894d2755d2518754e71b4fd80ecfb822714a1206c2aab39bf"
dependencies = [
"cc",
"libc",
]
[[package]]
name = "event-listener"
version = "2.5.3"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "0206175f82b8d6bf6652ff7d71a1e27fd2e4efde587fd368662814d6ec1d9ce0"
[[package]]
name = "fastrand"
version = "1.9.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "e51093e27b0797c359783294ca4f0a911c270184cb10f85783b118614a1501be"
dependencies = [
"instant",
]
This is straightforward and familiar territory:

``` rust
fn app() -> Router {
    Router::new()
        .route("/", get(anon_greet))
        .route("/:name", get(greet))
        .route("/health_check", get(health_check))
}
```

We then define a function to *run* the core server:

``` rust
pub async fn run() {
    let addr = SocketAddr::from(([127, 0, 0, 1], 3000));
    tracing::info!("listening on {}", addr);
    axum::Server::bind(&addr)
        .serve(app().into_make_service())
        .await
        .unwrap()
}
```

And finally, in a file named `src/main.rs`, we instantiate the server. Note the `#[tokio::main]` attribute; without it, an `async fn main` will not compile:

``` rust
use ztp::run;

#[tokio::main]
async fn main() {
    run().await
}
```

To make this work, we need to define what `ztp` means and distinguish the library from the CLI program. In the project root's `Cargo.toml` file, the first three sections define these relationships:

``` toml
[package]
name = "ztp"
version = "0.1.0"
edition = "2021"

[lib]
path = "src/lib.rs"

[[bin]]
path = "src/main.rs"
name = "ztp"
```

It is the `package.name` key that determines how the `use` statement in `main.rs` finds the library. The `[[bin]]` clause sets the name of the binary when it is generated.

<aside>The double brackets around `[[bin]]` tell the TOML parser that this is an array of tables: there can be more than one binary. A package can contain at most one library crate, though a Rust workspace may contain more than one package.</aside>

This project should now be runnable. In one window, type:

``` sh
$ cargo run
```

And in another, type and see the replies:

``` sh
$ curl http://localhost:3000/
Hello World!
$ curl http://localhost:3000/Jim
He's dead, Jim!
$ curl -v http://localhost:3000/health_check
> GET /health_check HTTP/1.1
> Host: localhost:3000
> User-Agent: curl/7.81.0
> Accept: */*
< HTTP/1.1 200 OK
< content-length: 0
< date: Tue, 21 Mar 2023 00:16:43 GMT
```

In the last command, the *verbose* flag shows us what we sent to the server and what came back.
We expected a "200 OK" status and a zero-length body, and that's what we got.

In order to unit-test a web server, we must spawn a copy of it to exercise its functions. We'll use Tokio's `spawn` function to create a new server, use hyper to request data from the server, and finally Rust's own native test asserts to check that we got what we expected:

``` rust
mod tests {
    use super::*;
    use axum::{
        body::Body,
        http::{Request, StatusCode},
    };
    use std::net::{SocketAddr, TcpListener};

    #[tokio::test]
    async fn the_real_deal() {
        let listener =
            TcpListener::bind("127.0.0.1:0".parse::<SocketAddr>().unwrap()).unwrap();
        let addr = listener.local_addr().unwrap();
        tokio::spawn(async move {
            axum::Server::from_tcp(listener)
                .unwrap()
                .serve(app().into_make_service())
                .await
                .unwrap();
        });
        let response = hyper::Client::new()
            .request(
                Request::builder()
                    .uri(format!("http://{}/", addr))
                    .body(Body::empty())
                    .unwrap(),
            )
            .await
            .unwrap();
        let body = hyper::body::to_bytes(response.into_body()).await.unwrap();
        assert_eq!(&body[..], b"Hello World!\n");
    }
}
```

One interesting trick to observe in this test is the port number specified in the `TcpListener` call: it's zero. When the port is zero, the `TcpListener` asks the kernel for the first free port. Normally you'd want to know exactly which port to call the server on, but here both ends of the conversation learn the port at runtime, so it isn't hard-coded and can't already be inconveniently in use by someone else.
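The port-zero trick described above needs nothing but the standard library to demonstrate; a minimal sketch:

```rust
use std::net::TcpListener;

fn main() {
    // Binding to port 0 asks the kernel for the first free port.
    let listener = TcpListener::bind("127.0.0.1:0").unwrap();
    // local_addr() reveals the port the kernel actually assigned.
    let port = listener.local_addr().unwrap().port();
    // The kernel never hands back port 0 itself.
    assert_ne!(port, 0);
    println!("kernel assigned port {}", port);
}
```

Each run will typically print a different port, which is exactly what we want for tests running in parallel.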
2023-03-21 00:31:39 +00:00
[[package]]
name = "fnv"
version = "1.0.7"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "3f9eec918d3f24069decb9af1554cad7c880e2da24a9afd88aca000531ab82c1"
Implemented the forms reader, config, and database migrations.

This chapter introduces the Actix "Extractors" for retrieving form data. I've added tests to the `./tests` folder to attempt to interact with those extractors; as of this commit, a89cbe5b, they fail because the example code isn't there. What is there is a variant of the "Hello, World!" code from the previous exercises (section 3.5), which uses the Actix extractor:

``` rust
// Actix, *not* Axum. Does not work with the current framework.
fn index(form: web::Form<FormData>) -> String {
    format!("Welcome {}!", form.username)
}
```

Translated and polished into Axum, it becomes:

``` rust
pub async fn index(payload: Option<Form<FormData>>) -> impl IntoResponse {
    let username = payload.map_or("World".to_string(), move |index| -> String {
        String::from(&(index.username))
    });
    (StatusCode::OK, format!("Hello, {}!\n", &username))
}
```

The Axum version is a little smarter, providing a default "World" if you don't specify a name. That's what `.map_or` does, although the `or` part (the default) actually comes first in the argument list. So the result is:

``` sh
$ curl http://localhost:3000/
Hello, World!
$ curl 'http://localhost:3000/?username=Spock'
Hello, Spock!
```

Which is more or less the version we want.

**Section 3.7.3** then goes into some detail on the implementation of a Trait. A Trait in Rust is like an Interface in other languages; it describes a collection of functions for manipulating the values found in a defined Type.

**Types**: A Type is just a description of the value: `u16` is a sixteen-bit unsigned integer; `char` is a Unicode scalar value whose size is always 32 bits; but a `String` is a bundle of things: a buffer of UTF-8 bytes (not simply an array of `char`s!), a length for that string, and a capacity. If the String is manipulated to exceed its capacity, the buffer is re-allocated with a larger capacity and the old contents copied in.
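The `.map_or` inversion mentioned above — default first, mapping closure second — can be seen in isolation, outside any web framework. A standalone sketch (the `greet_or_default` helper is invented here purely for illustration):

```rust
// Mimics the handler's logic: use the supplied name, or fall back to "World".
fn greet_or_default(name: Option<&str>) -> String {
    // map_or takes the default ("or") value as its FIRST argument,
    // and the mapping closure as its second.
    name.map_or("World".to_string(), |n| n.to_string())
}

fn main() {
    assert_eq!(greet_or_default(None), "World");
    assert_eq!(greet_or_default(Some("Spock")), "Spock");
    println!("Hello, {}!", greet_or_default(Some("Spock")));
}
```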
A `Vec<String>` is an array of Strings; as a Type, it is considered to have a single value: whatever is in it at the moment you use it.

**Trait**: A Trait defines a collection of one or more functions that can operate on a value. The truly nifty thing about Traits is that they can be implemented after the fact. By importing a Trait, and an implementation of that Trait for a specific type, into a module containing that type, you can extend the behavior of the type in a deterministic way without having to modify or inherit its code, as you would in an object-oriented language.

Axum has a valuable trait, `FromRequest`. For any structure you can imagine passing from the client to the server, you can implement `FromRequest` for that object, and any content in the body of the message will be transformed into that structure.

We've seen a trait before: `IntoResponse`, written as `impl IntoResponse`, is the output (the return type) of many of the functions that produce return values for our application server. In this case the return type instructs Rust to look in the current lexical scope and determine whether an `IntoResponse` implementation has been defined for the value returned by the function. If it has, the value can be returned, because Axum is now assured that there exists a function to convert that value into something streamable and usable as an HTTP response.

Fortunately for us, Axum has already implemented `FromRequest` for all the native data types, as well as some structures and arrays. Even better, it integrates with the Serde serialization/deserialization library. So in this example:

``` rust
pub struct FormData {
    username: String,
}

pub async fn index(payload: Option<Form<FormData>>) -> impl IntoResponse {
    ...
```

A `Form` (something using the `application/x-www-form-urlencoded` protocol) of `FormData` will automatically be converted into a `payload` object of `{ username: "Spock" }`, wrapped in this case in `Some()`.
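The "implemented after the fact" property is easy to demonstrate with a toy trait of our own. The `Shout` trait below is invented purely for illustration; the point is that `str` gains a new method without its source being touched:

```rust
// A trait we define ourselves...
trait Shout {
    fn shout(&self) -> String;
}

// ...and then implement for a type we did not write: the standard `str`.
impl Shout for str {
    fn shout(&self) -> String {
        format!("{}!", self.to_uppercase())
    }
}

fn main() {
    // The method is only callable in scopes where the trait is visible.
    assert_eq!("hello".shout(), "HELLO!");
    println!("{}", "hello".shout());
}
```

This is the same mechanism Axum relies on: `FromRequest` and `IntoResponse` are implemented for types the Axum authors never saw.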
(Or `None`, if there was no form included.)

<aside>So far, there's not too much bloat in this product; with all the debugging symbols it's 60MB or so, but stripped to the bone it's only 3.1MB, tolerable for modern deployments.</aside>

First, though, we must adjust our `valid_subscription` test:

``` rust
let body = "name=le%20guin&email=ursula_le_guin%40gmail.com";
let response = hyper::Client::new()
    .request(
        Request::builder()
            .method("POST")
            .header("content-type", "application/x-www-form-urlencoded")
            .uri(format!("http://{}/subscriptions", addr))
            .body(Body::from(body))
            .unwrap(),
    )
    .await
    .expect("Failed to execute request.");
```

Two updates from the book: first, we're sending it via POST instead of GET. This is the correct way to do things; a GET should never (and I mean *never*) cause a change of state on the back-end. To send something new that the server will process and store, you use a POST. (To update something, or to send something to a known *and unique* URI, PUT is better.) Secondly, since we're using a generic form-data object, we need to set the content-type header on the client so that the server knows how to unpack the payload. The `%20` and `%40` escapes in the `body` are the space and the `@`, respectively.

I completely ignored the advice in the book and went instead with [Dbmate](https://github.com/amacneil/dbmate). Dbmate is a bit cranky: your SQL must be nestled right up against the `up` and `down` markers in the migration file, and it seems to be quite opinionated about everything being lowercase.
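The `%XX` escapes mentioned above are ordinary percent-encoding, and decoding them is mechanical. A hand-rolled decoder, purely as a sketch — Axum's `Form` extractor does this for you, and this toy version deliberately skips details like `+`-for-space handling:

```rust
// Decode %XX escapes in a form-urlencoded value. Malformed escapes are passed
// through untouched; non-UTF-8 results are rejected with unwrap for brevity.
fn percent_decode(s: &str) -> String {
    let bytes = s.as_bytes();
    let mut out = Vec::new();
    let mut i = 0;
    while i < bytes.len() {
        if bytes[i] == b'%' && i + 2 < bytes.len() {
            // Try to read the two hex digits after '%'.
            if let Ok(b) = u8::from_str_radix(&s[i + 1..i + 3], 16) {
                out.push(b);
                i += 3;
                continue;
            }
        }
        out.push(bytes[i]);
        i += 1;
    }
    String::from_utf8(out).unwrap()
}

fn main() {
    assert_eq!(percent_decode("le%20guin"), "le guin");
    assert_eq!(percent_decode("ursula_le_guin%40gmail.com"), "ursula_le_guin@gmail.com");
    println!("ok");
}
```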
That said, it was trivial to create a database with it:

``` sh
$ dbmate new create_subscriptions_table
```

This creates the file `db/migrations/20230322174957_create_subscriptions_table.sql` (the timestamp will be different, obviously), and in this file you put the following, as specified in the book:

``` sql
-- migrate:up
create table subscriptions (
    id uuid not null,
    primary key (id),
    email text not null unique,
    name text not null,
    subscribed_at timestamptz not null
);

-- migrate:down
drop table subscriptions;
```

To use Dbmate, you have to specify how to connect. I'm using Postgres, so let's start by creating a database and a user for it:

``` sh
$ sudo -u postgres psql
[sudo] password for user: ...................
postgres=# create database newsletter;
CREATE DATABASE
postgres=# create user newsletter with encrypted password 'redacted';
CREATE USER
postgres=# grant all privileges on database newsletter to newsletter;
GRANT
postgres=# exit
```

In your project root, create a `.env` file to specify your connection:

``` sh
DATABASE_URL="postgres://newsletter:redacted@127.0.0.1:5432/newsletter?sslmode=disable"
```

The `sslmode` flag is necessary for localhost connections, as Dbmate assumes an encrypted connection by default, but we're isolating to a local connection that is, usually, safe. With the new entry in your `.env` file, you can now run a migration:

``` sh
$ dbmate up
Writing: ./db/schema.sql
```

Running `dbmate up` will automatically create the database for you if it doesn't already exist; `dbmate migrate` also performs migrations, but it will not create the database.
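On the application side, the same `DATABASE_URL` can be read straight from the environment. A standard-library sketch — the `database_url` helper and its fallback value are my own invention, and loading the `.env` file itself is the job of a crate like the lockfile's `dotenvy`:

```rust
use std::env;

// Read the connection string from the environment, falling back to an
// assumed local default that mirrors the .env entry above.
fn database_url() -> String {
    env::var("DATABASE_URL").unwrap_or_else(|_| {
        "postgres://127.0.0.1:5432/newsletter?sslmode=disable".to_string()
    })
}

fn main() {
    println!("connecting to {}", database_url());
}
```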
Now you can re-connect to Postgres as the newsletter user and see what you've got:

``` sh
$ psql --user newsletter -h localhost --password
Password:
psql (14.7 (Ubuntu 14.7-0ubuntu0.22.04.1), server 11.7 (Ubuntu 11.7-0ubuntu0.19.10.1))

newsletter=> \d
              List of relations
 Schema |       Name        | Type  |   Owner
--------+-------------------+-------+------------
 public | schema_migrations | table | newsletter
 public | subscriptions     | table | newsletter
(2 rows)

newsletter=> \d subscriptions
                     Table "public.subscriptions"
    Column     |           Type           | Collation | Nullable | Default
---------------+--------------------------+-----------+----------+---------
 id            | uuid                     |           | not null |
 email         | text                     |           | not null |
 name          | text                     |           | not null |
 subscribed_at | timestamp with time zone |           | not null |
Indexes:
    "subscriptions_pkey" PRIMARY KEY, btree (id)
    "subscriptions_email_key" UNIQUE CONSTRAINT, btree (email)
```

Note that Dbmate has allocated a table to itself, `schema_migrations`, for tracking what it's done to your system and when. Try not to conflict with it, okay?

Every complex app has a configuration, and there are plenty of different ways the configuration can be specified: environment variables, internal defaults, and configuration files, the last of which come in many different flavors. Rust has a well-known [config](https://docs.rs/config/latest/config/index.html) crate that supports all the most common formats: YAML, JSON, TOML; you can even add your own by writing something that implements the `config::Format` trait. Add it to Cargo.toml:

``` sh
$ cargo add config
```

For the meantime, we're just going to create a new file, called `configuration.rs`, and put our configuration details in there. Right now we have a single configuration detail: the port. I'm going to go above and beyond Lucas here and configure some internal defaults for my code. It will have expectations.
First, you have to tell Serde that the struct is deserializable and that there will be default values:

``` rust
use config::Config;

#[derive(serde::Deserialize)]
#[serde(default)]
pub struct Settings {
    pub port: u16,
}
```

Then, you have to set those default values. Fortunately, Rust provides a "set default values" trait named, sensibly enough, `Default`:

``` rust
impl Default for Settings {
    fn default() -> Self {
        Settings { port: 3001 }
    }
}
```

Again exceeding the book's parameters, I'm going to say that if the file is missing, the default parameters should hold:

``` rust
pub(crate) fn get_configuration() -> Result<Settings, config::ConfigError> {
    Config::builder()
        .add_source(config::File::with_name("./ztd.config").required(false))
        .build()?
        .try_deserialize()
}
```

And since this is the first time I'm doing this, I'm going to write a test to assert that my understanding of how this all works is correct:

``` rust
mod tests {
    use super::*;

    #[test]
    fn test_for_defaults() {
        let maybe_config = get_configuration();
        assert!(maybe_config.is_ok());
        let config = maybe_config.unwrap();
        assert_eq!(config.port, 3001);
    }
}
```
2023-03-24 14:51:19 +00:00
[[package]]
name = "foreign-types"
version = "0.3.2"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "f6f339eb8adc052cd2ca78910fda869aefa38d22d5cb648e6485e4d3fc06f3b1"
dependencies = [
"foreign-types-shared",
]
[[package]]
name = "foreign-types-shared"
version = "0.1.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "00b0228411908ca8685dba7fc2cdd70ec9990a6e753e89b6ac91a84c40fbaf4b"
[[package]]
name = "form_urlencoded"
version = "1.1.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "a9c384f161156f5260c24a097c56119f9be8c798586aecc13afbcbe7b7e26bf8"
dependencies = [
"percent-encoding",
]
[[package]]
name = "futures-channel"
version = "0.3.27"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "164713a5a0dcc3e7b4b1ed7d3b433cabc18025386f9339346e8daf15963cf7ac"
dependencies = [
"futures-core",
First, you have to tell Serde that there will be default values: ``` rust use config::Config; pub struct Settings { pub port: u16, } ``` Then, you have to set those default values. Fortunately, Rust provides a "set default values" trait named, sensibly enough, Default: ``` rust impl Default for Settings { fn default() -> Self { Settings { port: 3001 } } } ``` Again, exceeding the book's parameters, I'm going to say that if the file is missing the default parameters should hold: ``` rust pub(crate) fn get_configuration() -> Result<Settings, config::ConfigError> { Config::builder() .add_source(config::File::with_name("./ztd.config").required(false)) .build()? .try_deserialize() } ``` And since this is the first time I'm doing this, I'm going to write a test to assert that my understanding of how this all works is correct: ``` rust mod tests { use super::*; #[test] fn test_for_defaults() { let maybe_config = get_configuration(); assert!(!maybe_config.is_err()); let config = maybe_config.unwrap(); assert_eq!(config.port, 3001); } } ```
2023-03-24 14:51:19 +00:00
"futures-sink",
]
[[package]]
name = "futures-core"
version = "0.3.27"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "86d7a0c1aa76363dac491de0ee99faf6941128376f1cf96f07db7603b7de69dd"
[[package]]
name = "futures-intrusive"
version = "0.4.2"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "a604f7a68fbf8103337523b1fadc8ade7361ee3f112f7c680ad179651616aed5"
dependencies = [
"futures-core",
"lock_api",
"parking_lot 0.11.2",
]
[[package]]
name = "futures-sink"
version = "0.3.27"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "ec93083a4aecafb2a80a885c9de1f0ccae9dbd32c2bb54b0c3a65690e0b8d2f2"
[[package]]
name = "futures-task"
version = "0.3.27"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "fd65540d33b37b16542a0438c12e6aeead10d4ac5d05bd3f805b8f35ab592879"
[[package]]
name = "futures-util"
version = "0.3.27"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "3ef6b17e481503ec85211fed8f39d1970f128935ca1f814cd32ac4a6842e84ab"
dependencies = [
"futures-core",
Implemented the forms reader, config, and database migrations. This chapter introduces the Actix "Extractors" for retrieving form data. I've added tests to the `./tests` folder to attempt to interact with those extractors; as of this commit, a89cbe5b, they fail because the example code isn't there. What is there is a variant of the "Hello, World!" code from the previous exercises (section 3.5), which uses the Actix extractor: ``` rust // Actix, *not* Axum. Does not work with the current framework. fn index(form: web::Form<FormData>) -> String { format!('Welcome {}!', form.username) } ``` Translated and polished into Axum, it translates to: ``` rust pub async fn index(payload: Option<Form<FormData>>) -> impl IntoResponse { let username = payload.map_or("World".to_string(), move |index| -> String { String::from(&(index.username)) }); (StatusCode::OK, format!("Hello, {}!\n", &username)) } ``` The Axum version is a little smarter, providing a default "World!" if you don't specify a name. That's what `.map_or` does, although the `or` part actually comes first in the function. So the result is: ``` sh $ curl http://localhost:3000/ Hello, World! $ curl 'http://localhost:3000/?username=Spock' Hello, Spock! ``` Which is more or less the version we want. **Section 3.7.3** then goes into some detail on the implementation of a Trait. A Trait in Rust is like an Interface in other languages; it describes a collection of functions for manipulating the values found in a defined Type. **Types**: A Type is just a description of the value: `u16` is a sixteen-bit unsigned integer; `char` is a Unicode character, and it's size is always 32bits, but a `String` is a bunch of things: it's an array of characters (which are not always `char`!), a length for that string and a capacity. If the String is manipulated to exceed the capacity, the array is re-allocated to a new capacity and the old array copied into it. 
A `Vec<String>` is an array of Strings; as a Type, it is considered to have a single value: whatever is in it at the moment you use it.

**Trait**: A Trait defines a collection of one or more functions that can operate on a value. The truly nifty thing about Traits is that they can be implemented after the fact. By importing a Trait, and an implementation of that Trait specific to a Type, into a module containing that Type, you can extend the behavior of the Type in a deterministic way without having to modify or inherit from its code, as you would in an object-oriented language.

Axum has a valuable trait, `FromRequest`. For any structure you can imagine passing from the client to the server, you can implement `FromRequest` for that structure, and any content in the body of the message will be transformed into it.

We've seen a trait before: `IntoResponse`, written as `impl IntoResponse`, which is the return type of many of the functions that produce responses for our application server. That return type instructs Rust to look in the current lexical scope and determine whether an `IntoResponse` implementation has been defined for the value returned by the function. If it has, the value can be returned, because Axum is now assured that there exists a function to convert that value into something streamable and usable as an HTTP response.

Fortunately for us, Axum has already implemented `FromRequest` for all the native data types, as well as for some structures and arrays. Even better, extractors like `Form` work with anything that implements `Deserialize` from Serde, Rust's standard serialization/deserialization library. So in this example:

``` rust
#[derive(serde::Deserialize)]
pub struct FormData {
    username: String,
}

pub async fn index(payload: Option<Form<FormData>>) -> impl IntoResponse { ...
```

A `Form` (a body using the `application/x-www-form-urlencoded` encoding) of `FormData` will automatically be converted into a `payload` object of `FormData { username: "Spock" }`, in this case wrapped in `Some()`.
(Or `None`, if there was no form included.)

<aside>So far, there's not too much bloat in this product; with all the debugging symbols it's 60MB or so, but stripped to the bone it's only 3.1MB, tolerable for modern deployments.</aside>

First, though, we must adjust our `valid_subscription` test:

``` rust
let body = "name=le%20guin&email=ursula_le_guin%40gmail.com";
let response = hyper::Client::new()
    .request(
        Request::builder()
            .method("POST")
            .header("content-type", "application/x-www-form-urlencoded")
            .uri(format!("http://{}/subscriptions", addr))
            .body(Body::from(body))
            .unwrap(),
    )
    .await
    .expect("Failed to execute request.");
```

Two updates from the book: first, we're sending it via POST instead of GET. This is the correct way to do things; a GET should never (and I mean *never*) cause a change of state on the back-end. To send something new that the server will process and store, you use a POST. (To update something, or to send something to a known *and unique* URI, PUT is better.) Secondly, since we're using a generic form-data object, we need to set the content-type header on the client so that the server knows how to unpack this payload. The `%20` and `%40` markers in the `body` are the URL-encoded space and `@` characters, respectively.

I completely ignored the advice in the book and went instead with [Dbmate](https://github.com/amacneil/dbmate). Dbmate is a bit cranky: your SQL must be nestled right up against the `migrate:up` and `migrate:down` markers in the migration file, and it seems to be quite opinionated about everything being lowercase.
That said, it was trivial to create a database with it:

``` sh
$ dbmate new create_subscriptions_table
```

This will create the file `db/migrations/20230322174957_create_subscriptions_table.sql` (the timestamp will be different, obviously), and in this file you put the following, as specified in the book:

``` sql
-- migrate:up
create table subscriptions (
    id uuid not null,
    primary key (id),
    email text not null unique,
    name text not null,
    subscribed_at timestamptz not null
);

-- migrate:down
drop table subscriptions;
```

To use Dbmate, you have to specify how to connect. I'm using Postgres, so let's start with creating a database and a user for it:

``` sh
$ sudo -u postgres psql
[sudo] password for user: ...................
postgres=# create database newsletter;
CREATE DATABASE
postgres=# create user newsletter with encrypted password 'redacted';
CREATE USER
postgres=# grant all privileges on database newsletter to newsletter;
GRANT
postgres=# exit
```

In your project root, create a `.env` file to specify your connection:

``` sh
DATABASE_URL="postgres://newsletter:redacted@127.0.0.1:5432/newsletter?sslmode=disable"
```

The `sslmode` flag there is necessary for localhost connections, as Dbmate assumes an encrypted connection by default, but we're isolating to a local connection that is, usually, safe. With the new entry in your `.env` file, you can now run a migration:

``` sh
$ dbmate up
Writing: ./db/schema.sql
```

Running `dbmate up` will automatically create the database for you if it hasn't already been created; `dbmate migrate` also performs migrations, but it will not create the database.
Now you can re-connect to Postgres as the newsletter user and see what you've got:

``` sh
$ psql --user newsletter -h localhost --password
Password:
psql (14.7 (Ubuntu 14.7-0ubuntu0.22.04.1), server 11.7 (Ubuntu 11.7-0ubuntu0.19.10.1))

newsletter=> \d
              List of relations
 Schema |       Name        | Type  |   Owner
--------+-------------------+-------+------------
 public | schema_migrations | table | newsletter
 public | subscriptions     | table | newsletter
(2 rows)

newsletter=> \d subscriptions
                     Table "public.subscriptions"
    Column     |           Type           | Collation | Nullable | Default
---------------+--------------------------+-----------+----------+---------
 id            | uuid                     |           | not null |
 email         | text                     |           | not null |
 name          | text                     |           | not null |
 subscribed_at | timestamp with time zone |           | not null |
Indexes:
    "subscriptions_pkey" PRIMARY KEY, btree (id)
    "subscriptions_email_key" UNIQUE CONSTRAINT, btree (email)
```

Note that Dbmate has allocated a table to itself, `schema_migrations`, for tracking what it's done to your system and when. Try not to conflict with it, okay?

Every complex app has a configuration, and there are plenty of different ways the configuration can be specified: environment variables, internal defaults, and configuration files, the last of which come in many different flavors. Rust has a well-known [config](https://docs.rs/config/latest/config/index.html) crate that supports all the most common formats: YAML, JSON, TOML; you can even add your own by writing something that implements the `config::Format` trait. Add it to Cargo.toml:

``` sh
$ cargo add config
```

For the moment, we're just going to create a new file, called `configuration.rs`, and put our configuration details in there. Right now we have a single configuration detail: the port. I'm going to go above and beyond Lucas here and configure some internal defaults for my code. It will have expectations.
First, you have to tell Serde that there will be default values:

``` rust
use config::Config;

#[derive(serde::Deserialize)]
#[serde(default)]
pub struct Settings {
    pub port: u16,
}
```

Then, you have to set those default values. Fortunately, Rust provides a "set default values" trait named, sensibly enough, `Default`:

``` rust
impl Default for Settings {
    fn default() -> Self {
        Settings { port: 3001 }
    }
}
```

Again, exceeding the book's parameters, I'm going to say that if the file is missing, the default values should hold:

``` rust
pub(crate) fn get_configuration() -> Result<Settings, config::ConfigError> {
    Config::builder()
        .add_source(config::File::with_name("./ztd.config").required(false))
        .build()?
        .try_deserialize()
}
```

And since this is the first time I'm doing this, I'm going to write a test to assert that my understanding of how this all works is correct:

``` rust
#[cfg(test)]
mod tests {
    use super::*;

    #[test]
    fn test_for_defaults() {
        let maybe_config = get_configuration();
        assert!(maybe_config.is_ok());

        let config = maybe_config.unwrap();
        assert_eq!(config.port, 3001);
    }
}
```
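The `Default` half of that machinery can be exercised on its own, without the config crate. A minimal sketch, where `load_or_default` is a hypothetical stand-in for "no configuration file was found, fall back to defaults":

```rust
// Mirrors the Settings struct above, minus the Serde derives.
pub struct Settings {
    pub port: u16,
}

impl Default for Settings {
    fn default() -> Self {
        Settings { port: 3001 }
    }
}

// Hypothetical helper, for illustration only: if no settings were
// found, Option::unwrap_or_default() calls Settings::default().
fn load_or_default(found: Option<Settings>) -> Settings {
    found.unwrap_or_default()
}

fn main() {
    let settings = load_or_default(None);
    assert_eq!(settings.port, 3001); // the internal default holds

    let settings = load_or_default(Some(Settings { port: 8080 }));
    assert_eq!(settings.port, 8080); // an explicit value wins
}
```

`unwrap_or_default` only compiles because `Settings` implements `Default`, which is the same guarantee `#[serde(default)]` leans on when the config file is absent.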
2023-03-24 14:51:19 +00:00
"futures-sink",
This is straightforward and familiar territory:

``` rust
fn app() -> Router {
    Router::new()
        .route("/", get(anon_greet))
        .route("/:name", get(greet))
        .route("/health_check", get(health_check))
}
```

We then define a function to *run* the core server:

``` rust
pub async fn run() {
    let addr = SocketAddr::from(([127, 0, 0, 1], 3000));
    tracing::info!("listening on {}", addr);
    axum::Server::bind(&addr)
        .serve(app().into_make_service())
        .await
        .unwrap()
}
```

And finally, in a file named `src/main.rs`, we instantiate the server:

``` rust
use ztp::run;

#[tokio::main]
async fn main() {
    run().await
}
```

To make this "work," we need to define what `ztp` means, and make a distinction between the library and the CLI program. In the project root's `Cargo.toml` file, the first three sections are needed to define these relationships:

``` toml
[package]
name = "ztp"
version = "0.1.0"
edition = "2021"

[lib]
path = "src/lib.rs"

[[bin]]
path = "src/main.rs"
name = "ztp"
```

It is the `name` field of `[package]` that defines how the `use` statement in `main.rs` will find the library. The `[[bin]]` clause defines the name of the binary when it is generated.

<aside>The double brackets around the `[[bin]]` clause tell the TOML parser that this is an array of tables: there can be more than one binary. There can be only one library crate per package, but it is possible for a Rust workspace to contain more than one package.</aside>

This project should now be runnable. In one window, type:

``` sh
$ cargo run
```

And in another, type the following and see the replies:

``` sh
$ curl http://localhost:3000/
Hello, World!
$ curl http://localhost:3000/Jim
He's dead, Jim!
$ curl -v http://localhost:3000/health_check
> GET /health_check HTTP/1.1
> Host: localhost:3000
> User-Agent: curl/7.81.0
> Accept: */*
< HTTP/1.1 200 OK
< content-length: 0
< date: Tue, 21 Mar 2023 00:16:43 GMT
```

In the last command, the *verbose* flag shows us what we sent to the server, and what came back.
We expected a "200 OK" status and a zero-length body, and that's what we got.

In order to unit-test a web server, we must spawn a copy of it so we can exercise its endpoints. We'll use Tokio's `spawn` function to create a new server, use hyper to request data from the server, and finally Rust's own native test asserts to check that we got what we expected:

``` rust
mod tests {
    use super::*;
    use axum::{
        body::Body,
        http::{Request, StatusCode},
    };
    use std::net::{SocketAddr, TcpListener};

    #[tokio::test]
    async fn the_real_deal() {
        let listener = TcpListener::bind("127.0.0.1:0".parse::<SocketAddr>().unwrap()).unwrap();
        let addr = listener.local_addr().unwrap();

        tokio::spawn(async move {
            axum::Server::from_tcp(listener)
                .unwrap()
                .serve(app().into_make_service())
                .await
                .unwrap();
        });

        let response = hyper::Client::new()
            .request(
                Request::builder()
                    .uri(format!("http://{}/", addr))
                    .body(Body::empty())
                    .unwrap(),
            )
            .await
            .unwrap();

        let body = hyper::body::to_bytes(response.into_body()).await.unwrap();
        assert_eq!(&body[..], b"Hello World!\n");
    }
}
```

One interesting trick to observe in this test is the port number specified in the `TcpListener` call: it's zero. When the port is zero, `TcpListener` will ask the kernel for the first free port. Normally you'd want to know exactly which port to call the server on, but in this case both ends of the conversation learn the port at runtime, and using port zero ensures the test never hard-codes a port that is inconveniently already in use by someone else.
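The port-zero trick works with nothing but the standard library; a minimal sketch:

```rust
use std::net::TcpListener;

fn main() {
    // Binding to port 0 asks the kernel for any free port;
    // local_addr() reveals which one was actually assigned.
    let listener = TcpListener::bind("127.0.0.1:0").unwrap();
    let port = listener.local_addr().unwrap().port();
    assert_ne!(port, 0);
    println!("kernel assigned port {}", port);

    // While the first listener is still alive, a second bind to
    // port 0 yields a different free port, so two test servers
    // can run side by side without colliding.
    let listener2 = TcpListener::bind("127.0.0.1:0").unwrap();
    let port2 = listener2.local_addr().unwrap().port();
    assert_ne!(port, port2);
}
```

This is exactly what the test above relies on: bind first, read the real port back from `local_addr()`, then hand that address to the client.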
2023-03-21 00:31:39 +00:00
"futures-task",
"pin-project-lite",
"pin-utils",
]
[[package]]
name = "generic-array"
version = "0.14.6"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "bff49e947297f3312447abdca79f45f4738097cc82b06e72054d2223f601f1b9"
dependencies = [
"typenum",
"version_check",
]
[[package]]
name = "getrandom"
version = "0.2.8"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "c05aeb6a22b8f62540c194aac980f2115af067bfe15a0734d7277a768d396b31"
dependencies = [
"cfg-if",
"libc",
"wasi 0.11.0+wasi-snapshot-preview1",
Implemented the forms reader, config, and database migrations. This chapter introduces the Actix "Extractors" for retrieving form data. I've added tests to the `./tests` folder to attempt to interact with those extractors; as of this commit, a89cbe5b, they fail because the example code isn't there. What is there is a variant of the "Hello, World!" code from the previous exercises (section 3.5), which uses the Actix extractor: ``` rust // Actix, *not* Axum. Does not work with the current framework. fn index(form: web::Form<FormData>) -> String { format!('Welcome {}!', form.username) } ``` Translated and polished into Axum, it translates to: ``` rust pub async fn index(payload: Option<Form<FormData>>) -> impl IntoResponse { let username = payload.map_or("World".to_string(), move |index| -> String { String::from(&(index.username)) }); (StatusCode::OK, format!("Hello, {}!\n", &username)) } ``` The Axum version is a little smarter, providing a default "World!" if you don't specify a name. That's what `.map_or` does, although the `or` part actually comes first in the function. So the result is: ``` sh $ curl http://localhost:3000/ Hello, World! $ curl 'http://localhost:3000/?username=Spock' Hello, Spock! ``` Which is more or less the version we want. **Section 3.7.3** then goes into some detail on the implementation of a Trait. A Trait in Rust is like an Interface in other languages; it describes a collection of functions for manipulating the values found in a defined Type. **Types**: A Type is just a description of the value: `u16` is a sixteen-bit unsigned integer; `char` is a Unicode character, and it's size is always 32bits, but a `String` is a bunch of things: it's an array of characters (which are not always `char`!), a length for that string and a capacity. If the String is manipulated to exceed the capacity, the array is re-allocated to a new capacity and the old array copied into it. 
A `Vec<String>` is an array of Strings; as a Type, it is considered to have a single value: whatever is in it at the moment you use it. **Trait**: A Trait defines a collection of one or more functions that can operate on a value. The truly nifty thing about Traits is that they can be implemented after the fact. By importing a Trait and an implementation of that trait specific to a type into a module containing that type, you can extend the behavior of a type in a deterministic way without having to modify or inherit the code, as you would in an object-oriented language. Axum has a valuable trait, `FromRequest`. For any structure you can imagine passing from the client to the server, you can implement `FromRequest` for that object and any content in the body of the message will be transformed into that structure. We've seen a trait before: `IntoResponse`, written as `impl IntoResponse`, and is the output (the return type) of many of the functions that produce return values for our application server. In this case the return type instructs Rust to look in the current lexical scope and, for the value returned by that function, determine if an `IntoResponse` trait has been defined for it. If it has, the value will be returned because Axum has now been assured that there exists a function to convert that value into something streamable and usable as an HTTP response. Fortunately for us, Axum has already implemented `FromRequest` for all the native data types, as well as some structures and arrays. Even better, it has implemented `FromRequest` for the Serde serialization/deserialization library. So in this example: ``` rust pub struct FormData { username: String, } pub async fn index(payload: Option<Form<FormData>>) -> impl IntoResponse { ... ``` A `Form` (something using the `application/x-www-form-urlencoded` protocol) of `FormData` will automatically be converted into a `payload` object of `{ username: "Spock" )`, and in this case wrapped in a `Some()` handler. 
(Or `None`, if there was no form included.) <aside>So far, there's not too much bloat in this product; with all the debugging symbols, it's 60MB or so, but stripped to the bone it's only 3.1MB, tolerable for modern deployments.</aside> First, though, we must adjust our `valid_subscription` test: ``` rust let body = "name=le%20guin&email=ursula_le_guin%40gmail.com"; let response = hyper::Client::new() .request( Request::builder() .method("POST") .header("content-type", "application/x-www-form-urlencoded") .uri(format!("http://{}/subscriptions", addr)) .body(Body::from(body)) .unwrap(), ) .await .expect("Failed to execute request."); ``` Two updates from the book: first, we're sending it via POST instead of GET. This is the correct way to do things; a GET should never (and I mean *never*) cause a change of state on the back-end. To send something new that the server will process and store, you use a POST. (To update something, or to send something to a known *and unique* URI, PUT is better.) Secondly, since we're using a generic form-data object, we need to set the content-type on the client so that the server is informed of how to unpack this payload. The '%20' and '%40' markers in the `body` are the space and the `@` respectively. I completely ignored the advice in the book and went instead with [Dbmate](https://github.com/amacneil/dbmate); Dbmate is a bit cranky; your SQL must be very much nestled against the 'up' and 'down' markers in the migration file, and it seems to be quite opinionated about everything being lowercase. 
That said, it was trivial to create a database with it: ``` sh $ dbmate new create_subscriptions_table ``` This will create the folder `db/migrations/20230322174957_create_subscriptions_table.sql`, (The timestamp will be different, obviously), and in this file you put the following, as specified in the book: ``` sql -- migrate:up create table subscriptions ( id uuid not null, primary key (id), email text not null unique, name text not null, subscribed_at timestamptz not null ); -- migrate:down drop table subscriptions; ``` To use Dbmate, you have to specify how to connect. I'm using Postgres, so let's start with creating a database and a user for it: ``` sh $ sudo -u postgres psql [sudo] possword for user: ................... postgres=# create database newsletter; CREATE DATABASE postgres=# create user newletter with encrypted password 'redacted'; CREATE USER postgres=# grant all privileges on database newsletter to newsletter; GRANT postgres=# exit ``` In your project root, create a `.env` file to specify your connection: ``` sh DATABASE_URL="postgres://newsletter:redacted@127.0.0.1:5432/newsletter?sslmode=disable" ``` The `sslmode` flag there is necessary for localhost connections, as Dbmate assumes an encrypted connection by default, but we're isolating to a local connection that is, usually, safe. With the new entry in your `.env` file, you can now run a migration: ``` sh $ dbmate up Writing: ./db/schema.sql ``` Running `dbmate up` will automatically create the database for you if it hasn't already; `dbmate migrate` also performs migrations, but it will not create the database. 
Now you can re-connect to Postgres as the newsletter user and see what you've got: ``` sh $ psql --user newsletter -h localhost --password Password: psql (14.7 (Ubuntu 14.7-0ubuntu0.22.04.1), server 11.7 (Ubuntu 11.7-0ubuntu0.19.10.1)) newsletter=> \d List of relations Schema | Name | Type | Owner --------+-------------------+-------+------------ public | schema_migrations | table | newsletter public | subscriptions | table | newsletter (2 rows) newsletter=> \d subscriptions Table "public.subscriptions" Column | Type | Collation | Nullable | Default ---------------+--------------------------+-----------+----------+--------- id | uuid | | not null | email | text | | not null | name | text | | not null | subscribed_at | timestamp with time zone | | not null | Indexes: "subscriptions_pkey" PRIMARY KEY, btree (id) "subscriptions_email_key" UNIQUE CONSTRAINT, btree (email) ``` Note that Dbmate has allocated a table to itself, `schema_migrations`, for tracking what it's done to your system and when. Try not to conflict with it, okay? Every complex app has a configuration, and there are plenty of different ways the configuration can be specified: environment variables, internal defaults, and configuration files, the last of which comes in so many different flavors. Rust has a well-known [config](https://docs.rs/config/latest/config/index.html) crate that supports all the most common formats: YAML, JSON, TOML; you can even add your own by writing something that implements the `config::Format` trait. Add it to Cargo.toml: ``` sh $ cargo add config ``` For now, we're just going to create a new file, called `configuration.rs`, and put our configuration details in there. Right now we have a single configuration detail: the port. I'm going to go above and beyond Luca here and configure some internal defaults for my code. It will have expectations. 
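For reference, the configuration file itself can be tiny. A hypothetical `ztd.config.toml` (the config crate's `File::with_name("./ztd.config")` tries each supported extension in turn, so YAML or JSON spellings would work equally well):

``` toml
# ztd.config.toml -- hypothetical example; any field omitted here
# falls back to the internal defaults described below.
port = 3005
```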
First, you have to tell Serde that there will be default values, which takes both a `Deserialize` derive and the `#[serde(default)]` attribute: ``` rust use config::Config; use serde::Deserialize; #[derive(Deserialize)] #[serde(default)] pub struct Settings { pub port: u16, } ``` Then, you have to set those default values. Fortunately, Rust provides a "set default values" trait named, sensibly enough, Default: ``` rust impl Default for Settings { fn default() -> Self { Settings { port: 3001 } } } ``` Again, exceeding the book's parameters, I'm going to say that if the file is missing, the default parameters should hold: ``` rust pub(crate) fn get_configuration() -> Result<Settings, config::ConfigError> { Config::builder() .add_source(config::File::with_name("./ztd.config").required(false)) .build()? .try_deserialize() } ``` And since this is the first time I'm doing this, I'm going to write a test to assert that my understanding of how this all works is correct: ``` rust #[cfg(test)] mod tests { use super::*; #[test] fn test_for_defaults() { let maybe_config = get_configuration(); assert!(maybe_config.is_ok()); let config = maybe_config.unwrap(); assert_eq!(config.port, 3001); } } ```
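As a sanity check on how `Default` behaves on its own, here's a std-only sketch, independent of the config crate (the `host` field is an invented extra, just to show struct-update syntax filling in unspecified fields):

``` rust
#[derive(Debug, PartialEq)]
struct Settings {
    port: u16,
    host: String, // invented extra field, purely for illustration
}

impl Default for Settings {
    fn default() -> Self {
        Settings {
            port: 3001,
            host: "127.0.0.1".to_string(),
        }
    }
}

fn main() {
    // Override one field; `..Default::default()` fills in the rest.
    let s = Settings { port: 8080, ..Default::default() };
    println!("{:?}", s);
}
```

This is the same fallback behavior `#[serde(default)]` leans on when a field is absent from the configuration file.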
2023-03-24 14:51:19 +00:00
]
This is straightforward and familiar territory: ``` fn app() -> Router { Router::new() .route("/", get(anon_greet)) .route("/:name", get(greet)) .route("/health_check", get(health_check)) } ``` We then define a function to *run* the core server: ``` pub async fn run() { let addr = SocketAddr::from(([127, 0, 0, 1], 3000)); tracing::info!("listening on {}", addr); axum::Server::bind(&addr) .serve(app().into_make_service()) .await .unwrap() } ``` And finally, in a file named `src/main.rs`, we instantiate the server; the `#[tokio::main]` attribute supplies the async runtime an `async fn main` needs: ``` use ztp::run; #[tokio::main] async fn main() { run().await } ``` To make this "work," we need to define what `ztp` means, and make a distinction between the library and the CLI program. In the project root's `Cargo.toml` file, the first three sections are needed to define these relationships: ``` [package] name = "ztp" version = "0.1.0" edition = "2021" [lib] path = "src/lib.rs" [[bin]] path = "src/main.rs" name = "ztp" ``` It is the `[package.name]` field that defines how the `use` statement in `main.rs` will find the library. The `[[bin]]` clause defines the name of the binary when it is generated. <aside>The double brackets around the `[[bin]]` clause are there to tell the TOML parser that there can be more than one binary. There can be only one library crate per package, but a package may produce several binary crates alongside it.</aside> This project should now be runnable. In one window, type: ``` $ cargo run ``` And in another, type and see the replies: ``` $ curl http://localhost:3000/ Hello World! $ curl http://localhost:3000/Jim He's dead, Jim! $ curl -v http://localhost:3000/health_check > GET /health_check HTTP/1.1 > Host: localhost:3000 > User-Agent: curl/7.81.0 > Accept: */* < HTTP/1.1 200 OK < content-length: 0 < date: Tue, 21 Mar 2023 00:16:43 GMT ``` In the last command, the *verbose* flag shows us what we sent to the server, and what came back. 
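Conceptually, a router is just a map from paths to handlers. Here is a toy, std-only sketch of those three routes (my own illustration — nothing like Axum's real implementation, which also matches on HTTP method and does proper parameter extraction):

``` rust
// Toy dispatcher mirroring the three routes above; illustration only.
fn dispatch(path: &str) -> (u16, String) {
    match path {
        "/" => (200, "Hello World!\n".to_string()),
        "/health_check" => (200, String::new()),
        other => {
            // Anything else is treated like the /:name route.
            let name = other.trim_start_matches('/');
            (200, format!("He's dead, {}!\n", name))
        }
    }
}

fn main() {
    let (status, body) = dispatch("/Jim");
    println!("{} {}", status, body);
}
```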
We expected a "200 OK" flag and a zero-length body, and that's what we got. In order to unit-test a web server, we must spawn a copy of it in order to exercise its functions. We'll use Tokio's `spawn` function to create a new server, use hyper to request data from the server, and finally Rust's own native test asserts to check that we got what we expected. ``` mod tests { use super::*; use axum::{ body::Body, http::{Request, StatusCode}, }; use std::net::{SocketAddr, TcpListener}; #[tokio::test] async fn the_real_deal() { let listener = TcpListener::bind("127.0.0.1:0".parse::<SocketAddr>() .unwrap()).unwrap(); let addr = listener.local_addr().unwrap(); tokio::spawn(async move { axum::Server::from_tcp(listener) .unwrap()serve(app().into_make_service()).await.unwrap(); }); let response = hyper::Client::new() .request( Request::builder().uri(format!("http://{}/", addr)) .body(Body::empty()).unwrap(), ) .await .unwrap(); let body = hyper::body::to_bytes(response.into_body()).await.unwrap(); assert_eq!(&body[..], b"Hello World!\n"); } } ``` One interesting trick to observe in this testing is the port number specified in the `TcpListener` call. It's zero. When the port is zero, the `TcpListener` will request from the kernel the first-free-port. Normally, you'd want to know exactly what port to call the server on, but in this case both ends of the communication are aware of the port to use and we want to ensure that port isn't hard-coded and inconveniently already in-use by someone else.
2023-03-21 00:31:39 +00:00
[[package]]
name = "h2"
version = "0.3.16"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "5be7b54589b581f624f566bf5d8eb2bab1db736c51528720b6bd36b96b55924d"
dependencies = [
"bytes",
"fnv",
"futures-core",
"futures-sink",
"futures-util",
"http",
"indexmap",
"slab",
"tokio",
"tokio-util",
"tracing",
]
[[package]]
name = "hashbrown"
version = "0.12.3"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "8a9ee70c43aaf417c914396645a0fa852624801b24ebb7ae78fe8272889ac888"
Implemented the forms reader, config, and database migrations. This chapter introduces the Actix "Extractors" for retrieving form data. I've added tests to the `./tests` folder to attempt to interact with those extractors; as of this commit, a89cbe5b, they fail because the example code isn't there. What is there is a variant of the "Hello, World!" code from the previous exercises (section 3.5), which uses the Actix extractor: ``` rust // Actix, *not* Axum. Does not work with the current framework. fn index(form: web::Form<FormData>) -> String { format!("Welcome {}!", form.username) } ``` Translated and polished into Axum, it becomes: ``` rust pub async fn index(payload: Option<Form<FormData>>) -> impl IntoResponse { let username = payload.map_or("World".to_string(), move |index| -> String { String::from(&(index.username)) }); (StatusCode::OK, format!("Hello, {}!\n", &username)) } ``` The Axum version is a little smarter, providing a default "World" if you don't specify a name. That's what `.map_or` does, although the `or` part actually comes first in the argument list. So the result is: ``` sh $ curl http://localhost:3000/ Hello, World! $ curl 'http://localhost:3000/?username=Spock' Hello, Spock! ``` Which is more or less the version we want. **Section 3.7.3** then goes into some detail on the implementation of a Trait. A Trait in Rust is like an Interface in other languages; it describes a collection of functions for manipulating the values found in a defined Type. **Types**: A Type is just a description of the value: `u16` is a sixteen-bit unsigned integer; `char` is a Unicode scalar value, and its size is always 32 bits; but a `String` is a bunch of things: a heap-allocated buffer of UTF-8 bytes (not an array of `char`s!), a length for that string, and a capacity. If the String is manipulated to exceed the capacity, the buffer is re-allocated with a new capacity and the old contents copied into it. 
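That argument order — fallback first, mapping closure second — trips people up, so here's the same defaulting logic as a std-only sketch with no web types at all (`greet_name` is my own stand-in for the handler):

``` rust
// Stand-in for the handler's defaulting logic; no Axum types involved.
fn greet_name(payload: Option<&str>) -> String {
    // map_or takes the fallback value first, the mapping closure second.
    let username = payload.map_or("World".to_string(), |name| name.to_string());
    format!("Hello, {}!\n", username)
}

fn main() {
    print!("{}", greet_name(None));
    print!("{}", greet_name(Some("Spock")));
}
```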
A `Vec<String>` is a growable array of Strings; as a Type, it is considered to have a single value: whatever is in it at the moment you use it. **Trait**: A Trait defines a collection of one or more functions that can operate on a value. The truly nifty thing about Traits is that they can be implemented after the fact. By importing a Trait and an implementation of that trait specific to a type into a module containing that type, you can extend the behavior of a type in a deterministic way without having to modify or inherit the code, as you would in an object-oriented language. Axum has a valuable trait, `FromRequest`. For any structure you can imagine passing from the client to the server, you can implement `FromRequest` for that object and any content in the body of the message will be transformed into that structure. We've seen a trait before: `IntoResponse`, written as `impl IntoResponse`, which is the return type of many of the functions that produce responses for our application server. That return type instructs Rust to look in the current lexical scope and, for the value returned by that function, determine whether an `IntoResponse` implementation has been defined for it. If it has, the value can be returned, because Axum is now assured that there exists a function to convert that value into something streamable and usable as an HTTP response. Fortunately for us, Axum has already implemented `FromRequest` for all the native data types, as well as some structures and arrays. Even better, it works with the Serde serialization/deserialization library, so any structure deriving Serde's `Deserialize` can be extracted. So in this example: ``` rust #[derive(serde::Deserialize)] pub struct FormData { username: String, } pub async fn index(payload: Option<Form<FormData>>) -> impl IntoResponse { ... ``` A `Form` (something using the `application/x-www-form-urlencoded` protocol) of `FormData` will automatically be converted into a `payload` object of `{ username: "Spock" }`, in this case wrapped in `Some()`. 
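The "implemented after the fact" property is easy to see with a plain extension trait, no Axum required; `Describe` here is a made-up trait for illustration, implemented for `str`, a type we don't own:

``` rust
// A made-up trait, implemented for a type we don't own (str),
// without touching str's source -- the same mechanism behind FromRequest.
trait Describe {
    fn describe(&self) -> String;
}

impl Describe for str {
    fn describe(&self) -> String {
        format!("{} chars: {:?}", self.chars().count(), self)
    }
}

fn main() {
    // The method is only callable where the trait is in scope.
    println!("{}", "Spock".describe());
}
```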
dependencies = [
"ahash",
]
[[package]]
name = "hashlink"
version = "0.8.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "69fe1fcf8b4278d860ad0548329f892a3631fb63f82574df68275f34cdbe0ffa"
dependencies = [
"hashbrown",
]
[[package]]
name = "heck"
version = "0.4.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "95505c38b4572b2d910cecb0281560f54b440a19336cbbcb27bf6ce6adc6f5a8"
dependencies = [
"unicode-segmentation",
]
[[package]]
name = "hermit-abi"
version = "0.2.6"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "ee512640fe35acbfb4bb779db6f0d80704c2cacfa2e39b601ef3e3f47d1ae4c7"
dependencies = [
"libc",
]
Implemented the forms reader, config, and database migrations. This chapter introduces the Actix "Extractors" for retrieving form data. I've added tests to the `./tests` folder to attempt to interact with those extractors; as of this commit, a89cbe5b, they fail because the example code isn't there. What is there is a variant of the "Hello, World!" code from the previous exercises (section 3.5), which uses the Actix extractor: ``` rust // Actix, *not* Axum. Does not work with the current framework. fn index(form: web::Form<FormData>) -> String { format!('Welcome {}!', form.username) } ``` Translated and polished into Axum, it translates to: ``` rust pub async fn index(payload: Option<Form<FormData>>) -> impl IntoResponse { let username = payload.map_or("World".to_string(), move |index| -> String { String::from(&(index.username)) }); (StatusCode::OK, format!("Hello, {}!\n", &username)) } ``` The Axum version is a little smarter, providing a default "World!" if you don't specify a name. That's what `.map_or` does, although the `or` part actually comes first in the function. So the result is: ``` sh $ curl http://localhost:3000/ Hello, World! $ curl 'http://localhost:3000/?username=Spock' Hello, Spock! ``` Which is more or less the version we want. **Section 3.7.3** then goes into some detail on the implementation of a Trait. A Trait in Rust is like an Interface in other languages; it describes a collection of functions for manipulating the values found in a defined Type. **Types**: A Type is just a description of the value: `u16` is a sixteen-bit unsigned integer; `char` is a Unicode character, and it's size is always 32bits, but a `String` is a bunch of things: it's an array of characters (which are not always `char`!), a length for that string and a capacity. If the String is manipulated to exceed the capacity, the array is re-allocated to a new capacity and the old array copied into it. 
A `Vec<String>` is an array of Strings; as a Type, it is considered to have a single value: whatever is in it at the moment you use it. **Trait**: A Trait defines a collection of one or more functions that can operate on a value. The truly nifty thing about Traits is that they can be implemented after the fact. By importing a Trait and an implementation of that trait specific to a type into a module containing that type, you can extend the behavior of a type in a deterministic way without having to modify or inherit the code, as you would in an object-oriented language. Axum has a valuable trait, `FromRequest`. For any structure you can imagine passing from the client to the server, you can implement `FromRequest` for that object and any content in the body of the message will be transformed into that structure. We've seen a trait before: `IntoResponse`, written as `impl IntoResponse`, and is the output (the return type) of many of the functions that produce return values for our application server. In this case the return type instructs Rust to look in the current lexical scope and, for the value returned by that function, determine if an `IntoResponse` trait has been defined for it. If it has, the value will be returned because Axum has now been assured that there exists a function to convert that value into something streamable and usable as an HTTP response. Fortunately for us, Axum has already implemented `FromRequest` for all the native data types, as well as some structures and arrays. Even better, it has implemented `FromRequest` for the Serde serialization/deserialization library. So in this example: ``` rust pub struct FormData { username: String, } pub async fn index(payload: Option<Form<FormData>>) -> impl IntoResponse { ... ``` A `Form` (something using the `application/x-www-form-urlencoded` protocol) of `FormData` will automatically be converted into a `payload` object of `{ username: "Spock" )`, and in this case wrapped in a `Some()` handler. 
(Or `None`, if there was no form included.) <aside>So far, there's not too much bloat in this product; with all the debugging symbols, it's 60MB or so, but stripped to the bone it's only 3.1MB, tolerable for modern deployments.</aside> First, though, we must adjust our `valid_subscription` test: ``` rust let body = "name=le%20guin&email=ursula_le_guin%40gmail.com"; let response = hyper::Client::new() .request( Request::builder() .method("POST") .header("content-type", "application/x-www-form-urlencoded") .uri(format!("http://{}/subscriptions", addr)) .body(Body::from(body)) .unwrap(), ) .await .expect("Failed to execute request."); ``` Two updates from the book: first, we're sending the request via POST instead of GET. This is the correct way to do things; a GET should never (and I mean *never*) cause a change of state on the back-end. To send something new that the server will process and store, you use a POST. (To update something, or to send something to a known *and unique* URI, PUT is better.) Secondly, since we're using a generic form-data object, we need to set the content-type on the client so that the server knows how to unpack this payload. The '%20' and '%40' markers in the `body` are the URL-encoded space and `@` sign, respectively. I completely ignored the advice in the book and went instead with [Dbmate](https://github.com/amacneil/dbmate). Dbmate is a bit cranky: your SQL must be nestled right up against the 'up' and 'down' markers in the migration file, and it seems to be quite opinionated about everything being lowercase. 
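As a side note, the percent-decoding that turns `le%20guin` back into `le guin` is simple enough to sketch in plain Rust. This is illustrative only; real code should lean on a vetted crate such as `percent-encoding` (and handle malformed input instead of unwrapping):

```rust
// Minimal percent-decoder sketch: "%XY" becomes the byte 0xXY.
fn percent_decode(s: &str) -> String {
    let bytes = s.as_bytes();
    let mut out = Vec::new();
    let mut i = 0;
    while i < bytes.len() {
        if bytes[i] == b'%' && i + 2 < bytes.len() {
            // Take the two hex digits after '%' and decode them.
            let hex = std::str::from_utf8(&bytes[i + 1..i + 3]).unwrap();
            out.push(u8::from_str_radix(hex, 16).unwrap());
            i += 3;
        } else {
            out.push(bytes[i]);
            i += 1;
        }
    }
    String::from_utf8(out).unwrap()
}

fn main() {
    assert_eq!(percent_decode("le%20guin"), "le guin");
    assert_eq!(percent_decode("ursula_le_guin%40gmail.com"), "ursula_le_guin@gmail.com");
}
```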
That said, it was trivial to create a database with it: ``` sh $ dbmate new create_subscriptions_table ``` This will create the file `db/migrations/20230322174957_create_subscriptions_table.sql` (the timestamp will be different, obviously), and in this file you put the following, as specified in the book: ``` sql -- migrate:up create table subscriptions ( id uuid not null, primary key (id), email text not null unique, name text not null, subscribed_at timestamptz not null ); -- migrate:down drop table subscriptions; ``` To use Dbmate, you have to specify how to connect. I'm using Postgres, so let's start with creating a database and a user for it: ``` sh $ sudo -u postgres psql [sudo] password for user: ................... postgres=# create database newsletter; CREATE DATABASE postgres=# create user newsletter with encrypted password 'redacted'; CREATE USER postgres=# grant all privileges on database newsletter to newsletter; GRANT postgres=# exit ``` In your project root, create a `.env` file to specify your connection: ``` sh DATABASE_URL="postgres://newsletter:redacted@127.0.0.1:5432/newsletter?sslmode=disable" ``` The `sslmode` flag there is necessary for localhost connections, as Dbmate assumes an encrypted connection by default, but we're isolating to a local connection that is, usually, safe. With the new entry in your `.env` file, you can now run a migration: ``` sh $ dbmate up Writing: ./db/schema.sql ``` Running `dbmate up` will automatically create the database for you if it hasn't already; `dbmate migrate` also performs migrations, but it will not create the database. 
Now you can re-connect to Postgres as the newsletter user and see what you've got: ``` sh $ psql --user newsletter -h localhost --password Password: psql (14.7 (Ubuntu 14.7-0ubuntu0.22.04.1), server 11.7 (Ubuntu 11.7-0ubuntu0.19.10.1)) newsletter=> \d List of relations Schema | Name | Type | Owner --------+-------------------+-------+------------ public | schema_migrations | table | newsletter public | subscriptions | table | newsletter (2 rows) newsletter=> \d subscriptions Table "public.subscriptions" Column | Type | Collation | Nullable | Default ---------------+--------------------------+-----------+----------+--------- id | uuid | | not null | email | text | | not null | name | text | | not null | subscribed_at | timestamp with time zone | | not null | Indexes: "subscriptions_pkey" PRIMARY KEY, btree (id) "subscriptions_email_key" UNIQUE CONSTRAINT, btree (email) ``` Note that Dbmate has allocated a table to itself, `schema_migrations`, for tracking what it's done to your system and when. Try not to conflict with it, okay? Every complex app has a configuration, and there are plenty of different ways the configuration can be specified: environment variables, internal defaults, and configuration files, the last of which comes in so many different flavors. Rust has a well-known [config](https://docs.rs/config/latest/config/index.html) crate that supports all the most common configurations: YAML, JSON, TOML; you can even add your own by writing something that implements the `config::Format` trait. Add it to Cargo.toml: ``` sh $ cargo add config ``` For the meantime, we're just going to create a new file, called `configuration.rs`, and put our configuration details in there. Right now we have a single configuration detail: the port. I'm going to go above and beyond Luca here and configure some internal defaults for my code. It will have expectations. 
First, you have to tell Serde that the struct is deserializable and that missing fields should fall back to default values: ``` rust use config::Config; #[derive(serde::Deserialize)] #[serde(default)] pub struct Settings { pub port: u16, } ``` Then, you have to set those default values. Fortunately, Rust provides a "set default values" trait named, sensibly enough, Default: ``` rust impl Default for Settings { fn default() -> Self { Settings { port: 3001 } } } ``` Again, exceeding the book's parameters, I'm going to say that if the file is missing, the default parameters should hold: ``` rust pub(crate) fn get_configuration() -> Result<Settings, config::ConfigError> { Config::builder() .add_source(config::File::with_name("./ztd.config").required(false)) .build()? .try_deserialize() } ``` And since this is the first time I'm doing this, I'm going to write a test to assert that my understanding of how this all works is correct: ``` rust #[cfg(test)] mod tests { use super::*; #[test] fn test_for_defaults() { let maybe_config = get_configuration(); assert!(maybe_config.is_ok()); let config = maybe_config.unwrap(); assert_eq!(config.port, 3001); } } ```
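One nicety a Default impl buys us, beyond the test above: if loading ever fails outright, `unwrap_or_default()` can quietly supply the defaults. A self-contained sketch, where `get_configuration` is a hypothetical stand-in that always fails:

```rust
// Sketch of falling back to Default on a failed configuration load.
#[derive(Debug, PartialEq)]
struct Settings {
    port: u16,
}

impl Default for Settings {
    fn default() -> Self {
        Settings { port: 3001 }
    }
}

// Hypothetical stand-in: simulates a load that always fails.
fn get_configuration() -> Result<Settings, String> {
    Err("no config file found".to_string())
}

fn main() {
    // The error is swallowed and the Default values take over.
    let config = get_configuration().unwrap_or_default();
    assert_eq!(config.port, 3001);
}
```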
2023-03-24 14:51:19 +00:00
[[package]]
name = "hermit-abi"
version = "0.3.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "fed44880c466736ef9a5c5b5facefb5ed0785676d0c02d612db14e54f0d84286"
[[package]]
name = "hex"
version = "0.4.3"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "7f24254aa9a54b5c858eaee2f5bccdb46aaf0e486a595ed5fd8f86ba55232a70"
[[package]]
name = "hkdf"
version = "0.12.3"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "791a029f6b9fc27657f6f188ec6e5e43f6911f6f878e0dc5501396e09809d437"
dependencies = [
"hmac",
]
[[package]]
name = "hmac"
version = "0.12.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "6c49c37c09c17a53d937dfbb742eb3a961d65a994e6bcdcf37e7399d0cc8ab5e"
dependencies = [
"digest",
]
This is straightforward and familiar territory: ``` fn app() -> Router { Router::new() .route("/", get(anon_greet)) .route("/:name", get(greet)) .route("/health_check", get(health_check)) } ``` We then define a function to *run* the core server: ``` pub async fn run() { let addr = SocketAddr::from(([127, 0, 0, 1], 3000)); tracing::info!("listening on {}", addr); axum::Server::bind(&addr) .serve(app().into_make_service()) .await .unwrap() } ``` And finally, in a file named `src/main.rs`, we instantiate the server: ``` use ztp::run; #[tokio::main] async fn main() { run().await } ``` To make this "work," we need to define what `ztp` means, and make a distinction between the library and the CLI program. In the project root's `Cargo.toml` file, the first three sections are needed to define these relationships: ``` [package] name = "ztp" version = "0.1.0" edition = "2021" [lib] path = "src/lib.rs" [[bin]] path = "src/main.rs" name = "ztp" ``` It is the `[package.name]` field that defines how the `use` statement in `main.rs` will find the library. The `[[bin]]` clause defines the name of the binary when it is generated. <aside>The double brackets around the `[[bin]]` clause are there to tell the TOML parser that there can be more than one binary. There can be only one library crate per package, but a Rust project (a workspace) can contain more than one package.</aside> This project should now be runnable. In one window, type: ``` $ cargo run ``` And in another, type and see the replies: ``` $ curl http://localhost:3000/ Hello, World! $ curl http://localhost:3000/Jim He's dead, Jim! $ curl -v http://localhost:3000/health_check > GET /health_check HTTP/1.1 > Host: localhost:3000 > User-Agent: curl/7.81.0 > Accept: */* < HTTP/1.1 200 OK < content-length: 0 < date: Tue, 21 Mar 2023 00:16:43 GMT ``` In the last command, the *verbose* flag shows us what we sent to the server, and what came back. 
We expected a "200 OK" status and a zero-length body, and that's what we got. To unit-test a web server, we must spawn a copy of it and exercise its functions. We'll use Tokio's `spawn` function to create a new server, use hyper to request data from the server, and finally Rust's own native test asserts to check that we got what we expected. ``` mod tests { use super::*; use axum::{ body::Body, http::{Request, StatusCode}, }; use std::net::{SocketAddr, TcpListener}; #[tokio::test] async fn the_real_deal() { let listener = TcpListener::bind("127.0.0.1:0".parse::<SocketAddr>() .unwrap()).unwrap(); let addr = listener.local_addr().unwrap(); tokio::spawn(async move { axum::Server::from_tcp(listener) .unwrap().serve(app().into_make_service()).await.unwrap(); }); let response = hyper::Client::new() .request( Request::builder().uri(format!("http://{}/", addr)) .body(Body::empty()).unwrap(), ) .await .unwrap(); let body = hyper::body::to_bytes(response.into_body()).await.unwrap(); assert_eq!(&body[..], b"Hello World!\n"); } } ``` One interesting trick to observe in this testing is the port number specified in the `TcpListener` call: it's zero. When the port is zero, the `TcpListener` will ask the kernel for the first free port. Normally, you'd want to know exactly what port to call the server on, but in this case both ends of the communication are aware of the port to use, and we want to ensure that the port isn't hard-coded and inconveniently already in use by someone else.
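The port-zero trick works outside of Axum, too; a minimal pure-std sketch:

```rust
// Binding to port 0 asks the kernel for any free port;
// local_addr() then tells us which one it actually picked.
use std::net::TcpListener;

fn main() {
    let listener = TcpListener::bind("127.0.0.1:0").unwrap();
    let port = listener.local_addr().unwrap().port();
    assert_ne!(port, 0); // the kernel substituted a real port
    println!("kernel assigned port {}", port);
}
```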
2023-03-21 00:31:39 +00:00
[[package]]
name = "http"
version = "0.2.9"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "bd6effc99afb63425aff9b05836f029929e345a6148a14b7ecd5ab67af944482"
dependencies = [
"bytes",
"fnv",
"itoa",
]
[[package]]
name = "http-body"
version = "0.4.5"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "d5f38f16d184e36f2408a55281cd658ecbd3ca05cce6d6510a176eca393e26d1"
dependencies = [
"bytes",
"http",
"pin-project-lite",
]
Added Telemetry: Logging and Analytics In Chapter 4, Palmieri focuses on logging and telemetry. Axum is very different from Actix, and my first foray into trying to understand it led me to [Tower](https://docs.rs/tower/latest/tower/), the Rust community's de-facto standard for modular networking development and design. I completely short-circuited much of what the book recommended and, instead, just went with the most basic implementation possible. I added the tracing libraries as recommended by the Axum developers, and then implemented the first level of tracing as recommended by Tower: ``` sh $ cargo add --features tower-http/trace,tracing tower tower-http tracing tracing-subscriber ``` And then I updated the app startup code to include it: ``` rust pub async fn app(configuration: &Settings) -> Router { tracing_subscriber::fmt::init(); let pool = PgPoolOptions::new() .max_connections(50) .connect(&configuration.database.url()) .await .expect("could not connect to database_url"); routes().layer(Extension(pool)).layer(TraceLayer::new_for_http()) } ``` That is literally all that was needed. And the output is: ``` plaintext 2023-03-25T16:49:06.385563Z DEBUG request{method=GET uri=/ version=HTTP/1.1}: tower_http::trace::on_request: started processing request 2023-03-25T16:49:06.386270Z DEBUG request{method=GET uri=/ version=HTTP/1.1}: tower_http::trace::on_response: finished processing request latency=0 ms status=200 ``` That's not great logging, but it's a start. As I understand it, `tracing_subscriber::fmt::init()` initializes the formatter and registers it as the process-wide global default subscriber, which is why nothing in my code appears to hold onto it. The deeper Rust gets, the wilder it seems. What I did manage was to create, [as recommended by Chris Allen](https://bitemyapp.com/blog/notes-on-zero2prod-rust/), a very simple Layer that shoves a new object into the collection of data being passed around by the request. That object contains a unique UUID for the session being processed. 
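That "collection of data" is a type-keyed map (Axum calls it the request's extensions), which is why wrapping the Uuid in a newtype matters: the type itself is the lookup key. Here's a pure-std sketch of the idea; the names are mine, not Axum's:

```rust
// Sketch of a type-keyed extension map, the mechanism behind
// `req.extensions_mut().insert(...)`: values are stored and
// retrieved by their Rust type.
use std::any::{Any, TypeId};
use std::collections::HashMap;

struct Extensions(HashMap<TypeId, Box<dyn Any>>);

impl Extensions {
    fn new() -> Self {
        Extensions(HashMap::new())
    }
    fn insert<T: 'static>(&mut self, val: T) {
        // One slot per type: inserting another T would overwrite it.
        self.0.insert(TypeId::of::<T>(), Box::new(val));
    }
    fn get<T: 'static>(&self) -> Option<&T> {
        self.0.get(&TypeId::of::<T>()).and_then(|b| b.downcast_ref::<T>())
    }
}

// Stand-in for SessionId(Uuid); a String keeps the sketch std-only.
struct SessionId(String);

fn main() {
    let mut ext = Extensions::new();
    ext.insert(SessionId("d0f4a6e7".to_string()));
    // Any later handler can fish the SessionId back out by type.
    let sid = ext.get::<SessionId>().unwrap();
    assert_eq!(sid.0, "d0f4a6e7");
}
```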
Since Tokio is a multi-threaded system, having a UUID allows us to trace each individual request from beginning to end... provided I've hooked up my handlers just right. I learned most of this by reading the [Axum Session source code](https://docs.rs/axum-sessions/latest/src/axum_sessions/session.rs.html), which implements something much more complex. Since we're at a deeper level of the service handling, I need a function that takes a Request and returns a Response, and in the middle inserts a SessionId into the Request passed in; by giving the type a name, any handler can now find and use that SessionId: ``` rust // In file `session_id.rs` #[derive(Clone)] pub struct SessionId(pub Uuid); pub async fn session_id<B>(mut req: Request<B>, next: Next<B>) -> Result<Response, StatusCode> { req.extensions_mut().insert(SessionId(Uuid::new_v4())); Ok(next.run(req).await) } ``` With that, I now need to add it to the layers initialized with the app object: ```rust // In lib.rs, `pub async fn app()`: routes() .layer(Extension(pool)) .layer(TraceLayer::new_for_http()) .layer(middleware::from_fn(session_id::session_id)) ``` And with that, the SessionId is available. Since it's the outermost layer, it can now be used by anything deeper in. Let's add it to the `subscribe` function: ``` rust // In routes/subscribe.rs, `subscribe()`: pub(crate) async fn subscribe( Extension(session): Extension<SessionId>, Extension(pool): Extension<PgPool>, payload: Option<Form<NewSubscription>>, ) -> Result<(StatusCode, ()), ZTPError> { if let Some(payload) = payload { // Multi-line strings in Rust. Ugly. Would have preferred a macro. let sql = r#"INSERT INTO subscriptions (id, email, name, subscribed_at) VALUES ($1, $2, $3, $4);"#.to_string(); let subscription: Subscription = (&(payload.0)).into(); tracing::info!( "request_id {} - Adding '{}' as a new subscriber.", session.0.to_string(), subscription.name ); // ... 
``` And with that, every Request now has a strong ID associated with it: ``` plaintext 2023-03-26T22:19:23.305421Z INFO ztp::routes::subscribe: request_id d0f4a6e7-de0d-48bc-902b-713901c1d63b - Adding 'Elf M. Sternberg' as a new subscriber. ``` That's a very noisy trace; I'd like to knock it down to something more like a responsible log, or at least format it the way I like. I'm also getting incredibly noisy messages from the `sqlx::query` call, including the text of the SQL template (the `let sql = ...` line above), which I really don't need every time someone makes a request, and which is horribly formatted for principled analytics. Configuring it to return JSON turned out to be easy, although my first pass puzzled me. I had to turn `json` formatting on as a feature: ``` sh $ cargo add --features=json tracing-subscriber ``` And then it was possible to configure the format: ``` rust // in lib.rs:app() // ... let format = tracing_subscriber::fmt::format() .with_level(false) // don't include levels in formatted output .with_thread_names(true) // include the name of the current thread .json(); tracing_subscriber::fmt().event_format(format).init(); // ... ``` ``` json { "timestamp":"2023-03-26T22:53:13.091366Z", "fields": { "message":"request_id 479014e2-5f13-4e12-8401-34d8f8bf1a18 - " "Adding 'Elf M. Sternberg' as a new subscriber."}, "target":"ztp::routes::subscribe", "threadName":"tokio-runtime-worker" } ``` This pretty much concludes my week-long foray into Palmieri's book; I'm not going to worry too much about the deployment stuff, since that's part of my daytime job and I'm not interested in going over it again. Overall, this was an excellent book for teaching me many of the basics, and it provides a really good introduction into the way application servers can be written in Rust. 
I disagree with the premise that "the language doesn't mean anything to the outcome," as I've heard some people say, nor do I think using Rust is some kind of badge of honor. Instead, I think it's a mark of a responsible developer, one who can produce code that works well the first time, and with some hard thinking about how types work (and some heavy-duty exposure to Haskell), Rust development can be your first thought, not your "I need speed!" thought, when developing HTTP-based application servers.
2023-03-26 23:03:24 +00:00
[[package]]
name = "http-range-header"
version = "0.3.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "0bfe8eed0a9285ef776bb792479ea3834e8b94e13d615c2f66d03dd50a435a29"
[[package]]
name = "httparse"
version = "1.8.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "d897f394bad6a705d5f4104762e116a75639e470d80901eed05a860a95cb1904"
[[package]]
name = "httpdate"
version = "1.0.2"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "c4a1e36c821dbe04574f602848a19f742f4fb3c98d40449f11bcad18d6b17421"
[[package]]
name = "hyper"
version = "0.14.25"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "cc5e554ff619822309ffd57d8734d77cd5ce6238bc956f037ea06c58238c9899"
dependencies = [
"bytes",
"futures-channel",
"futures-core",
"futures-util",
"h2",
"http",
"http-body",
"httparse",
"httpdate",
"itoa",
"pin-project-lite",
"socket2",
"tokio",
"tower-service",
"tracing",
"want",
]
Implemented the forms reader, config, and database migrations. This chapter introduces the Actix "Extractors" for retrieving form data. I've added tests to the `./tests` folder to attempt to interact with those extractors; as of this commit, a89cbe5b, they fail because the example code isn't there. What is there is a variant of the "Hello, World!" code from the previous exercises (section 3.5), which uses the Actix extractor: ``` rust // Actix, *not* Axum. Does not work with the current framework. fn index(form: web::Form<FormData>) -> String { format!('Welcome {}!', form.username) } ``` Translated and polished into Axum, it translates to: ``` rust pub async fn index(payload: Option<Form<FormData>>) -> impl IntoResponse { let username = payload.map_or("World".to_string(), move |index| -> String { String::from(&(index.username)) }); (StatusCode::OK, format!("Hello, {}!\n", &username)) } ``` The Axum version is a little smarter, providing a default "World!" if you don't specify a name. That's what `.map_or` does, although the `or` part actually comes first in the function. So the result is: ``` sh $ curl http://localhost:3000/ Hello, World! $ curl 'http://localhost:3000/?username=Spock' Hello, Spock! ``` Which is more or less the version we want. **Section 3.7.3** then goes into some detail on the implementation of a Trait. A Trait in Rust is like an Interface in other languages; it describes a collection of functions for manipulating the values found in a defined Type. **Types**: A Type is just a description of the value: `u16` is a sixteen-bit unsigned integer; `char` is a Unicode character, and it's size is always 32bits, but a `String` is a bunch of things: it's an array of characters (which are not always `char`!), a length for that string and a capacity. If the String is manipulated to exceed the capacity, the array is re-allocated to a new capacity and the old array copied into it. 
A `Vec<String>` is an array of Strings; as a Type, it is considered to have a single value: whatever is in it at the moment you use it. **Trait**: A Trait defines a collection of one or more functions that can operate on a value. The truly nifty thing about Traits is that they can be implemented after the fact. By importing a Trait and an implementation of that trait specific to a type into a module containing that type, you can extend the behavior of a type in a deterministic way without having to modify or inherit the code, as you would in an object-oriented language. Axum has a valuable trait, `FromRequest`. For any structure you can imagine passing from the client to the server, you can implement `FromRequest` for that object and any content in the body of the message will be transformed into that structure. We've seen a trait before: `IntoResponse`, written as `impl IntoResponse`, and is the output (the return type) of many of the functions that produce return values for our application server. In this case the return type instructs Rust to look in the current lexical scope and, for the value returned by that function, determine if an `IntoResponse` trait has been defined for it. If it has, the value will be returned because Axum has now been assured that there exists a function to convert that value into something streamable and usable as an HTTP response. Fortunately for us, Axum has already implemented `FromRequest` for all the native data types, as well as some structures and arrays. Even better, it has implemented `FromRequest` for the Serde serialization/deserialization library. So in this example: ``` rust pub struct FormData { username: String, } pub async fn index(payload: Option<Form<FormData>>) -> impl IntoResponse { ... ``` A `Form` (something using the `application/x-www-form-urlencoded` protocol) of `FormData` will automatically be converted into a `payload` object of `{ username: "Spock" )`, and in this case wrapped in a `Some()` handler. 
If no form was included in the request, the payload is `None` instead.

<aside>So far, there's not too much bloat in this product; with all the debugging symbols it's 60MB or so, but stripped to the bone it's only 3.1MB, tolerable for modern deployments.</aside>

First, though, we must adjust our `valid_subscription` test:

``` rust
let body = "name=le%20guin&email=ursula_le_guin%40gmail.com";
let response = hyper::Client::new()
    .request(
        Request::builder()
            .method("POST")
            .header("content-type", "application/x-www-form-urlencoded")
            .uri(format!("http://{}/subscriptions", addr))
            .body(Body::from(body))
            .unwrap(),
    )
    .await
    .expect("Failed to execute request.");
```

Two updates from the book: first, we're sending it via POST instead of GET. This is the correct way to do things; a GET should never (and I mean *never*) cause a change of state on the back-end. To send something new that the server will process and store, you use a POST. (To update something, or to send something to a known *and unique* URI, PUT is better.) Secondly, since we're using a generic form-data object, we need to set the content-type header on the client so that the server knows how to unpack this payload. The '%20' and '%40' markers in the `body` are the URL-encoded space and `@` characters, respectively.

I completely ignored the advice in the book and went instead with [Dbmate](https://github.com/amacneil/dbmate). Dbmate is a bit cranky: your SQL must be nestled right up against the 'up' and 'down' markers in the migration file, and it seems to be quite opinionated about everything being lowercase.
That said, it was trivial to create a database with it:

``` sh
$ dbmate new create_subscriptions_table
```

This will create the file `db/migrations/20230322174957_create_subscriptions_table.sql` (the timestamp will be different, obviously), and in this file you put the following, as specified in the book:

``` sql
-- migrate:up
create table subscriptions (
    id uuid not null,
    primary key (id),
    email text not null unique,
    name text not null,
    subscribed_at timestamptz not null
);
-- migrate:down
drop table subscriptions;
```

To use Dbmate, you have to specify how to connect. I'm using Postgres, so let's start with creating a database and a user for it:

``` sh
$ sudo -u postgres psql
[sudo] password for user: ...................
postgres=# create database newsletter;
CREATE DATABASE
postgres=# create user newsletter with encrypted password 'redacted';
CREATE USER
postgres=# grant all privileges on database newsletter to newsletter;
GRANT
postgres=# exit
```

In your project root, create a `.env` file to specify your connection:

``` sh
DATABASE_URL="postgres://newsletter:redacted@127.0.0.1:5432/newsletter?sslmode=disable"
```

The `sslmode` flag there is necessary for localhost connections, as Dbmate assumes an encrypted connection by default, but we're isolating to a local connection that is, usually, safe.

With the new entry in your `.env` file, you can now run a migration:

``` sh
$ dbmate up
Writing: ./db/schema.sql
```

Running `dbmate up` will automatically create the database for you if it hasn't already; `dbmate migrate` also performs migrations, but it will not create the database.
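The application will eventually need the same connection string. A minimal std-only sketch of picking it up from the environment; `database_url` and its fallback value are hypothetical stand-ins of mine, not from the book (and a real setup would load `.env` via the shell, dbmate, or a crate such as dotenvy):

``` rust
use std::env;

/// Hypothetical helper: take whatever the environment provided,
/// or fall back to a local default for development.
fn database_url(from_env: Option<String>) -> String {
    from_env.unwrap_or_else(|| {
        "postgres://127.0.0.1:5432/newsletter?sslmode=disable".to_string()
    })
}

fn main() {
    // In real code the Option comes straight from the environment:
    let url = database_url(env::var("DATABASE_URL").ok());
    assert!(url.starts_with("postgres://"));
    println!("{}", url);
}
```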
Now you can re-connect to Postgres as the newsletter user and see what you've got:

``` sh
$ psql --user newsletter -h localhost --password
Password:
psql (14.7 (Ubuntu 14.7-0ubuntu0.22.04.1), server 11.7 (Ubuntu 11.7-0ubuntu0.19.10.1))

newsletter=> \d
              List of relations
 Schema |       Name        | Type  |   Owner
--------+-------------------+-------+------------
 public | schema_migrations | table | newsletter
 public | subscriptions     | table | newsletter
(2 rows)

newsletter=> \d subscriptions
                     Table "public.subscriptions"
    Column     |           Type           | Collation | Nullable | Default
---------------+--------------------------+-----------+----------+---------
 id            | uuid                     |           | not null |
 email         | text                     |           | not null |
 name          | text                     |           | not null |
 subscribed_at | timestamp with time zone |           | not null |
Indexes:
    "subscriptions_pkey" PRIMARY KEY, btree (id)
    "subscriptions_email_key" UNIQUE CONSTRAINT, btree (email)
```

Note that Dbmate has allocated a table to itself, `schema_migrations`, for tracking what it's done to your system and when. Try not to conflict with it, okay?

Every complex app has a configuration, and there are plenty of different ways the configuration can be specified: environment variables, internal defaults, and configuration files, the last of which comes in many different flavors. Rust has a well-known [config](https://docs.rs/config/latest/config/index.html) crate that supports all the most common formats: YAML, JSON, TOML; you can even add your own by writing something that implements the `config::Format` trait. Add it to Cargo.toml:

``` sh
$ cargo add config
```

For the time being, we're just going to create a new file, called `configuration.rs`, and put our configuration details in there. Right now we have a single configuration detail: the port. I'm going to go above and beyond Lucas here and configure some internal defaults for my code. It will have expectations.
First, you have to tell Serde that there will be default values (the struct needs to derive `Deserialize` for `try_deserialize` to work, and `#[serde(default)]` tells Serde to fall back to the `Default` implementation for missing fields):

``` rust
use config::Config;

#[derive(serde::Deserialize)]
#[serde(default)]
pub struct Settings {
    pub port: u16,
}
```

Then, you have to set those default values. Fortunately, Rust provides a "set default values" trait named, sensibly enough, `Default`:

``` rust
impl Default for Settings {
    fn default() -> Self {
        Settings { port: 3001 }
    }
}
```

Again exceeding the book's parameters, I'm going to say that if the file is missing, the default parameters should hold:

``` rust
pub(crate) fn get_configuration() -> Result<Settings, config::ConfigError> {
    Config::builder()
        .add_source(config::File::with_name("./ztd.config").required(false))
        .build()?
        .try_deserialize()
}
```

And since this is the first time I'm doing this, I'm going to write a test to assert that my understanding of how this all works is correct:

``` rust
#[cfg(test)]
mod tests {
    use super::*;

    #[test]
    fn test_for_defaults() {
        let maybe_config = get_configuration();
        assert!(maybe_config.is_ok());
        let config = maybe_config.unwrap();
        assert_eq!(config.port, 3001);
    }
}
```
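The missing-file-falls-back-to-defaults pattern can be exercised without the config crate at all; a std-only sketch, with `load_port` as a hypothetical stand-in for a config source that may be absent or unparseable:

``` rust
#[derive(Debug, PartialEq)]
struct Settings {
    port: u16,
}

impl Default for Settings {
    fn default() -> Self {
        Settings { port: 3001 }
    }
}

/// Hypothetical stand-in for "read the config file": `None` models
/// a missing (non-required) file, a bad string models a parse error.
fn load_port(source: Option<&str>) -> Settings {
    source
        .and_then(|s| s.trim().parse::<u16>().ok())
        .map(|port| Settings { port })
        .unwrap_or_default()
}

fn main() {
    assert_eq!(load_port(Some("8080")), Settings { port: 8080 });
    assert_eq!(load_port(None), Settings::default()); // missing file -> defaults
    assert_eq!(load_port(Some("not a port")), Settings::default());
    println!("{:?}", load_port(None));
}
```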
2023-03-24 14:51:19 +00:00
[[package]]
name = "iana-time-zone"
version = "0.1.54"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "0c17cc76786e99f8d2f055c11159e7f0091c42474dcc3189fbab96072e873e6d"
dependencies = [
"android_system_properties",
"core-foundation-sys",
"iana-time-zone-haiku",
"js-sys",
"wasm-bindgen",
"windows",
]
[[package]]
name = "iana-time-zone-haiku"
version = "0.1.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "0703ae284fc167426161c2e3f1da3ea71d94b21bedbcc9494e92b28e334e3dca"
dependencies = [
"cxx",
"cxx-build",
]
[[package]]
name = "idna"
version = "0.3.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "e14ddfc70884202db2244c223200c204c2bda1bc6e0998d11b5e024d657209e6"
dependencies = [
"unicode-bidi",
"unicode-normalization",
]
This is straightforward and familiar territory:

```
fn app() -> Router {
    Router::new()
        .route("/", get(anon_greet))
        .route("/:name", get(greet))
        .route("/health_check", get(health_check))
}
```

We then define a function to *run* the core server:

```
pub async fn run() {
    let addr = SocketAddr::from(([127, 0, 0, 1], 3000));
    tracing::info!("listening on {}", addr);
    axum::Server::bind(&addr)
        .serve(app().into_make_service())
        .await
        .unwrap()
}
```

And finally, in a file named `src/main.rs`, we instantiate the server. (Note that `async fn main` only compiles under the `#[tokio::main]` attribute, which wraps it in the Tokio runtime.)

```
use ztp::run;

#[tokio::main]
async fn main() {
    run().await
}
```

To make this "work," we need to define what `ztp` means, and make a distinction between the library and the CLI program. In the project root's `Cargo.toml` file, the first three sections are needed to define these relationships:

```
[package]
name = "ztp"
version = "0.1.0"
edition = "2021"

[lib]
path = "src/lib.rs"

[[bin]]
path = "src/main.rs"
name = "ztp"
```

It is the `name` field of `[package]` that defines how the `use` statement in `main.rs` will find the library. The `[[bin]]` clause defines the name of the binary when it is generated.

<aside>The double brackets around the `[[bin]]` clause are TOML's array-of-tables syntax, emphasizing that there can be more than one binary. There can be only one library crate per package, but a Cargo workspace can contain more than one package ("crate") per project.</aside>

This project should now be runnable. In one window, type:

```
$ cargo run
```

And in another, type and see the replies:

```
$ curl http://localhost:3000/
Hello, World!
$ curl http://localhost:3000/Jim
He's dead, Jim!
$ curl -v http://localhost:3000/health_check
> GET /health_check HTTP/1.1
> Host: localhost:3000
> User-Agent: curl/7.81.0
> Accept: */*
< HTTP/1.1 200 OK
< content-length: 0
< date: Tue, 21 Mar 2023 00:16:43 GMT
```

In the last command, the *verbose* flag shows us what we sent to the server, and what came back.
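For instance, a second binary would just repeat the `[[bin]]` table; the `ztp-admin` name and path here are hypothetical, not part of the project:

```
[[bin]]
path = "src/main.rs"
name = "ztp"

[[bin]]
path = "src/admin.rs"   # hypothetical second binary
name = "ztp-admin"
```

`cargo run --bin ztp-admin` would then select that target explicitly.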
We expected a "200 OK" status and a zero-length body, and that's what we got.

In order to unit-test a web server, we must spawn a copy of it in order to exercise its functions. We'll use Tokio's `spawn` function to create a new server, use hyper to request data from the server, and finally Rust's own native test asserts to check that we got what we expected.

```
mod tests {
    use super::*;
    use axum::{
        body::Body,
        http::{Request, StatusCode},
    };
    use std::net::{SocketAddr, TcpListener};

    #[tokio::test]
    async fn the_real_deal() {
        let listener = TcpListener::bind("127.0.0.1:0".parse::<SocketAddr>()
            .unwrap()).unwrap();
        let addr = listener.local_addr().unwrap();

        tokio::spawn(async move {
            axum::Server::from_tcp(listener)
                .unwrap()
                .serve(app().into_make_service())
                .await
                .unwrap();
        });

        let response = hyper::Client::new()
            .request(
                Request::builder().uri(format!("http://{}/", addr))
                    .body(Body::empty()).unwrap(),
            )
            .await
            .unwrap();

        let body = hyper::body::to_bytes(response.into_body()).await.unwrap();
        assert_eq!(&body[..], b"Hello World!\n");
    }
}
```

One interesting trick to observe in this test is the port number specified in the `TcpListener` call: it's zero. When the port is zero, the `TcpListener` asks the kernel for the first free port. Normally you'd want to know exactly what port to call the server on, but in this case both ends of the communication learn the port at runtime, and we want to ensure that the port isn't hard-coded and inconveniently already in use by someone else.
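The port-zero trick is plain std behavior and easy to verify on its own; a minimal sketch:

``` rust
use std::net::TcpListener;

fn main() {
    // Port 0 asks the kernel for the first free ephemeral port.
    let listener = TcpListener::bind("127.0.0.1:0").unwrap();
    let addr = listener.local_addr().unwrap();

    // The kernel substituted a real, nonzero port.
    assert_ne!(addr.port(), 0);

    // A second such bind gets a different free port: nothing is hard-coded,
    // so two test servers can run side by side without colliding.
    let other = TcpListener::bind("127.0.0.1:0").unwrap();
    assert_ne!(addr.port(), other.local_addr().unwrap().port());

    println!("kernel chose port {}", addr.port());
}
```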
2023-03-21 00:31:39 +00:00
[[package]]
name = "indexmap"
version = "1.9.2"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "1885e79c1fc4b10f0e172c475f458b7f7b93061064d98c3293e98c5ba0c8b399"
dependencies = [
"autocfg",
"hashbrown",
]
[[package]]
name = "instant"
version = "0.1.12"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "7a5bbe824c507c5da5956355e86a746d82e0e1464f65d862cc5e71da70e94b2c"
dependencies = [
"cfg-if",
]
[[package]]
name = "io-lifetimes"
version = "1.0.9"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "09270fd4fa1111bc614ed2246c7ef56239a3063d5be0d1ec3b589c505d400aeb"
dependencies = [
"hermit-abi 0.3.1",
"libc",
"windows-sys 0.45.0",
]
[[package]]
name = "itertools"
version = "0.10.5"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "b0fd2260e829bddf4cb6ea802289de2f86d6a7a690192fbe91b3f46e0f2c8473"
dependencies = [
"either",
]
[[package]]
name = "itoa"
version = "1.0.6"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "453ad9f582a441959e5f0d088b02ce04cfe8d51a8eaf077f12ac6d3e94164ca6"
Implemented the forms reader, config, and database migrations. This chapter introduces the Actix "Extractors" for retrieving form data. I've added tests to the `./tests` folder to attempt to interact with those extractors; as of this commit, a89cbe5b, they fail because the example code isn't there. What is there is a variant of the "Hello, World!" code from the previous exercises (section 3.5), which uses the Actix extractor: ``` rust // Actix, *not* Axum. Does not work with the current framework. fn index(form: web::Form<FormData>) -> String { format!("Welcome {}!", form.username) } ``` Translated into Axum and polished a little, it becomes: ``` rust pub async fn index(payload: Option<Form<FormData>>) -> impl IntoResponse { let username = payload.map_or("World".to_string(), move |index| -> String { String::from(&(index.username)) }); (StatusCode::OK, format!("Hello, {}!\n", &username)) } ``` The Axum version is a little smarter, providing a default "World" if you don't specify a name. That's what `.map_or` does, although the `or` part actually comes first in the argument list. So the result is: ``` sh $ curl http://localhost:3000/ Hello, World! $ curl 'http://localhost:3000/?username=Spock' Hello, Spock! ``` Which is more or less the version we want. **Section 3.7.3** then goes into some detail on the implementation of a Trait. A Trait in Rust is like an Interface in other languages; it describes a collection of functions for manipulating the values found in a defined Type. **Types**: A Type is just a description of the value: `u16` is a sixteen-bit unsigned integer; `char` is a Unicode scalar value, always 32 bits in size; but a `String` is a bunch of things: a buffer of UTF-8 bytes (which is not the same as an array of `char`s!), a length for that string, and a capacity. If the String is manipulated to exceed the capacity, a larger buffer is allocated and the old contents are copied into it. 
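The `.map_or` call is plain `Option` machinery and can be sketched without Axum at all; here a bare `Option<String>` stands in for the extracted form:

``` rust
// map_or takes the default (the "or") first, then the mapping closure.
fn greet(payload: Option<String>) -> String {
    let username = payload.map_or("World".to_string(), |name| name);
    format!("Hello, {}!\n", username)
}

fn main() {
    assert_eq!(greet(None), "Hello, World!\n");
    assert_eq!(greet(Some("Spock".to_string())), "Hello, Spock!\n");
    println!("ok");
}
```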
A `Vec<String>` is an array of Strings; as a Type, it is considered to have a single value: whatever is in it at the moment you use it. **Trait**: A Trait defines a collection of one or more functions that can operate on a value. The truly nifty thing about Traits is that they can be implemented after the fact. By importing a Trait and an implementation of that trait specific to a type into a module containing that type, you can extend the behavior of a type in a deterministic way without having to modify or inherit the code, as you would in an object-oriented language. Axum has a valuable trait, `FromRequest`. For any structure you can imagine passing from the client to the server, you can implement `FromRequest` for that structure, and any content in the body of the message will be transformed into it. We've seen a trait before: `IntoResponse`, written as `impl IntoResponse`, is the output (the return type) of many of the functions that produce return values for our application server. In this case the return type instructs Rust to look in the current lexical scope and determine whether an `IntoResponse` implementation has been defined for the value returned by that function. If it has, the value can be returned, because Axum has been assured that there exists a function to convert that value into something streamable and usable as an HTTP response. Fortunately for us, Axum has already implemented `FromRequest` for all the native data types, as well as some structures and arrays. Even better, it has implemented it for types handled by the Serde serialization/deserialization library. So in this example (the struct needs a Serde `Deserialize` derive for the extractor to work): ``` rust #[derive(serde::Deserialize)] pub struct FormData { username: String, } pub async fn index(payload: Option<Form<FormData>>) -> impl IntoResponse { ... ``` A `Form` (a body using the `application/x-www-form-urlencoded` content type) of `FormData` will automatically be converted into a `payload` object of `{ username: "Spock" }`, in this case wrapped in `Some()`. 
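The implement-after-the-fact property is easy to demonstrate with plain Rust. A minimal sketch, using a made-up `Describe` trait (nothing here is Axum's API), bolted onto types we did not write:

``` rust
// A trait we define ourselves...
trait Describe {
    fn describe(&self) -> String;
}

// ...and implement for existing types, without touching their source.
impl Describe for u16 {
    fn describe(&self) -> String {
        format!("a sixteen-bit unsigned integer: {}", self)
    }
}

impl Describe for String {
    fn describe(&self) -> String {
        format!("a string of length {}: {:?}", self.len(), self)
    }
}

fn main() {
    assert_eq!(3000u16.describe(), "a sixteen-bit unsigned integer: 3000");
    assert_eq!(
        String::from("Spock").describe(),
        "a string of length 5: \"Spock\""
    );
    println!("ok");
}
```

Any module that can see both the trait and the impl gets the new method; this is exactly how importing `IntoResponse` or `FromRequest` lights up methods on ordinary types.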
(Or `None`, if there was no form included.) <aside>So far, there's not too much bloat in this product; with all the debugging symbols it's 60MB or so, but stripped to the bone it's only 3.1MB, tolerable for modern deployments.</aside> First, though, we must adjust our `valid_subscription` test: ``` rust let body = "name=le%20guin&email=ursula_le_guin%40gmail.com"; let response = hyper::Client::new() .request( Request::builder() .method("POST") .header("content-type", "application/x-www-form-urlencoded") .uri(format!("http://{}/subscriptions", addr)) .body(Body::from(body)) .unwrap(), ) .await .expect("Failed to execute request."); ``` Two updates from the book: first, we're sending the data via POST instead of GET. This is the correct way to do things; a GET should never (and I mean *never*) cause a change of state on the back-end. To send something new that the server will process and store, you use a POST. (To update something, or to send something to a known *and unique* URI, PUT is better.) Secondly, since we're using a generic form-data object, we need to set the content-type on the client so that the server knows how to unpack the payload. The '%20' and '%40' markers in the `body` are the URL-encoded space and `@`, respectively. I completely ignored the advice in the book and went instead with [Dbmate](https://github.com/amacneil/dbmate). Dbmate is a bit cranky: your SQL must be nestled right up against the 'up' and 'down' markers in the migration file, and it seems to be quite opinionated about everything being lowercase. 
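Those percent-escapes follow a simple rule: `%` plus two hex digits encodes one byte. A hand-rolled decoder is only a few lines (an ASCII-only sketch for illustration; production code should lean on an established crate):

``` rust
// Decode %XX escapes in an x-www-form-urlencoded value (ASCII-only sketch).
fn percent_decode(input: &str) -> String {
    let bytes = input.as_bytes();
    let mut out = Vec::with_capacity(bytes.len());
    let mut i = 0;
    while i < bytes.len() {
        if bytes[i] == b'%' && i + 2 < bytes.len() {
            // Try to read the two hex digits after the '%'.
            if let Ok(byte) = u8::from_str_radix(&input[i + 1..i + 3], 16) {
                out.push(byte);
                i += 3;
                continue;
            }
        }
        // Not an escape: copy the byte through unchanged.
        out.push(bytes[i]);
        i += 1;
    }
    String::from_utf8(out).expect("valid UTF-8")
}

fn main() {
    assert_eq!(percent_decode("le%20guin"), "le guin");
    assert_eq!(
        percent_decode("ursula_le_guin%40gmail.com"),
        "ursula_le_guin@gmail.com"
    );
    println!("ok");
}
```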
That said, it was trivial to create a database with it: ``` sh $ dbmate new create_subscriptions_table ``` This will create the file `db/migrations/20230322174957_create_subscriptions_table.sql` (the timestamp will be different, obviously), and in this file you put the following, as specified in the book: ``` sql -- migrate:up create table subscriptions ( id uuid not null, primary key (id), email text not null unique, name text not null, subscribed_at timestamptz not null ); -- migrate:down drop table subscriptions; ``` To use Dbmate, you have to specify how to connect. I'm using Postgres, so let's start with creating a database and a user for it: ``` sh $ sudo -u postgres psql [sudo] password for user: ................... postgres=# create database newsletter; CREATE DATABASE postgres=# create user newsletter with encrypted password 'redacted'; CREATE USER postgres=# grant all privileges on database newsletter to newsletter; GRANT postgres=# exit ``` In your project root, create a `.env` file to specify your connection: ``` sh DATABASE_URL="postgres://newsletter:redacted@127.0.0.1:5432/newsletter?sslmode=disable" ``` The `sslmode` flag there is necessary for localhost connections, as Dbmate assumes an encrypted connection by default, but we're isolating to a local connection that is, usually, safe. With the new entry in your `.env` file, you can now run a migration: ``` sh $ dbmate up Writing: ./db/schema.sql ``` Running `dbmate up` will automatically create the database for you if it hasn't already; `dbmate migrate` also performs migrations, but it will not create the database. 
Now you can re-connect to Postgres as the newsletter user and see what you've got: ``` sh $ psql --user newsletter -h localhost --password Password: psql (14.7 (Ubuntu 14.7-0ubuntu0.22.04.1), server 11.7 (Ubuntu 11.7-0ubuntu0.19.10.1)) newsletter=> \d List of relations Schema | Name | Type | Owner --------+-------------------+-------+------------ public | schema_migrations | table | newsletter public | subscriptions | table | newsletter (2 rows) newsletter=> \d subscriptions Table "public.subscriptions" Column | Type | Collation | Nullable | Default ---------------+--------------------------+-----------+----------+--------- id | uuid | | not null | email | text | | not null | name | text | | not null | subscribed_at | timestamp with time zone | | not null | Indexes: "subscriptions_pkey" PRIMARY KEY, btree (id) "subscriptions_email_key" UNIQUE CONSTRAINT, btree (email) ``` Note that Dbmate has allocated a table to itself, `schema_migrations`, for tracking what it's done to your system and when. Try not to conflict with it, okay? Every complex app has a configuration, and there are plenty of ways that configuration can be specified: environment variables, internal defaults, and configuration files, the last of which come in many different flavors. Rust has a well-known [config](https://docs.rs/config/latest/config/index.html) crate that supports all the most common formats: YAML, JSON, TOML; you can even add your own by writing something that implements the `config::Format` trait. Add it to Cargo.toml: ``` sh $ cargo add config ``` For now, we're just going to create a new file, called `configuration.rs`, and put our configuration details in there. Right now we have a single configuration detail: the port. I'm going to go above and beyond Luca here and configure some internal defaults for my code, so it behaves sensibly even when no configuration file is present. 
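For reference, here is what such a configuration file might contain. This is a hypothetical TOML version (the config crate also accepts YAML, JSON, and others); if the file is absent entirely, the internal defaults hold:

``` toml
# Hypothetical ztd.config.toml: overrides the built-in default port.
port = 8080
```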
First, you have to tell Serde that the struct is deserializable and that there will be default values: ``` rust use config::Config; #[derive(serde::Deserialize)] #[serde(default)] pub struct Settings { pub port: u16, } ``` Then, you have to set those default values. Fortunately, Rust provides a "set default values" trait named, sensibly enough, Default: ``` rust impl Default for Settings { fn default() -> Self { Settings { port: 3001 } } } ``` Again, exceeding the book's parameters, I'm going to say that if the file is missing, the default parameters should hold: ``` rust pub(crate) fn get_configuration() -> Result<Settings, config::ConfigError> { Config::builder() .add_source(config::File::with_name("./ztd.config").required(false)) .build()? .try_deserialize() } ``` And since this is the first time I'm doing this, I'm going to write a test to assert that my understanding of how this all works is correct: ``` rust mod tests { use super::*; #[test] fn test_for_defaults() { let maybe_config = get_configuration(); assert!(maybe_config.is_ok()); let config = maybe_config.unwrap(); assert_eq!(config.port, 3001); } } ```
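`Default` also composes with Rust's struct-update syntax, which will come in handy as `Settings` grows. A std-only sketch (the extra `host` field is hypothetical, not part of the real struct):

``` rust
#[derive(Debug, PartialEq)]
struct Settings {
    port: u16,
    host: String, // hypothetical second field, for illustration only
}

impl Default for Settings {
    fn default() -> Self {
        Settings { port: 3001, host: "127.0.0.1".to_string() }
    }
}

fn main() {
    // Take the defaults wholesale...
    let defaults = Settings::default();
    assert_eq!(defaults.port, 3001);

    // ...or override one field and fill the rest from Default.
    let custom = Settings { port: 8080, ..Default::default() };
    assert_eq!(custom.port, 8080);
    assert_eq!(custom.host, "127.0.0.1");
    println!("ok");
}
```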
2023-03-24 14:51:19 +00:00
[[package]]
name = "js-sys"
version = "0.3.61"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "445dde2150c55e483f3d8416706b97ec8e8237c307e5b7b4b8dd15e6af2a0730"
dependencies = [
"wasm-bindgen",
]
[[package]]
name = "json5"
version = "0.4.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "96b0db21af676c1ce64250b5f40f3ce2cf27e4e47cb91ed91eb6fe9350b430c1"
dependencies = [
"pest",
"pest_derive",
"serde",
]
[[package]]
name = "lazy_static"
version = "1.4.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "e2abad23fbc42b3700f2f279844dc832adb2b2eb069b2df918f455c4e18cc646"
[[package]]
name = "libc"
version = "0.2.140"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "99227334921fae1a979cf0bfdfcc6b3e5ce376ef57e16fb6fb3ea2ed6095f80c"
[[package]]
name = "link-cplusplus"
version = "1.0.8"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "ecd207c9c713c34f95a097a5b029ac2ce6010530c7b49d7fea24d977dede04f5"
dependencies = [
"cc",
]
[[package]]
name = "linked-hash-map"
version = "0.5.6"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "0717cef1bc8b636c6e1c1bbdefc09e6322da8a9321966e8928ef80d20f7f770f"
[[package]]
name = "linux-raw-sys"
version = "0.1.4"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "f051f77a7c8e6957c0696eac88f26b0117e54f52d3fc682ab19397a8812846a4"
Zero-to-Production Rust, up to Chapter 3.7. Since this book is about learning Rust, primarily in a microservices environment, this chapter focuses on installing Rust and describing the tools available to the developer. The easiest way to install Rust is to install the [Rustup](https://rustup.rs/) tool. It is one of those blind-trust-in-the-safety-of-the-toolchain things. For Linux and Mac users, the command is a shell script that installs to a user's local account: ``` $ curl --proto '=https' --tlsv1.2 -sSf https://sh.rustup.rs | sh ``` Once installed, you can install Rust itself: ``` $ rustup install toolchain stable ``` You should now have Rust compiler and the Rust build and packaging tool, known as Cargo: ``` $ rustc --version rustc 1.68.0 (2c8cc3432 2023-03-06) $ cargo --version cargo 1.68.0 (115f34552 2023-02-26) ``` I also installed the following tools: ``` $ rustup component add clippy rust-src rust-docs $ cargo install rustfmt rust-analyzer ``` - clippy: A powerful linter that provides useful advice above and beyond the compiler's basic error checking. - rustfmt: A formatting tool that provides a common format for most developers - rust-analyzer: For your IDE, rust-analyzer provides the LSP (Language Server Protocol) for Rust, giving you code completion, on-the-fly error definition, and other luxuries. Zero-to-Production's project is writing a web service that signs people up for an email newsletter. The first task in the book is to set up a "Hello World!" application server. The book uses the [Actix-web](https://actix.rs/) web framework, but I've chosen to implement it using [Axum](https://github.com/tokio-rs/axum) server, the default server provided by the [Tokio](https://github.com/tokio-rs/tokio) asynchronous runtime. Although the book is only two years old, it is already out-of-date with respect to some commands. `cargo add` is now provided by default. 
The following commands installed the tools I'll be using: ``` cargo add --features tokio/full --features hyper/full tokio hyper \ axum tower tracing tracing-subscriber ``` - axum: The web server framework for Tokio. - tokio: The Rust asynchronous runtime. Has single-threaded (select) and multi-threaded variants. - [hyper](https://hyper.rs/): An HTTPS request/response library, used for testing. - [tracing](https://crates.io/crates/tracing): A debugging library that works with Tokio. We start by defining the core services. In the book, they're a greeter ("Hello, World"), a greeter with a parameter ("Hello, {name}"), and a health check (returns a HTTP 200 Code, but no body). Actix-web hands a generic Request and expects a generic request, but Axum is more straightforward, providing `IntoResponse` handlers for most of the basic Rust types, as well as some for formats via Serde, Rust's standard serializing/deserializing library for converting data from one format to another. All of these go into `src/lib.rs`: ``` async fn health_check() -> impl IntoResponse { (StatusCode::OK, ()) } async fn anon_greet() -> &'static str { "Hello World!\n" } async fn greet(Path(name): Path<String>) -> impl IntoResponse { let greeting = String::from("He's dead, ") + name.as_str(); let greeting = greeting + &String::from("!\n"); (StatusCode::OK, greeting) } ``` <aside>Axum's documentation says to [avoid using `impl IntoResponse`](https://docs.rs/axum/latest/axum/response/index.html#regarding-impl-intoresponse) until you understand how it really works, as it can result in confusing issues when chaining response handlers, when a handler can return multiple types, or when a handler can return either a type or a [`Result<T, E>`](https://doc.rust-lang.org/std/result/), especially one with an error.</aside> We then define the routes that our server will recognize. 
This is straightforward and familiar territory:

```
fn app() -> Router {
    Router::new()
        .route("/", get(anon_greet))
        .route("/:name", get(greet))
        .route("/health_check", get(health_check))
}
```

We then define a function to *run* the core server:

```
pub async fn run() {
    let addr = SocketAddr::from(([127, 0, 0, 1], 3000));
    tracing::info!("listening on {}", addr);
    axum::Server::bind(&addr)
        .serve(app().into_make_service())
        .await
        .unwrap()
}
```

And finally, in a file named `src/main.rs`, we instantiate the server:

```
use ztp::run;

#[tokio::main]
async fn main() {
    run().await
}
```

To make this "work," we need to define what `ztp` means, and make a distinction between the library and the CLI program. In the project root's `Cargo.toml` file, the first three sections define these relationships:

```
[package]
name = "ztp"
version = "0.1.0"
edition = "2021"

[lib]
path = "src/lib.rs"

[[bin]]
path = "src/main.rs"
name = "ztp"
```

It is the `name` field of the `[package]` section that defines how the `use` statement in `main.rs` will find the library. The `[[bin]]` clause defines the name of the binary when it is generated.

<aside>The double brackets around the `[[bin]]` clause tell the TOML parser that there can be more than one binary. A package can have at most one library crate, but it may contain multiple crates in total: the library plus any number of binaries.</aside>

This project should now be runnable. In one window, type:

```
$ cargo run
```

And in another, issue requests and see the replies:

```
$ curl http://localhost:3000/
Hello World!
$ curl http://localhost:3000/Jim
He's dead, Jim!
$ curl -v http://localhost:3000/health_check
> GET /health_check HTTP/1.1
> Host: localhost:3000
> User-Agent: curl/7.81.0
> Accept: */*
< HTTP/1.1 200 OK
< content-length: 0
< date: Tue, 21 Mar 2023 00:16:43 GMT
```

In the last command, the *verbose* flag shows us what we sent to the server, and what came back.
We expected a "200 OK" flag and a zero-length body, and that's what we got. In order to unit-test a web server, we must spawn a copy of it in order to exercise its functions. We'll use Tokio's `spawn` function to create a new server, use hyper to request data from the server, and finally Rust's own native test asserts to check that we got what we expected. ``` mod tests { use super::*; use axum::{ body::Body, http::{Request, StatusCode}, }; use std::net::{SocketAddr, TcpListener}; #[tokio::test] async fn the_real_deal() { let listener = TcpListener::bind("127.0.0.1:0".parse::<SocketAddr>() .unwrap()).unwrap(); let addr = listener.local_addr().unwrap(); tokio::spawn(async move { axum::Server::from_tcp(listener) .unwrap()serve(app().into_make_service()).await.unwrap(); }); let response = hyper::Client::new() .request( Request::builder().uri(format!("http://{}/", addr)) .body(Body::empty()).unwrap(), ) .await .unwrap(); let body = hyper::body::to_bytes(response.into_body()).await.unwrap(); assert_eq!(&body[..], b"Hello World!\n"); } } ``` One interesting trick to observe in this testing is the port number specified in the `TcpListener` call. It's zero. When the port is zero, the `TcpListener` will request from the kernel the first-free-port. Normally, you'd want to know exactly what port to call the server on, but in this case both ends of the communication are aware of the port to use and we want to ensure that port isn't hard-coded and inconveniently already in-use by someone else.
2023-03-21 00:31:39 +00:00
[[package]]
name = "lock_api"
version = "0.4.9"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "435011366fe56583b16cf956f9df0095b405b82d76425bc8981c0e22e60ec4df"
dependencies = [
"autocfg",
"scopeguard",
]
[[package]]
name = "log"
version = "0.4.17"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "abb12e687cfb44aa40f41fc3978ef76448f9b6038cad6aef4259d3c095a2382e"
dependencies = [
"cfg-if",
]
[[package]]
name = "matchit"
version = "0.7.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "b87248edafb776e59e6ee64a79086f65890d3510f2c656c000bf2a7e8a0aea40"
Implemented the forms reader, config, and database migrations.

This chapter introduces the Actix "Extractors" for retrieving form data. I've added tests to the `./tests` folder to attempt to interact with those extractors; as of this commit, a89cbe5b, they fail because the example code isn't there. What is there is a variant of the "Hello, World!" code from the previous exercises (section 3.5), which uses the Actix extractor:

``` rust
// Actix, *not* Axum. Does not work with the current framework.
fn index(form: web::Form<FormData>) -> String {
    format!("Welcome {}!", form.username)
}
```

Translated into Axum, and polished a little, it becomes:

``` rust
pub async fn index(payload: Option<Form<FormData>>) -> impl IntoResponse {
    let username = payload.map_or("World".to_string(), move |index| -> String {
        String::from(&(index.username))
    });
    (StatusCode::OK, format!("Hello, {}!\n", &username))
}
```

The Axum version is a little smarter, providing a default "World" if you don't specify a name. That's what `.map_or` does, although the `or` part (the default) actually comes first in the argument list. So the result is:

``` sh
$ curl http://localhost:3000/
Hello, World!
$ curl 'http://localhost:3000/?username=Spock'
Hello, Spock!
```

Which is more or less the version we want.

**Section 3.7.3** then goes into some detail on the implementation of a Trait. A Trait in Rust is like an Interface in other languages; it describes a collection of functions for manipulating the values found in a defined Type.

**Types**: A Type is just a description of a value: `u16` is a sixteen-bit unsigned integer; `char` is a Unicode scalar value, and its size is always 32 bits; but a `String` is a bunch of things: a growable buffer of UTF-8 bytes (which is not simply an array of `char`s!), a length for that string, and a capacity. If the String is manipulated to exceed the capacity, the buffer is re-allocated with a new capacity and the old contents are copied into it.
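That length/capacity behavior is easy to watch in a few lines of std-only Rust (the capacities shown are minimums; the allocator may round up):

```rust
fn main() {
    // A String tracks its length and its capacity separately.
    let mut s = String::with_capacity(4);
    assert_eq!(s.len(), 0);
    assert!(s.capacity() >= 4);

    // Pushing 13 bytes exceeds the original capacity,
    // so the buffer is re-allocated and the old bytes copied over.
    s.push_str("Hello, world!");
    assert_eq!(s.len(), 13);
    assert!(s.capacity() >= 13);
}
```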
A `Vec<String>` is a growable array of Strings; as a Type, it is considered to have a single value: whatever is in it at the moment you use it.

**Trait**: A Trait defines a collection of one or more functions that can operate on a value. The truly nifty thing about Traits is that they can be implemented after the fact. By importing a Trait and an implementation of that trait specific to a type into a module containing that type, you can extend the behavior of the type in a deterministic way, without having to modify or inherit the code as you would in an object-oriented language.

Axum has a valuable trait, `FromRequest`. For any structure you can imagine passing from the client to the server, you can implement `FromRequest` for that structure, and any content in the body of the message will be transformed into it.

We've seen a trait before: `IntoResponse`, written as `impl IntoResponse`, is the output (the return type) of many of the functions that produce return values for our application server. In this case the return type instructs Rust to look in the current lexical scope and determine whether an `IntoResponse` implementation has been defined for the value returned by that function. If it has, the value can be returned, because Axum is now assured that there exists a function to convert that value into something streamable and usable as an HTTP response.

Fortunately for us, Axum has already implemented `FromRequest` for all the native data types, as well as some structures and arrays. Even better, it has implemented `FromRequest` support for the Serde serialization/deserialization library. So in this example:

``` rust
#[derive(serde::Deserialize)]
pub struct FormData {
    username: String,
}

pub async fn index(payload: Option<Form<FormData>>) -> impl IntoResponse {
    ...
```

A `Form` (a body using the `application/x-www-form-urlencoded` encoding) of `FormData` will automatically be converted into a `payload` object of `{ username: "Spock" }`, and in this case wrapped in a `Some()`.
(Or `None`, if there was no form included.)

<aside>So far, there's not too much bloat in this product: with all the debugging symbols, it's 60MB or so, but stripped to the bone it's only 3.1MB, tolerable for modern deployments.</aside>

First, though, we must adjust our `valid_subscription` test:

``` rust
let body = "name=le%20guin&email=ursula_le_guin%40gmail.com";
let response = hyper::Client::new()
    .request(
        Request::builder()
            .method("POST")
            .header("content-type", "application/x-www-form-urlencoded")
            .uri(format!("http://{}/subscriptions", addr))
            .body(Body::from(body))
            .unwrap(),
    )
    .await
    .expect("Failed to execute request.");
```

Two updates from the book: first, we're sending the data via POST instead of GET. This is the correct way to do things; a GET should never (and I mean *never*) cause a change of state on the back-end. To send something new that the server will process and store, you use a POST. (To update something, or to send something to a known *and unique* URI, PUT is better.) Secondly, since we're using a generic form-data object, we need to set the content-type header on the client so that the server knows how to unpack the payload. The '%20' and '%40' markers in the `body` are the URL-encoded space and `@` characters, respectively.

I completely ignored the advice in the book and went with [Dbmate](https://github.com/amacneil/dbmate) instead. Dbmate is a bit cranky: your SQL must be nestled right up against the 'up' and 'down' markers in the migration file, and it seems to be quite opinionated about everything being lowercase.
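(An aside: those two escapes are easy to sanity-check in std-only Rust. `decode` below is a toy covering just `%20` and `%40`, not a real urlencoded parser — a server would use a full decoder.)

```rust
// Toy decoder: handles only the two escapes used in the test body.
fn decode(s: &str) -> String {
    s.replace("%20", " ").replace("%40", "@")
}

fn main() {
    let body = "name=le%20guin&email=ursula_le_guin%40gmail.com";
    assert_eq!(decode(body), "name=le guin&email=ursula_le_guin@gmail.com");
}
```

That settled, back to Dbmate.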
That said, it was trivial to create a database with it:

``` sh
$ dbmate new create_subscriptions_table
```

This will create the file `db/migrations/20230322174957_create_subscriptions_table.sql` (the timestamp will be different, obviously), and in this file you put the following, as specified in the book:

``` sql
-- migrate:up
create table subscriptions (
    id uuid not null,
    primary key (id),
    email text not null unique,
    name text not null,
    subscribed_at timestamptz not null
);

-- migrate:down
drop table subscriptions;
```

To use Dbmate, you have to specify how to connect. I'm using Postgres, so let's start with creating a database and a user for it:

``` sh
$ sudo -u postgres psql
[sudo] password for user: ...................
postgres=# create database newsletter;
CREATE DATABASE
postgres=# create user newsletter with encrypted password 'redacted';
CREATE USER
postgres=# grant all privileges on database newsletter to newsletter;
GRANT
postgres=# exit
```

In your project root, create a `.env` file to specify your connection:

``` sh
DATABASE_URL="postgres://newsletter:redacted@127.0.0.1:5432/newsletter?sslmode=disable"
```

The `sslmode` flag is necessary for localhost connections, as Dbmate assumes an encrypted connection by default, but we're isolating ourselves to a local connection that is, usually, safe.

With the new entry in your `.env` file, you can now run a migration:

``` sh
$ dbmate up
Writing: ./db/schema.sql
```

Running `dbmate up` will automatically create the database for you if it doesn't already exist; `dbmate migrate` also performs migrations, but it will not create the database.
Now you can re-connect to Postgres as the newsletter user and see what you've got:

``` sh
$ psql --user newsletter -h localhost --password
Password:
psql (14.7 (Ubuntu 14.7-0ubuntu0.22.04.1), server 11.7 (Ubuntu 11.7-0ubuntu0.19.10.1))

newsletter=> \d
                List of relations
 Schema |       Name        | Type  |   Owner
--------+-------------------+-------+------------
 public | schema_migrations | table | newsletter
 public | subscriptions     | table | newsletter
(2 rows)

newsletter=> \d subscriptions
                      Table "public.subscriptions"
    Column     |           Type           | Collation | Nullable | Default
---------------+--------------------------+-----------+----------+---------
 id            | uuid                     |           | not null |
 email         | text                     |           | not null |
 name          | text                     |           | not null |
 subscribed_at | timestamp with time zone |           | not null |
Indexes:
    "subscriptions_pkey" PRIMARY KEY, btree (id)
    "subscriptions_email_key" UNIQUE CONSTRAINT, btree (email)
```

Note that Dbmate has allocated a table to itself, `schema_migrations`, for tracking what it's done to your system and when. Try not to conflict with it, okay?

Every complex app has a configuration, and there are plenty of ways a configuration can be specified: environment variables, internal defaults, and configuration files, the last of which come in many different flavors. Rust has a well-known [config](https://docs.rs/config/latest/config/index.html) crate that supports all the most common formats: YAML, JSON, TOML; you can even add your own by writing something that implements the `config::Format` trait. Add it to Cargo.toml:

``` sh
$ cargo add config
```

For the meantime, we're just going to create a new file, called `configuration.rs`, and put our configuration details in there. Right now we have a single configuration detail: the port. I'm going to go above and beyond Lucas here and configure some internal defaults for my code. It will have expectations.
First, you have to tell Serde that the structure is deserializable and that there will be default values:

``` rust
use config::Config;
use serde::Deserialize;

#[derive(Deserialize)]
#[serde(default)]
pub struct Settings {
    pub port: u16,
}
```

Then, you have to set those default values. Fortunately, Rust provides a "set default values" trait named, sensibly enough, `Default`:

``` rust
impl Default for Settings {
    fn default() -> Self {
        Settings { port: 3001 }
    }
}
```

Again exceeding the book's parameters, I'm going to say that if the configuration file is missing, the default values should hold:

``` rust
pub(crate) fn get_configuration() -> Result<Settings, config::ConfigError> {
    Config::builder()
        .add_source(config::File::with_name("./ztd.config").required(false))
        .build()?
        .try_deserialize()
}
```

And since this is the first time I'm doing this, I'm going to write a test to assert that my understanding of how this all works is correct:

``` rust
mod tests {
    use super::*;

    #[test]
    fn test_for_defaults() {
        let maybe_config = get_configuration();
        assert!(!maybe_config.is_err());
        let config = maybe_config.unwrap();
        assert_eq!(config.port, 3001);
    }
}
```
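One nicety of implementing `Default`: callers can override a single field and take the rest from the defaults via struct-update syntax. A std-only sketch using the same `Settings` shape — the extra `host` field is hypothetical, not in the project yet:

```rust
#[derive(Debug, PartialEq)]
struct Settings {
    port: u16,
    host: String, // hypothetical extra field, for illustration only
}

impl Default for Settings {
    fn default() -> Self {
        Settings { port: 3001, host: "127.0.0.1".to_string() }
    }
}

fn main() {
    // Override just the port; `host` comes from Default.
    let custom = Settings { port: 8080, ..Default::default() };
    assert_eq!(custom.port, 8080);
    assert_eq!(custom.host, "127.0.0.1");
}
```

This is why deriving or implementing `Default` pays off as the settings struct grows.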
2023-03-24 14:51:19 +00:00
[[package]]
name = "md-5"
version = "0.10.5"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "6365506850d44bff6e2fbcb5176cf63650e48bd45ef2fe2665ae1570e0f4b9ca"
dependencies = [
"digest",
]
[[package]]
name = "memchr"
version = "2.5.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "2dffe52ecf27772e601905b7522cb4ef790d2cc203488bbd0e2fe85fcb74566d"
[[package]]
name = "mime"
version = "0.3.17"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "6877bb514081ee2a7ff5ef9de3281f14a4dd4bceac4c09388074a6b5df8a139a"
[[package]]
name = "minimal-lexical"
version = "0.2.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "68354c5c6bd36d73ff3feceb05efa59b6acb7626617f4962be322a825e61f79a"
This is straightforward and familiar territory:

```
fn app() -> Router {
    Router::new()
        .route("/", get(anon_greet))
        .route("/:name", get(greet))
        .route("/health_check", get(health_check))
}
```

We then define a function to *run* the core server:

```
pub async fn run() {
    let addr = SocketAddr::from(([127, 0, 0, 1], 3000));
    tracing::info!("listening on {}", addr);
    axum::Server::bind(&addr)
        .serve(app().into_make_service())
        .await
        .unwrap()
}
```

And finally, in a file named `src/main.rs`, we instantiate the server. Note the `#[tokio::main]` attribute: a bare `async fn main` won't compile, since something has to start the async runtime:

```
use ztp::run;

#[tokio::main]
async fn main() {
    run().await
}
```

To make this "work," we need to define what `ztp` means, and make a distinction between the library and the CLI program. In the project root's `Cargo.toml` file, the first three sections define these relationships:

```
[package]
name = "ztp"
version = "0.1.0"
edition = "2021"

[lib]
path = "src/lib.rs"

[[bin]]
path = "src/main.rs"
name = "ztp"
```

It is the `[package.name]` field that defines how the `use` statement in `main.rs` will find the library. The `[[bin]]` clause defines the name of the binary when it is generated.

<aside>The double brackets around the `[[bin]]` clause are there to tell the TOML parser that there can be more than one binary. There can be only one library per package, but it is possible for a Rust project to contain more than one package, called "crates."</aside>

This project should now be runnable. In one window, type:

```
$ cargo run
```

And in another, type and see the replies:

```
$ curl http://localhost:3000/
Hello World!
$ curl http://localhost:3000/Jim
He's dead, Jim!
$ curl -v http://localhost:3000/health_check
> GET /health_check HTTP/1.1
> Host: localhost:3000
> User-Agent: curl/7.81.0
> Accept: */*
< HTTP/1.1 200 OK
< content-length: 0
< date: Tue, 21 Mar 2023 00:16:43 GMT
```

In the last command, the *verbose* flag shows us what we sent to the server, and what came back.
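As the aside about `[[bin]]` says, the double brackets declare a TOML array of tables, so a second binary target can sit alongside the first. A hypothetical example (the `ztp-admin` name and path are invented for illustration):

``` toml
# A second, hypothetical binary target in Cargo.toml;
# it would be built and run with `cargo run --bin ztp-admin`.
[[bin]]
path = "src/bin/admin.rs"
name = "ztp-admin"
```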
We expected a "200 OK" status and a zero-length body, and that's what we got.

In order to unit-test a web server, we must spawn a copy of it to exercise its functions. We'll use Tokio's `spawn` function to create a new server, use hyper to request data from the server, and finally Rust's own native test asserts to check that we got what we expected.

```
mod tests {
    use super::*;
    use axum::{
        body::Body,
        http::{Request, StatusCode},
    };
    use std::net::{SocketAddr, TcpListener};

    #[tokio::test]
    async fn the_real_deal() {
        let listener = TcpListener::bind("127.0.0.1:0".parse::<SocketAddr>().unwrap()).unwrap();
        let addr = listener.local_addr().unwrap();
        tokio::spawn(async move {
            axum::Server::from_tcp(listener)
                .unwrap()
                .serve(app().into_make_service())
                .await
                .unwrap();
        });
        let response = hyper::Client::new()
            .request(
                Request::builder()
                    .uri(format!("http://{}/", addr))
                    .body(Body::empty())
                    .unwrap(),
            )
            .await
            .unwrap();
        let body = hyper::body::to_bytes(response.into_body()).await.unwrap();
        assert_eq!(&body[..], b"Hello World!\n");
    }
}
```

One interesting trick to observe in this test is the port number specified in the `TcpListener` call: it's zero. When the port is zero, the `TcpListener` asks the kernel for the first free port. Normally, you'd want to know exactly what port to call the server on, but in this case both ends of the communication learn the port at runtime, and we want to ensure that port isn't hard-coded and inconveniently already in use by someone else.
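The port-zero trick needs nothing but the standard library; this small sketch (the function name is mine, not the book's) shows the kernel handing back a real port:

``` rust
use std::net::TcpListener;

// Binding to port 0 asks the kernel for the first free port; the
// port actually assigned is then readable from local_addr().
fn bind_ephemeral() -> (TcpListener, u16) {
    let listener = TcpListener::bind("127.0.0.1:0").expect("failed to bind");
    let port = listener.local_addr().expect("no local addr").port();
    (listener, port)
}
```

Two calls in a row yield two distinct, nonzero ports, which is exactly why parallel test runs don't collide.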
[[package]]
name = "mio"
version = "0.8.6"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "5b9d9a46eff5b4ff64b45a9e316a6d1e0bc719ef429cbec4dc630684212bfdf9"
dependencies = [
"libc",
"log",
"wasi 0.11.0+wasi-snapshot-preview1",
Implemented the forms reader, config, and database migrations.

This chapter introduces the Actix "Extractors" for retrieving form data. I've added tests to the `./tests` folder to attempt to interact with those extractors; as of this commit, a89cbe5b, they fail because the example code isn't there. What is there is a variant of the "Hello, World!" code from the previous exercises (section 3.5), which uses the Actix extractor:

``` rust
// Actix, *not* Axum. Does not work with the current framework.
fn index(form: web::Form<FormData>) -> String {
    format!("Welcome {}!", form.username)
}
```

Translated into Axum and polished a bit, it becomes:

``` rust
pub async fn index(payload: Option<Form<FormData>>) -> impl IntoResponse {
    let username = payload.map_or("World".to_string(), move |index| -> String {
        String::from(&(index.username))
    });
    (StatusCode::OK, format!("Hello, {}!\n", &username))
}
```

The Axum version is a little smarter, providing a default "World" if you don't specify a name. That's what `.map_or` does, although the `or` part actually comes first in the function. So the result is:

``` sh
$ curl http://localhost:3000/
Hello, World!
$ curl 'http://localhost:3000/?username=Spock'
Hello, Spock!
```

Which is more or less the version we want.

**Section 3.7.3** then goes into some detail on the implementation of a Trait. A Trait in Rust is like an Interface in other languages; it describes a collection of functions for manipulating the values found in a defined Type.

**Types**: A Type is just a description of the value: `u16` is a sixteen-bit unsigned integer; `char` is a Unicode character, and its size is always 32 bits. A `String`, though, is a bunch of things: a buffer of UTF-8 bytes (which are not an array of `char`!), a length for that string, and a capacity. If the String is manipulated to exceed the capacity, the buffer is re-allocated with a new capacity and the old contents copied into it.
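The `.map_or` ordering (default value first, mapping closure second) is easy to see in isolation with a plain `Option`; `greeting_name` is my own illustrative name, not code from the project:

``` rust
// map_or takes the default *first*, then the closure applied to the
// Some value: the same shape as the handler's "World" fallback.
fn greeting_name(payload: Option<&str>) -> String {
    payload.map_or("World".to_string(), |name| name.to_string())
}
```

So `greeting_name(None)` yields `"World"`, and `greeting_name(Some("Spock"))` yields `"Spock"`.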
A `Vec<String>` is an array of Strings; as a Type, it is considered to have a single value: whatever is in it at the moment you use it.

**Trait**: A Trait defines a collection of one or more functions that can operate on a value. The truly nifty thing about Traits is that they can be implemented after the fact. By importing a Trait and an implementation of that trait specific to a type into a module containing that type, you can extend the behavior of a type in a deterministic way without having to modify or inherit the code, as you would in an object-oriented language.

Axum has a valuable trait, `FromRequest`. For any structure you can imagine passing from the client to the server, you can implement `FromRequest` for that object, and any content in the body of the message will be transformed into that structure.

We've seen a trait before: `IntoResponse`, written as `impl IntoResponse`, is the output (the return type) of many of the functions that produce return values for our application server. In this case the return type instructs Rust to look in the current lexical scope and determine whether an `IntoResponse` trait has been implemented for the value returned by that function. If it has, the value can be returned, because Axum is now assured that there exists a function to convert that value into something streamable and usable as an HTTP response.

Fortunately for us, Axum has already implemented `FromRequest` for all the native data types, as well as some structures and arrays. Even better, its extractors work with the Serde serialization/deserialization library. So in this example (note the `Deserialize` derive, which the `Form` extractor requires):

``` rust
#[derive(serde::Deserialize)]
pub struct FormData {
    username: String,
}

pub async fn index(payload: Option<Form<FormData>>) -> impl IntoResponse { ...
```

A `Form` (something using the `application/x-www-form-urlencoded` protocol) of `FormData` will automatically be converted into a `payload` object of `{ username: "Spock" }`, and in this case wrapped in a `Some()` handler.
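The "implemented after the fact" point can be demonstrated with nothing but the standard library: a locally defined trait can be implemented for `String`, a type we don't own. `Shoutable` is invented for illustration; the only restriction (the orphan rule) is that either the trait or the type must be local to your crate:

``` rust
// A local trait implemented for a foreign type (String): the type's
// behavior is extended without modifying or inheriting its code.
trait Shoutable {
    fn shout(&self) -> String;
}

impl Shoutable for String {
    fn shout(&self) -> String {
        format!("{}!", self.to_uppercase())
    }
}
```

Any module that brings `Shoutable` into scope can now call `.shout()` on a `String`, which is exactly how Axum's `IntoResponse` and `FromRequest` attach behavior to types they didn't define.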
(Or `None`, if there was no form included.)

<aside>So far, there's not too much bloat in this product; with all the debugging symbols, it's 60MB or so, but stripped to the bone it's only 3.1MB, tolerable for modern deployments.</aside>

First, though, we must adjust our `valid_subscription` test:

``` rust
let body = "name=le%20guin&email=ursula_le_guin%40gmail.com";
let response = hyper::Client::new()
    .request(
        Request::builder()
            .method("POST")
            .header("content-type", "application/x-www-form-urlencoded")
            .uri(format!("http://{}/subscriptions", addr))
            .body(Body::from(body))
            .unwrap(),
    )
    .await
    .expect("Failed to execute request.");
```

Two updates from the book: first, we're sending the data via POST instead of GET. This is the correct way to do things; a GET should never (and I mean *never*) cause a change of state on the back-end. To send something new that the server will process and store, you use a POST. (To update something, or to send something to a known *and unique* URI, PUT is better.) Secondly, since we're using a generic form-data object, we need to set the content-type on the client so that the server is informed of how to unpack this payload. The '%20' and '%40' markers in the `body` are the space and the `@` respectively.

I completely ignored the advice in the book and went instead with [Dbmate](https://github.com/amacneil/dbmate). Dbmate is a bit cranky: your SQL must be very much nestled against the 'up' and 'down' markers in the migration file, and it seems to be quite opinionated about everything being lowercase.
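Those '%20' and '%40' markers come from percent-encoding: any byte outside the unreserved set is written as '%' plus two hex digits. A minimal sketch of the idea (not a complete `application/x-www-form-urlencoded` encoder, which would also map space to '+'):

``` rust
// Percent-encode bytes outside the unreserved set (letters, digits,
// '-', '_', '.', '~'); everything else becomes '%' plus two hex digits.
fn percent_encode(input: &str) -> String {
    let mut out = String::new();
    for b in input.bytes() {
        match b {
            b'A'..=b'Z' | b'a'..=b'z' | b'0'..=b'9' | b'-' | b'_' | b'.' | b'~' => {
                out.push(b as char);
            }
            _ => out.push_str(&format!("%{:02X}", b)),
        }
    }
    out
}
```

Run against the test fixture's values, `"le guin"` becomes `"le%20guin"` and `"ursula_le_guin@gmail.com"` becomes `"ursula_le_guin%40gmail.com"`, matching the `body` string above.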
That said, it was trivial to create a database with it:

``` sh
$ dbmate new create_subscriptions_table
```

This will create the file `db/migrations/20230322174957_create_subscriptions_table.sql` (the timestamp will be different, obviously), and in this file you put the following, as specified in the book:

``` sql
-- migrate:up
create table subscriptions (
    id uuid not null,
    primary key (id),
    email text not null unique,
    name text not null,
    subscribed_at timestamptz not null
);

-- migrate:down
drop table subscriptions;
```

To use Dbmate, you have to specify how to connect. I'm using Postgres, so let's start with creating a database and a user for it:

``` sh
$ sudo -u postgres psql
[sudo] password for user: ...................
postgres=# create database newsletter;
CREATE DATABASE
postgres=# create user newsletter with encrypted password 'redacted';
CREATE USER
postgres=# grant all privileges on database newsletter to newsletter;
GRANT
postgres=# exit
```

In your project root, create a `.env` file to specify your connection:

``` sh
DATABASE_URL="postgres://newsletter:redacted@127.0.0.1:5432/newsletter?sslmode=disable"
```

The `sslmode` flag there is necessary for localhost connections, as Dbmate assumes an encrypted connection by default, but we're isolating to a local connection that is, usually, safe. With the new entry in your `.env` file, you can now run a migration:

``` sh
$ dbmate up
Writing: ./db/schema.sql
```

Running `dbmate up` will automatically create the database for you if it hasn't already; `dbmate migrate` also performs migrations, but it will not create the database.
"windows-sys 0.45.0",
]
[[package]]
name = "native-tls"
version = "0.2.11"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "07226173c32f2926027b63cce4bcd8076c3552846cbe7925f3aaffeac0a3b92e"
dependencies = [
"lazy_static",
"libc",
"log",
"openssl",
"openssl-probe",
"openssl-sys",
"schannel",
"security-framework",
"security-framework-sys",
"tempfile",
]
[[package]]
name = "nom"
version = "7.1.3"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "d273983c5a657a70a3e8f2a01329822f3b8c8172b73826411a55751e404a0a4a"
dependencies = [
"memchr",
"minimal-lexical",
]
[[package]]
name = "nu-ansi-term"
version = "0.46.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "77a8165726e8236064dbb45459242600304b42a5ea24ee2948e18e023bf7ba84"
dependencies = [
"overload",
"winapi",
]
Implemented the forms reader, config, and database migrations. This chapter introduces the Actix "Extractors" for retrieving form data. I've added tests to the `./tests` folder to attempt to interact with those extractors; as of this commit, a89cbe5b, they fail because the example code isn't there. What is there is a variant of the "Hello, World!" code from the previous exercises (section 3.5), which uses the Actix extractor: ``` rust // Actix, *not* Axum. Does not work with the current framework. fn index(form: web::Form<FormData>) -> String { format!('Welcome {}!', form.username) } ``` Translated and polished into Axum, it translates to: ``` rust pub async fn index(payload: Option<Form<FormData>>) -> impl IntoResponse { let username = payload.map_or("World".to_string(), move |index| -> String { String::from(&(index.username)) }); (StatusCode::OK, format!("Hello, {}!\n", &username)) } ``` The Axum version is a little smarter, providing a default "World!" if you don't specify a name. That's what `.map_or` does, although the `or` part actually comes first in the function. So the result is: ``` sh $ curl http://localhost:3000/ Hello, World! $ curl 'http://localhost:3000/?username=Spock' Hello, Spock! ``` Which is more or less the version we want. **Section 3.7.3** then goes into some detail on the implementation of a Trait. A Trait in Rust is like an Interface in other languages; it describes a collection of functions for manipulating the values found in a defined Type. **Types**: A Type is just a description of the value: `u16` is a sixteen-bit unsigned integer; `char` is a Unicode character, and it's size is always 32bits, but a `String` is a bunch of things: it's an array of characters (which are not always `char`!), a length for that string and a capacity. If the String is manipulated to exceed the capacity, the array is re-allocated to a new capacity and the old array copied into it. 
A `Vec<String>` is an array of Strings; as a Type, it is considered to have a single value: whatever is in it at the moment you use it. **Trait**: A Trait defines a collection of one or more functions that can operate on a value. The truly nifty thing about Traits is that they can be implemented after the fact. By importing a Trait and an implementation of that trait specific to a type into a module containing that type, you can extend the behavior of a type in a deterministic way without having to modify or inherit the code, as you would in an object-oriented language. Axum has a valuable trait, `FromRequest`. For any structure you can imagine passing from the client to the server, you can implement `FromRequest` for that object and any content in the body of the message will be transformed into that structure. We've seen a trait before: `IntoResponse`, written as `impl IntoResponse`, and is the output (the return type) of many of the functions that produce return values for our application server. In this case the return type instructs Rust to look in the current lexical scope and, for the value returned by that function, determine if an `IntoResponse` trait has been defined for it. If it has, the value will be returned because Axum has now been assured that there exists a function to convert that value into something streamable and usable as an HTTP response. Fortunately for us, Axum has already implemented `FromRequest` for all the native data types, as well as some structures and arrays. Even better, it has implemented `FromRequest` for the Serde serialization/deserialization library. So in this example: ``` rust pub struct FormData { username: String, } pub async fn index(payload: Option<Form<FormData>>) -> impl IntoResponse { ... ``` A `Form` (something using the `application/x-www-form-urlencoded` protocol) of `FormData` will automatically be converted into a `payload` object of `{ username: "Spock" )`, and in this case wrapped in a `Some()` handler. 
(Or `None`, if there was no form included.) <aside>So far, there's not too much bloat in this product; with all the debugging symbols, it's 60MB or so, but stripped to the bone it's only 3.1MB, tolerable for modern deployments.</aside> First, though, we must adjust our `valid_subscription` test: ``` rust let body = "name=le%20guin&email=ursula_le_guin%40gmail.com"; let response = hyper::Client::new() .request( Request::builder() .method("POST") .header("content-type", "application/x-www-form-urlencoded") .uri(format!("http://{}/subscriptions", addr)) .body(Body::from(body)) .unwrap(), ) .await .expect("Failed to execute request."); ``` Two updates from the book: first, we're sending it via POST instead of GET. This is the correct way to do things; a GET should never (and I mean *never*) cause a change of state on the back-end. To send something new that the server will process and store, you use a POST. (To update something, or to send something to a known *and unique* URI, PUT is better.) Secondly, since we're using a generic form-data object, we need to set the content-type on the client so that the server is informed of how to unpack this payload. The '%20' and '%40' markers in the `body` are the space and the `@` respectively. I completely ignored the advice in the book and went instead with [Dbmate](https://github.com/amacneil/dbmate); Dbmate is a bit cranky; your SQL must be very much nestled against the 'up' and 'down' markers in the migration file, and it seems to be quite opinionated about everything being lowercase. 
That said, it was trivial to create a database with it: ``` sh $ dbmate new create_subscriptions_table ``` This will create the folder `db/migrations/20230322174957_create_subscriptions_table.sql`, (The timestamp will be different, obviously), and in this file you put the following, as specified in the book: ``` sql -- migrate:up create table subscriptions ( id uuid not null, primary key (id), email text not null unique, name text not null, subscribed_at timestamptz not null ); -- migrate:down drop table subscriptions; ``` To use Dbmate, you have to specify how to connect. I'm using Postgres, so let's start with creating a database and a user for it: ``` sh $ sudo -u postgres psql [sudo] possword for user: ................... postgres=# create database newsletter; CREATE DATABASE postgres=# create user newletter with encrypted password 'redacted'; CREATE USER postgres=# grant all privileges on database newsletter to newsletter; GRANT postgres=# exit ``` In your project root, create a `.env` file to specify your connection: ``` sh DATABASE_URL="postgres://newsletter:redacted@127.0.0.1:5432/newsletter?sslmode=disable" ``` The `sslmode` flag there is necessary for localhost connections, as Dbmate assumes an encrypted connection by default, but we're isolating to a local connection that is, usually, safe. With the new entry in your `.env` file, you can now run a migration: ``` sh $ dbmate up Writing: ./db/schema.sql ``` Running `dbmate up` will automatically create the database for you if it hasn't already; `dbmate migrate` also performs migrations, but it will not create the database. 
Now you can re-connect to Postgres as the newsletter user and see what you've got:

```sh
$ psql --user newsletter -h localhost --password
Password:
psql (14.7 (Ubuntu 14.7-0ubuntu0.22.04.1), server 11.7 (Ubuntu 11.7-0ubuntu0.19.10.1))

newsletter=> \d
              List of relations
 Schema |       Name        | Type  |   Owner
--------+-------------------+-------+------------
 public | schema_migrations | table | newsletter
 public | subscriptions     | table | newsletter
(2 rows)

newsletter=> \d subscriptions
                    Table "public.subscriptions"
    Column     |           Type           | Collation | Nullable | Default
---------------+--------------------------+-----------+----------+---------
 id            | uuid                     |           | not null |
 email         | text                     |           | not null |
 name          | text                     |           | not null |
 subscribed_at | timestamp with time zone |           | not null |
Indexes:
    "subscriptions_pkey" PRIMARY KEY, btree (id)
    "subscriptions_email_key" UNIQUE CONSTRAINT, btree (email)
```

Note that Dbmate has allocated a table to itself, `schema_migrations`, for tracking what it's done to your system and when. Try not to conflict with it, okay?

Every complex app has a configuration, and there are plenty of different ways the configuration can be specified: environment variables, internal defaults, and configuration files, the last of which come in many different flavors. Rust has a well-known [config](https://docs.rs/config/latest/config/index.html) crate that supports all the most common formats: YAML, JSON, TOML; you can even add your own by writing something that implements the `config::Format` trait. Add it to Cargo.toml:

```sh
$ cargo add config
```

For the time being, we're just going to create a new file, called `configuration.rs`, and put our configuration details in there. Right now we have a single configuration detail: the port. I'm going to go above and beyond Lucas here and configure some internal defaults for my code. It will have expectations.
First, you have to tell Serde that this struct can be deserialized, and that missing fields should fall back to default values:

```rust
use config::Config;
use serde::Deserialize;

#[derive(Deserialize)]
#[serde(default)]
pub struct Settings {
    pub port: u16,
}
```

Then, you have to set those default values. Fortunately, Rust provides a "set default values" trait named, sensibly enough, Default:

```rust
impl Default for Settings {
    fn default() -> Self {
        Settings { port: 3001 }
    }
}
```

Again, exceeding the book's parameters, I'm going to say that if the file is missing, the default parameters should hold:

```rust
pub(crate) fn get_configuration() -> Result<Settings, config::ConfigError> {
    Config::builder()
        .add_source(config::File::with_name("./ztd.config").required(false))
        .build()?
        .try_deserialize()
}
```

And since this is the first time I'm doing this, I'm going to write a test to assert that my understanding of how this all works is correct:

```rust
#[cfg(test)]
mod tests {
    use super::*;

    #[test]
    fn test_for_defaults() {
        let maybe_config = get_configuration();
        assert!(maybe_config.is_ok());
        let config = maybe_config.unwrap();
        assert_eq!(config.port, 3001);
    }
}
```
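The precedence rule the `config` crate applies — start from `Default`, let an explicit source override it — can be sketched with nothing but the standard library. This is an illustration of the layering idea, not the project's code; the `port_override` parameter stands in for an environment variable or config-file entry:

```rust
#[derive(Debug, PartialEq)]
pub struct Settings {
    pub port: u16,
}

impl Default for Settings {
    fn default() -> Self {
        Settings { port: 3001 }
    }
}

/// Start from the defaults; a valid override takes precedence,
/// while a missing or unparsable one leaves the default in place.
fn get_configuration(port_override: Option<&str>) -> Settings {
    let mut settings = Settings::default();
    if let Some(Ok(port)) = port_override.map(str::parse::<u16>) {
        settings.port = port;
    }
    settings
}

fn main() {
    // No override: the Default impl wins.
    assert_eq!(get_configuration(None).port, 3001);
    // A well-formed override wins over the default.
    assert_eq!(get_configuration(Some("8080")).port, 8080);
    // Garbage is ignored rather than crashing the app.
    assert_eq!(get_configuration(Some("not-a-port")).port, 3001);
}
```

The `config` crate generalizes exactly this: each `add_source` call is another layer, and later layers override earlier ones.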
2023-03-24 14:51:19 +00:00
[[package]]
name = "num-integer"
version = "0.1.45"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "225d3389fb3509a24c93f5c29eb6bde2586b98d9f016636dff58d7c6f7569cd9"
dependencies = [
"autocfg",
"num-traits",
]
[[package]]
name = "num-traits"
version = "0.2.15"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "578ede34cf02f8924ab9447f50c28075b4d3e5b269972345e7e0372b38c6cdcd"
dependencies = [
"autocfg",
]
This is straightforward and familiar territory:

```
fn app() -> Router {
    Router::new()
        .route("/", get(anon_greet))
        .route("/:name", get(greet))
        .route("/health_check", get(health_check))
}
```

We then define a function to *run* the core server:

```
pub async fn run() {
    let addr = SocketAddr::from(([127, 0, 0, 1], 3000));
    tracing::info!("listening on {}", addr);
    axum::Server::bind(&addr)
        .serve(app().into_make_service())
        .await
        .unwrap()
}
```

And finally, in a file named `src/main.rs`, we instantiate the server:

```
use ztp::run;

#[tokio::main]
async fn main() {
    run().await
}
```

To make this "work," we need to define what `ztp` means, and make a distinction between the library and the CLI program. In the project root's `Cargo.toml` file, the first three sections define these relationships:

```
[package]
name = "ztp"
version = "0.1.0"
edition = "2021"

[lib]
path = "src/lib.rs"

[[bin]]
path = "src/main.rs"
name = "ztp"
```

It is the `[package.name]` key that defines how the `use` statement in `main.rs` will find the library. The `[[bin]]` clause defines the name of the binary when it is generated.

<aside>The double brackets around the `[[bin]]` clause are there to tell the TOML parser that there can be more than one binary. There can be only one library per package, but a package can build several artifacts; each library or binary it builds is called a "crate."</aside>

This project should now be runnable. In one window, type:

```
$ cargo run
```

And in another, type and see the replies:

```
$ curl http://localhost:3000/
Hello, World!

$ curl http://localhost:3000/Jim
He's dead, Jim!

$ curl -v http://localhost:3000/health_check
> GET /health_check HTTP/1.1
> Host: localhost:3000
> User-Agent: curl/7.81.0
> Accept: */*
< HTTP/1.1 200 OK
< content-length: 0
< date: Tue, 21 Mar 2023 00:16:43 GMT
```

In the last command, the *verbose* flag shows us what we sent to the server, and what came back.
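Because `[[bin]]` is a TOML array-of-tables, a project that later grows a second executable — say, a one-off admin tool — just repeats the block. The `ztp-admin` name and `src/admin.rs` path here are invented for the example:

```toml
[[bin]]
path = "src/main.rs"
name = "ztp"

[[bin]]
path = "src/admin.rs"   # hypothetical second binary
name = "ztp-admin"
```

`cargo run --bin ztp-admin` would then select the second binary by name.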
We expected a "200 OK" status and a zero-length body, and that's what we got.

In order to unit-test a web server, we must spawn a copy of it in order to exercise its functions. We'll use Tokio's `spawn` function to create a new server, use hyper to request data from the server, and finally Rust's own native test asserts to check that we got what we expected.

```
mod tests {
    use super::*;
    use axum::{
        body::Body,
        http::{Request, StatusCode},
    };
    use std::net::{SocketAddr, TcpListener};

    #[tokio::test]
    async fn the_real_deal() {
        let listener = TcpListener::bind("127.0.0.1:0".parse::<SocketAddr>().unwrap()).unwrap();
        let addr = listener.local_addr().unwrap();

        tokio::spawn(async move {
            axum::Server::from_tcp(listener)
                .unwrap()
                .serve(app().into_make_service())
                .await
                .unwrap();
        });

        let response = hyper::Client::new()
            .request(
                Request::builder()
                    .uri(format!("http://{}/", addr))
                    .body(Body::empty())
                    .unwrap(),
            )
            .await
            .unwrap();

        let body = hyper::body::to_bytes(response.into_body()).await.unwrap();
        assert_eq!(&body[..], b"Hello World!\n");
    }
}
```

One interesting trick to observe in this test is the port number specified in the `TcpListener` call: it's zero. When the port is zero, the `TcpListener` asks the kernel for the first free port. Normally, you'd want to know exactly what port to call the server on, but in this case both ends of the communication are aware of the port to use, and we want to ensure that the port isn't hard-coded and inconveniently already in use by someone else.
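The port-zero trick is plain `std::net` behavior, independent of Axum or Tokio; a minimal sketch:

```rust
use std::net::TcpListener;

fn main() {
    // Binding to port 0 asks the kernel for the first free port;
    // local_addr() then reports which one we actually got.
    let listener = TcpListener::bind("127.0.0.1:0").unwrap();
    let port = listener.local_addr().unwrap().port();
    assert_ne!(port, 0);

    // While the first listener is alive, a second bind gets a
    // *different* free port, so two test servers never collide.
    let listener2 = TcpListener::bind("127.0.0.1:0").unwrap();
    assert_ne!(listener2.local_addr().unwrap().port(), port);
}
```

This is why the test can spawn any number of servers in parallel without a port-allocation scheme of its own.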
2023-03-21 00:31:39 +00:00
[[package]]
name = "num_cpus"
version = "1.15.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "0fac9e2da13b5eb447a6ce3d392f23a29d8694bff781bf03a16cd9ac8697593b"
dependencies = [
Implemented the forms reader, config, and database migrations.

This chapter introduces the Actix "Extractors" for retrieving form data. I've added tests to the `./tests` folder to attempt to interact with those extractors; as of this commit, a89cbe5b, they fail because the example code isn't there. What is there is a variant of the "Hello, World!" code from the previous exercises (section 3.5), which uses the Actix extractor:

```rust
// Actix, *not* Axum. Does not work with the current framework.
fn index(form: web::Form<FormData>) -> String {
    format!("Welcome {}!", form.username)
}
```

Translated and polished into Axum, it becomes:

```rust
pub async fn index(payload: Option<Form<FormData>>) -> impl IntoResponse {
    let username = payload.map_or("World".to_string(), move |index| -> String {
        String::from(&(index.username))
    });
    (StatusCode::OK, format!("Hello, {}!\n", &username))
}
```

The Axum version is a little smarter, providing a default "World" if you don't specify a name. That's what `.map_or` does, although the `or` part actually comes first in the argument list. So the result is:

```sh
$ curl http://localhost:3000/
Hello, World!

$ curl 'http://localhost:3000/?username=Spock'
Hello, Spock!
```

Which is more or less the version we want.

**Section 3.7.3** then goes into some detail on the implementation of a Trait. A Trait in Rust is like an Interface in other languages; it describes a collection of functions for manipulating the values found in a defined Type.

**Types**: A Type is just a description of the value: `u16` is a sixteen-bit unsigned integer; `char` is a Unicode scalar value, and its size is always 32 bits; but a `String` is a bunch of things: a buffer of UTF-8 bytes (not an array of `char`s!), a length for that string, and a capacity. If the String is manipulated to exceed the capacity, the buffer is re-allocated with a larger capacity and the old contents copied into it.
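`map_or`'s argument order is the part that trips people up: the fallback value comes first, the mapping closure second. A standalone sketch of the same default-name logic, with no web framework involved:

```rust
// The same "default to World" logic as the handler, reduced to Option.
fn greet(username: Option<&str>) -> String {
    // map_or takes the default *first* and the mapping closure second:
    // None  -> the default ("World")
    // Some  -> the closure's result
    let name = username.map_or("World".to_string(), |n| n.to_string());
    format!("Hello, {}!", name)
}

fn main() {
    assert_eq!(greet(None), "Hello, World!");
    assert_eq!(greet(Some("Spock")), "Hello, Spock!");
}
```

`unwrap_or` / `unwrap_or_else` follow the same "fallback first" convention, so this ordering is consistent across `Option`'s API.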
A `Vec<String>` is an array of Strings; as a Type, it is considered to have a single value: whatever is in it at the moment you use it.

**Trait**: A Trait defines a collection of one or more functions that can operate on a value. The truly nifty thing about Traits is that they can be implemented after the fact. By importing a Trait and an implementation of that trait specific to a type into a module containing that type, you can extend the behavior of the type in a deterministic way without having to modify or inherit the code, as you would in an object-oriented language.

Axum has a valuable trait, `FromRequest`. For any structure you can imagine passing from the client to the server, you can implement `FromRequest` for that object, and any content in the body of the message will be transformed into that structure.

We've seen a trait before: `IntoResponse`, written as `impl IntoResponse`, which is the return type of many of the functions that produce responses for our application server. In this case the return type instructs Rust to look in the current lexical scope and, for the value returned by that function, determine whether an `IntoResponse` implementation has been defined for it. If it has, the value can be returned, because Axum has now been assured that there exists a function to convert that value into something streamable and usable as an HTTP response.

Fortunately for us, Axum has already implemented `FromRequest` for all the native data types, as well as some structures and arrays. Even better, its `Form` extractor works with anything the Serde serialization/deserialization library can deserialize. So in this example:

```rust
#[derive(serde::Deserialize)]
pub struct FormData {
    username: String,
}

pub async fn index(payload: Option<Form<FormData>>) -> impl IntoResponse { ...
```

A `Form` (something using the `application/x-www-form-urlencoded` protocol) of `FormData` will automatically be converted into a `payload` object of `{ username: "Spock" }`, in this case wrapped in `Some()`.
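The "implemented after the fact" property is easy to demonstrate without Axum. Here's a toy extension trait bolted onto the standard `str` type (the `Shout` name is invented for this example):

```rust
// A toy trait: one function that can operate on a value.
trait Shout {
    fn shout(&self) -> String;
}

// Implemented *after the fact* for a type we don't own: str.
// No modification or inheritance of str is needed, and the
// method is only visible in scopes where the trait is imported.
impl Shout for str {
    fn shout(&self) -> String {
        format!("{}!", self.to_uppercase())
    }
}

fn main() {
    assert_eq!("he's dead, jim".shout(), "HE'S DEAD, JIM!");
}
```

Axum's `FromRequest` and `IntoResponse` work the same way, just at a larger scale: implementations for existing types, picked up wherever the traits are in scope.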
"hermit-abi 0.2.6",
"libc",
]
[[package]]
name = "once_cell"
version = "1.17.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "b7e5500299e16ebb147ae15a00a942af264cf3688f47923b8fc2cd5858f23ad3"
Implemented the forms reader, config, and database migrations. This chapter introduces the Actix "Extractors" for retrieving form data. I've added tests to the `./tests` folder to attempt to interact with those extractors; as of this commit, a89cbe5b, they fail because the example code isn't there. What is there is a variant of the "Hello, World!" code from the previous exercises (section 3.5), which uses the Actix extractor: ``` rust // Actix, *not* Axum. Does not work with the current framework. fn index(form: web::Form<FormData>) -> String { format!('Welcome {}!', form.username) } ``` Translated and polished into Axum, it translates to: ``` rust pub async fn index(payload: Option<Form<FormData>>) -> impl IntoResponse { let username = payload.map_or("World".to_string(), move |index| -> String { String::from(&(index.username)) }); (StatusCode::OK, format!("Hello, {}!\n", &username)) } ``` The Axum version is a little smarter, providing a default "World!" if you don't specify a name. That's what `.map_or` does, although the `or` part actually comes first in the function. So the result is: ``` sh $ curl http://localhost:3000/ Hello, World! $ curl 'http://localhost:3000/?username=Spock' Hello, Spock! ``` Which is more or less the version we want. **Section 3.7.3** then goes into some detail on the implementation of a Trait. A Trait in Rust is like an Interface in other languages; it describes a collection of functions for manipulating the values found in a defined Type. **Types**: A Type is just a description of the value: `u16` is a sixteen-bit unsigned integer; `char` is a Unicode character, and it's size is always 32bits, but a `String` is a bunch of things: it's an array of characters (which are not always `char`!), a length for that string and a capacity. If the String is manipulated to exceed the capacity, the array is re-allocated to a new capacity and the old array copied into it. 
A `Vec<String>` is an array of Strings; as a Type, it is considered to have a single value: whatever is in it at the moment you use it. **Trait**: A Trait defines a collection of one or more functions that can operate on a value. The truly nifty thing about Traits is that they can be implemented after the fact. By importing a Trait and an implementation of that trait specific to a type into a module containing that type, you can extend the behavior of a type in a deterministic way without having to modify or inherit the code, as you would in an object-oriented language. Axum has a valuable trait, `FromRequest`. For any structure you can imagine passing from the client to the server, you can implement `FromRequest` for that object and any content in the body of the message will be transformed into that structure. We've seen a trait before: `IntoResponse`, written as `impl IntoResponse`, and is the output (the return type) of many of the functions that produce return values for our application server. In this case the return type instructs Rust to look in the current lexical scope and, for the value returned by that function, determine if an `IntoResponse` trait has been defined for it. If it has, the value will be returned because Axum has now been assured that there exists a function to convert that value into something streamable and usable as an HTTP response. Fortunately for us, Axum has already implemented `FromRequest` for all the native data types, as well as some structures and arrays. Even better, it has implemented `FromRequest` for the Serde serialization/deserialization library. So in this example: ``` rust pub struct FormData { username: String, } pub async fn index(payload: Option<Form<FormData>>) -> impl IntoResponse { ... ``` A `Form` (something using the `application/x-www-form-urlencoded` protocol) of `FormData` will automatically be converted into a `payload` object of `{ username: "Spock" )`, and in this case wrapped in a `Some()` handler. 
(Or `None`, if there was no form included.)

<aside>So far, there's not too much bloat in this product; with all the debugging symbols it's 60MB or so, but stripped to the bone it's only 3.1MB, tolerable for modern deployments.</aside>

First, though, we must adjust our `valid_subscription` test:

``` rust
let body = "name=le%20guin&email=ursula_le_guin%40gmail.com";
let response = hyper::Client::new()
    .request(
        Request::builder()
            .method("POST")
            .header("content-type", "application/x-www-form-urlencoded")
            .uri(format!("http://{}/subscriptions", addr))
            .body(Body::from(body))
            .unwrap(),
    )
    .await
    .expect("Failed to execute request.");
```

Two updates from the book: first, we're sending the data via POST instead of GET. This is the correct way to do things; a GET should never (and I mean *never*) cause a change of state on the back-end. To send something new that the server will process and store, you use a POST. (To update something, or to send something to a known *and unique* URI, PUT is better.) Secondly, since we're using a generic form-data object, we need to set the content-type on the client so that the server knows how to unpack this payload. The `%20` and `%40` escapes in the `body` are the space and the `@`, respectively.

I completely ignored the advice in the book and went instead with [Dbmate](https://github.com/amacneil/dbmate). Dbmate is a bit cranky: your SQL must be nestled right up against the 'up' and 'down' markers in the migration file, and it seems to be quite opinionated about everything being lowercase.
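The `%20`/`%40` escapes mentioned above decode mechanically. A toy sketch covering only those two sequences (a real decoder handles every `%XX` escape and `+`):

```rust
// Toy decoder for the two escapes used in the test body only;
// illustration, not a general application/x-www-form-urlencoded parser.
fn decode(s: &str) -> String {
    s.replace("%20", " ").replace("%40", "@")
}

fn main() {
    assert_eq!(decode("le%20guin"), "le guin");
    assert_eq!(decode("ursula_le_guin%40gmail.com"), "ursula_le_guin@gmail.com");
    println!("ok");
}
```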
That said, it was trivial to create a database with it:

``` sh
$ dbmate new create_subscriptions_table
```

This creates the file `db/migrations/20230322174957_create_subscriptions_table.sql` (the timestamp will be different, obviously), and in this file you put the following, as specified in the book:

``` sql
-- migrate:up
create table subscriptions (
    id uuid not null,
    primary key (id),
    email text not null unique,
    name text not null,
    subscribed_at timestamptz not null
);

-- migrate:down
drop table subscriptions;
```

To use Dbmate, you have to specify how to connect. I'm using Postgres, so let's start with creating a database and a user for it:

``` sh
$ sudo -u postgres psql
[sudo] password for user: ...................
postgres=# create database newsletter;
CREATE DATABASE
postgres=# create user newsletter with encrypted password 'redacted';
CREATE USER
postgres=# grant all privileges on database newsletter to newsletter;
GRANT
postgres=# exit
```

In your project root, create a `.env` file to specify your connection:

``` sh
DATABASE_URL="postgres://newsletter:redacted@127.0.0.1:5432/newsletter?sslmode=disable"
```

The `sslmode` flag is necessary for localhost connections, as Dbmate assumes an encrypted connection by default, but we're isolating to a local connection that is, usually, safe. With the new entry in your `.env` file, you can now run a migration:

``` sh
$ dbmate up
Writing: ./db/schema.sql
```

Running `dbmate up` will automatically create the database for you if it doesn't already exist; `dbmate migrate` also performs migrations, but it will not create the database.
Now you can re-connect to Postgres as the newsletter user and see what you've got:

``` sh
$ psql --user newsletter -h localhost --password
Password:
psql (14.7 (Ubuntu 14.7-0ubuntu0.22.04.1), server 11.7 (Ubuntu 11.7-0ubuntu0.19.10.1))

newsletter=> \d
              List of relations
 Schema |       Name        | Type  |   Owner
--------+-------------------+-------+------------
 public | schema_migrations | table | newsletter
 public | subscriptions     | table | newsletter
(2 rows)

newsletter=> \d subscriptions
                     Table "public.subscriptions"
    Column     |           Type           | Collation | Nullable | Default
---------------+--------------------------+-----------+----------+---------
 id            | uuid                     |           | not null |
 email         | text                     |           | not null |
 name          | text                     |           | not null |
 subscribed_at | timestamp with time zone |           | not null |
Indexes:
    "subscriptions_pkey" PRIMARY KEY, btree (id)
    "subscriptions_email_key" UNIQUE CONSTRAINT, btree (email)
```

Note that Dbmate has allocated a table to itself, `schema_migrations`, for tracking what it's done to your system and when. Try not to conflict with it, okay?

Every complex app has a configuration, and there are plenty of ways it can be specified: environment variables, internal defaults, and configuration files, the last of which come in many different flavors. Rust has a well-known [config](https://docs.rs/config/latest/config/index.html) crate that supports all the most common formats: YAML, JSON, TOML; you can even add your own by writing something that implements the `config::Format` trait. Add it to Cargo.toml:

``` sh
$ cargo add config
```

For the meantime, we're just going to create a new file, called `configuration.rs`, and put our configuration details in there. Right now we have a single configuration detail: the port. I'm going to go above and beyond Lucas here and configure some internal defaults for my code. It will have expectations.
First, you have to tell Serde that there are default values. (The struct needs to derive `Deserialize`, and the container-level `#[serde(default)]` attribute is what makes missing fields fall back to `Default`.)

``` rust
use config::Config;
use serde::Deserialize;

#[derive(Deserialize)]
#[serde(default)]
pub struct Settings {
    pub port: u16,
}
```

Then you have to set those default values. Fortunately, Rust provides a "set default values" trait named, sensibly enough, Default:

``` rust
impl Default for Settings {
    fn default() -> Self {
        Settings { port: 3001 }
    }
}
```

Again exceeding the book's parameters, I'm going to say that if the file is missing, the default parameters should hold:

``` rust
pub(crate) fn get_configuration() -> Result<Settings, config::ConfigError> {
    Config::builder()
        .add_source(config::File::with_name("./ztd.config").required(false))
        .build()?
        .try_deserialize()
}
```

And since this is the first time I'm doing this, I'm going to write a test to assert that my understanding of how this all works is correct:

``` rust
#[cfg(test)]
mod tests {
    use super::*;

    #[test]
    fn test_for_defaults() {
        let maybe_config = get_configuration();
        assert!(maybe_config.is_ok());
        let config = maybe_config.unwrap();
        assert_eq!(config.port, 3001);
    }
}
```
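For completeness, an override file would look something like this. A sketch under assumptions: the base name `ztd.config` matches the code above, and I'm assuming the YAML flavor (the config crate also accepts TOML, JSON, and others for the same base name):

``` yaml
# ztd.config.yaml -- hypothetical override of the built-in default of 3001
port: 3000
```

With this file present, `get_configuration()` should report port 3000; delete it, and the `Default` implementation's 3001 holds.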
2023-03-24 14:51:19 +00:00
[[package]]
name = "openssl"
version = "0.10.47"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "d8b277f87dacc05a6b709965d1cbafac4649d6ce9f3ce9ceb88508b5666dfec9"
dependencies = [
"bitflags",
"cfg-if",
"foreign-types",
"libc",
"once_cell",
"openssl-macros",
"openssl-sys",
]

[[package]]
name = "openssl-macros"
version = "0.1.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "b501e44f11665960c7e7fcf062c7d96a14ade4aa98116c004b2e37b5be7d736c"
dependencies = [
"proc-macro2",
"quote",
"syn 1.0.109",
]

[[package]]
name = "openssl-probe"
version = "0.1.5"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "ff011a302c396a5197692431fc1948019154afc178baf7d8e37367442a4601cf"

[[package]]
name = "openssl-sys"
version = "0.9.82"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "a95792af3c4e0153c3914df2261bedd30a98476f94dc892b67dfe1d89d433a04"
dependencies = [
"autocfg",
"cc",
"libc",
"pkg-config",
"vcpkg",
]

[[package]]
name = "ordered-multimap"
version = "0.4.3"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "ccd746e37177e1711c20dd619a1620f34f5c8b569c53590a72dedd5344d8924a"
dependencies = [
"dlv-list",
"hashbrown",
]
This is straightforward and familiar territory:

```
fn app() -> Router {
    Router::new()
        .route("/", get(anon_greet))
        .route("/:name", get(greet))
        .route("/health_check", get(health_check))
}
```

We then define a function to *run* the core server:

```
pub async fn run() {
    let addr = SocketAddr::from(([127, 0, 0, 1], 3000));
    tracing::info!("listening on {}", addr);
    axum::Server::bind(&addr)
        .serve(app().into_make_service())
        .await
        .unwrap()
}
```

And finally, in a file named `src/main.rs`, we instantiate the server (note the `#[tokio::main]` attribute; without it, an `async fn main` won't compile):

```
use ztp::run;

#[tokio::main]
async fn main() {
    run().await
}
```

To make this "work," we need to define what `ztp` means, and make a distinction between the library and the CLI program. In the project root's `Cargo.toml` file, the first three sections are needed to define these relationships:

```
[package]
name = "ztp"
version = "0.1.0"
edition = "2021"

[lib]
path = "src/lib.rs"

[[bin]]
path = "src/main.rs"
name = "ztp"
```

It is the `[package.name]` field that defines how the `use` statement in `main.rs` will find the library. The `[[bin]]` clause defines the name of the binary when it is generated.

<aside>The double brackets around the `[[bin]]` clause tell the TOML parser that there can be more than one binary. There can be only one library crate per package, but a package can contain several crates: at most one library, plus any number of binaries.</aside>

This project should now be runnable. In one window, type:

```
$ cargo run
```

And in another, type and see the replies:

```
$ curl http://localhost:3000/
Hello, World!

$ curl http://localhost:3000/Jim
He's dead, Jim!

$ curl -v http://localhost:3000/health_check
> GET /health_check HTTP/1.1
> Host: localhost:3000
> User-Agent: curl/7.81.0
> Accept: */*
< HTTP/1.1 200 OK
< content-length: 0
< date: Tue, 21 Mar 2023 00:16:43 GMT
```

In the last command, the *verbose* flag shows us what we sent to the server and what came back.
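To make the double-bracket point concrete, a hypothetical manifest fragment with a second binary (the `ztp-admin` name and path are invented for illustration):

``` toml
[[bin]]
name = "ztp"
path = "src/main.rs"

# A second, hypothetical binary target; `cargo run --bin ztp-admin` would build and run it.
[[bin]]
name = "ztp-admin"
path = "src/bin/admin.rs"
```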
We expected a "200 OK" status and a zero-length body, and that's what we got.

In order to unit-test a web server, we must spawn a copy of it to exercise its functions. We'll use Tokio's `spawn` function to create a new server, use hyper to request data from the server, and finally use Rust's own native test asserts to check that we got what we expected.

```
mod tests {
    use super::*;
    use axum::{
        body::Body,
        http::{Request, StatusCode},
    };
    use std::net::{SocketAddr, TcpListener};

    #[tokio::test]
    async fn the_real_deal() {
        let listener = TcpListener::bind("127.0.0.1:0".parse::<SocketAddr>().unwrap()).unwrap();
        let addr = listener.local_addr().unwrap();
        tokio::spawn(async move {
            axum::Server::from_tcp(listener)
                .unwrap()
                .serve(app().into_make_service())
                .await
                .unwrap();
        });
        let response = hyper::Client::new()
            .request(
                Request::builder()
                    .uri(format!("http://{}/", addr))
                    .body(Body::empty())
                    .unwrap(),
            )
            .await
            .unwrap();
        let body = hyper::body::to_bytes(response.into_body()).await.unwrap();
        assert_eq!(&body[..], b"Hello World!\n");
    }
}
```

One interesting trick to observe in this test is the port number specified in the `TcpListener` call: it's zero. When the port is zero, `TcpListener` asks the kernel for the first free port. Normally you'd want to know exactly what port to call the server on, but in this case both ends of the communication learn the port at runtime, and we want to ensure the port isn't hard-coded and inconveniently already in use by someone else.
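The port-zero trick is plain `std`, nothing Axum-specific. A minimal sketch (the `ephemeral_port` helper is hypothetical):

```rust
use std::net::TcpListener;

// Bind to port 0: the kernel picks any currently free port,
// and local_addr() reveals which one was actually assigned.
fn ephemeral_port() -> u16 {
    let listener = TcpListener::bind("127.0.0.1:0").unwrap();
    listener.local_addr().unwrap().port()
}

fn main() {
    println!("kernel assigned port {}", ephemeral_port());
}
```

Binding twice yields two different live ports, which is exactly why parallel test runs don't collide.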
2023-03-21 00:31:39 +00:00

[[package]]
name = "overload"
version = "0.1.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "b15813163c1d831bf4a13c3610c05c0d03b39feb07f7e09fa234dac9b15aaf39"

[[package]]
name = "parking_lot"
version = "0.11.2"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "7d17b78036a60663b797adeaee46f5c9dfebb86948d1255007a1d6be0271ff99"
dependencies = [
"instant",
"lock_api",
"parking_lot_core 0.8.6",
]
[[package]]
name = "parking_lot"
version = "0.12.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "3742b2c103b9f06bc9fff0a37ff4912935851bee6d36f3c02bcc755bcfec228f"
dependencies = [
"lock_api",
Implemented the forms reader, config, and database migrations.

This chapter introduces the Actix "Extractors" for retrieving form data. I've added tests to the `./tests` folder to interact with those extractors; as of this commit, a89cbe5b, they fail because the example code isn't there yet. What is there is a variant of the "Hello, World!" code from the previous exercises (section 3.5), which uses the Actix extractor:

``` rust
// Actix, *not* Axum. Does not work with the current framework.
fn index(form: web::Form<FormData>) -> String {
    format!("Welcome {}!", form.username)
}
```

Translated into Axum, it becomes:

``` rust
pub async fn index(payload: Option<Form<FormData>>) -> impl IntoResponse {
    let username = payload.map_or("World".to_string(), move |index| -> String {
        String::from(&(index.username))
    });
    (StatusCode::OK, format!("Hello, {}!\n", &username))
}
```

The Axum version is a little smarter, providing a default "World" if you don't specify a name. That's what `.map_or` does, although the `or` part (the default) actually comes first in the argument list. So the result is:

``` sh
$ curl http://localhost:3000/
Hello, World!
$ curl 'http://localhost:3000/?username=Spock'
Hello, Spock!
```

Which is more or less the version we want.

**Section 3.7.3** then goes into some detail on the implementation of a Trait. A Trait in Rust is like an Interface in other languages: it describes a collection of functions for manipulating the values of a defined Type.

**Types**: A Type is just a description of a value: `u16` is a sixteen-bit unsigned integer; `char` is a Unicode scalar value whose size is always 32 bits. A `String`, though, is a bundle of things: a buffer of UTF-8 bytes (which do not map one-to-one onto `char`s!), a length, and a capacity. If the String is manipulated to exceed the capacity, a larger buffer is allocated and the old contents are copied into it.
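The `map_or` ordering trips people up, so here is a minimal, self-contained sketch of it (the `greet` closure is just an illustration, not code from the project):

``` rust
fn main() {
    // map_or takes the default (the "or" value) FIRST, then the mapping closure.
    let greet = |n: Option<&str>| n.map_or("World".to_string(), |v| v.to_string());

    assert_eq!(greet(Some("Spock")), "Spock"); // Some: closure applied
    assert_eq!(greet(None), "World");          // None: default used
}
```

Reading it aloud helps: "map, or else use this default," with the default written before the mapping function.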
A `Vec<String>` is an array of Strings; as a Type, it is considered to have a single value: whatever it contains at the moment you use it.

**Trait**: A Trait defines a collection of one or more functions that can operate on a value. The truly nifty thing about Traits is that they can be implemented after the fact. By importing a Trait and an implementation of that Trait for a specific type into a module, you can extend the behavior of the type in a deterministic way without having to modify or inherit its code, as you would in an object-oriented language.

Axum has a valuable trait, `FromRequest`. For any structure you can imagine passing from the client to the server, you can implement `FromRequest` for that structure, and any content in the body of the message will be transformed into it.

We've seen a trait before: `IntoResponse`, written as `impl IntoResponse`, is the return type of many of the functions that produce responses for our application server. That return type instructs Rust to look in the current lexical scope and determine whether an `IntoResponse` implementation exists for the returned value. If it does, the value can be returned, because Axum is now assured that there exists a function to convert that value into something streamable and usable as an HTTP response.

Fortunately for us, Axum has already implemented `FromRequest` for the native data types, as well as for some structures and arrays. Even better, it integrates with the Serde serialization/deserialization library. So in this example (note that `FormData` must derive `Deserialize` for Serde to unpack it):

``` rust
#[derive(serde::Deserialize)]
pub struct FormData {
    username: String,
}

pub async fn index(payload: Option<Form<FormData>>) -> impl IntoResponse { ...
```

A `Form` (a body using the `application/x-www-form-urlencoded` encoding) of `FormData` will automatically be converted into a `payload` value of `FormData { username: "Spock" }`, and in this case wrapped in `Some()`.
(Or `None`, if no form was included.)

<aside>So far, there's not too much bloat in this product; with all the debugging symbols it's 60MB or so, but stripped to the bone it's only 3.1MB, tolerable for modern deployments.</aside>

First, though, we must adjust our `valid_subscription` test:

``` rust
let body = "name=le%20guin&email=ursula_le_guin%40gmail.com";
let response = hyper::Client::new()
    .request(
        Request::builder()
            .method("POST")
            .header("content-type", "application/x-www-form-urlencoded")
            .uri(format!("http://{}/subscriptions", addr))
            .body(Body::from(body))
            .unwrap(),
    )
    .await
    .expect("Failed to execute request.");
```

Two updates from the book: first, we're sending via POST instead of GET. This is the correct way to do things; a GET should never (and I mean *never*) cause a change of state on the back-end. To send something new for the server to process and store, use POST. (To update something, or to send something to a known *and unique* URI, PUT is better.) Secondly, since we're sending a generic form-data payload, we need to set the content-type header on the client so the server knows how to unpack it. The `%20` and `%40` escapes in the `body` are the space and the `@`, respectively.

I completely ignored the advice in the book and went with [Dbmate](https://github.com/amacneil/dbmate) instead. Dbmate is a bit cranky: your SQL must be nestled right up against the `up` and `down` markers in the migration file, and it seems quite opinionated about everything being lowercase.
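To see what the server receives after those escapes are decoded, here is a toy sketch that handles only the two escapes used in this test body (a real decoder handles every `%XX` sequence; this is not one):

``` rust
fn main() {
    let body = "name=le%20guin&email=ursula_le_guin%40gmail.com";

    // %20 is an encoded space, %40 an encoded '@'.
    let decoded = body.replace("%20", " ").replace("%40", "@");

    assert_eq!(decoded, "name=le guin&email=ursula_le_guin@gmail.com");
    println!("{}", decoded);
}
```

In practice the `Form` extractor does this decoding for us, which is why the handler sees `le guin` and a normal email address.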
That said, it was trivial to create a database with it:

``` sh
$ dbmate new create_subscriptions_table
```

This creates the file `db/migrations/20230322174957_create_subscriptions_table.sql` (the timestamp will be different, obviously), and in this file you put the following, as specified in the book:

``` sql
-- migrate:up
create table subscriptions (
    id uuid not null,
    primary key (id),
    email text not null unique,
    name text not null,
    subscribed_at timestamptz not null
);

-- migrate:down
drop table subscriptions;
```

To use Dbmate, you have to specify how to connect. I'm using Postgres, so let's start by creating a database and a user for it:

``` sh
$ sudo -u postgres psql
[sudo] password for user: ...................
postgres=# create database newsletter;
CREATE DATABASE
postgres=# create user newsletter with encrypted password 'redacted';
CREATE USER
postgres=# grant all privileges on database newsletter to newsletter;
GRANT
postgres=# exit
```

In your project root, create a `.env` file to specify your connection:

``` sh
DATABASE_URL="postgres://newsletter:redacted@127.0.0.1:5432/newsletter?sslmode=disable"
```

The `sslmode` flag is necessary for localhost connections: Dbmate assumes an encrypted connection by default, but we're isolating to a local connection that is, usually, safe.

With the new entry in your `.env` file, you can now run a migration:

``` sh
$ dbmate up
Writing: ./db/schema.sql
```

Running `dbmate up` will automatically create the database for you if it doesn't already exist; `dbmate migrate` also performs migrations, but it will not create the database.
Now you can re-connect to Postgres as the newsletter user and see what you've got:

``` sh
$ psql --user newsletter -h localhost --password
Password:
psql (14.7 (Ubuntu 14.7-0ubuntu0.22.04.1), server 11.7 (Ubuntu 11.7-0ubuntu0.19.10.1))

newsletter=> \d
              List of relations
 Schema |       Name        | Type  |   Owner
--------+-------------------+-------+------------
 public | schema_migrations | table | newsletter
 public | subscriptions     | table | newsletter
(2 rows)

newsletter=> \d subscriptions
                     Table "public.subscriptions"
    Column     |           Type           | Collation | Nullable | Default
---------------+--------------------------+-----------+----------+---------
 id            | uuid                     |           | not null |
 email         | text                     |           | not null |
 name          | text                     |           | not null |
 subscribed_at | timestamp with time zone |           | not null |
Indexes:
    "subscriptions_pkey" PRIMARY KEY, btree (id)
    "subscriptions_email_key" UNIQUE CONSTRAINT, btree (email)
```

Note that Dbmate has allocated a table to itself, `schema_migrations`, for tracking what it's done to your system and when. Try not to conflict with it, okay?

Every complex app has a configuration, and there are plenty of different ways the configuration can be specified: environment variables, internal defaults, and configuration files, the last of which come in many different flavors. Rust has a well-known [config](https://docs.rs/config/latest/config/index.html) crate that supports all the most common formats (YAML, JSON, TOML); you can even add your own by writing something that implements the `config::Format` trait. Add it to `Cargo.toml`:

``` sh
$ cargo add config
```

For the meantime, we're just going to create a new file, called `configuration.rs`, and put our configuration details in there. Right now we have a single configuration detail: the port. I'm going to go above and beyond Lucas here and configure some internal defaults for my code. It will have expectations.
First, you have to tell Serde that there will be default values; the `#[serde(default)]` attribute tells it to fall back to the `Default` implementation for any missing field:

``` rust
use config::Config;

#[derive(serde::Deserialize)]
#[serde(default)]
pub struct Settings {
    pub port: u16,
}
```

Then you have to set those default values. Fortunately, Rust provides a "set default values" trait named, sensibly enough, `Default`:

``` rust
impl Default for Settings {
    fn default() -> Self {
        Settings { port: 3001 }
    }
}
```

Again exceeding the book's parameters, I'm going to say that if the file is missing, the default parameters should hold:

``` rust
pub(crate) fn get_configuration() -> Result<Settings, config::ConfigError> {
    Config::builder()
        .add_source(config::File::with_name("./ztd.config").required(false))
        .build()?
        .try_deserialize()
}
```

And since this is the first time I'm doing this, I'm going to write a test to assert that my understanding of how this all works is correct:

``` rust
mod tests {
    use super::*;

    #[test]
    fn test_for_defaults() {
        let maybe_config = get_configuration();
        assert!(maybe_config.is_ok());
        let config = maybe_config.unwrap();
        assert_eq!(config.port, 3001);
    }
}
```
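The `Default` mechanism itself can be exercised without the config crate at all. A minimal sketch, using only the standard library (the `Settings` struct here mirrors the one above but stands alone):

``` rust
#[derive(Debug, PartialEq)]
struct Settings {
    port: u16,
}

// The Default trait supplies the fallback values.
impl Default for Settings {
    fn default() -> Self {
        Settings { port: 3001 }
    }
}

fn main() {
    // unwrap_or_default() falls back to Default::default() when nothing is present,
    // just as the config loader falls back when the file is missing.
    let loaded: Option<Settings> = None;
    let settings = loaded.unwrap_or_default();
    assert_eq!(settings, Settings { port: 3001 });
}
```

The same `Default` implementation serves both paths: Serde consults it for missing fields, and plain Rust code can lean on it via `unwrap_or_default()`.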
2023-03-24 14:51:19 +00:00
"parking_lot_core 0.9.7",
]
[[package]]
name = "parking_lot_core"
version = "0.8.6"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "60a2cfe6f0ad2bfc16aefa463b497d5c7a5ecd44a23efa72aa342d90177356dc"
dependencies = [
"cfg-if",
"instant",
"libc",
"redox_syscall",
"smallvec",
"winapi",
]
[[package]]
name = "parking_lot_core"
version = "0.9.7"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "9069cbb9f99e3a5083476ccb29ceb1de18b9118cafa53e90c9551235de2b9521"
dependencies = [
"cfg-if",
"libc",
"redox_syscall",
"smallvec",
"windows-sys 0.45.0",
Zero-to-Production Rust, up to Chapter 3.7. Since this book is about learning Rust, primarily in a microservices environment, this chapter focuses on installing Rust and describing the tools available to the developer. The easiest way to install Rust is to install the [Rustup](https://rustup.rs/) tool. It is one of those blind-trust-in-the-safety-of-the-toolchain things. For Linux and Mac users, the command is a shell script that installs to a user's local account: ``` $ curl --proto '=https' --tlsv1.2 -sSf https://sh.rustup.rs | sh ``` Once installed, you can install Rust itself: ``` $ rustup install toolchain stable ``` You should now have Rust compiler and the Rust build and packaging tool, known as Cargo: ``` $ rustc --version rustc 1.68.0 (2c8cc3432 2023-03-06) $ cargo --version cargo 1.68.0 (115f34552 2023-02-26) ``` I also installed the following tools: ``` $ rustup component add clippy rust-src rust-docs $ cargo install rustfmt rust-analyzer ``` - clippy: A powerful linter that provides useful advice above and beyond the compiler's basic error checking. - rustfmt: A formatting tool that provides a common format for most developers - rust-analyzer: For your IDE, rust-analyzer provides the LSP (Language Server Protocol) for Rust, giving you code completion, on-the-fly error definition, and other luxuries. Zero-to-Production's project is writing a web service that signs people up for an email newsletter. The first task in the book is to set up a "Hello World!" application server. The book uses the [Actix-web](https://actix.rs/) web framework, but I've chosen to implement it using [Axum](https://github.com/tokio-rs/axum) server, the default server provided by the [Tokio](https://github.com/tokio-rs/tokio) asynchronous runtime. Although the book is only two years old, it is already out-of-date with respect to some commands. `cargo add` is now provided by default. 
The following commands installed the tools I'll be using: ``` cargo add --features tokio/full --features hyper/full tokio hyper \ axum tower tracing tracing-subscriber ``` - axum: The web server framework for Tokio. - tokio: The Rust asynchronous runtime. Has single-threaded (select) and multi-threaded variants. - [hyper](https://hyper.rs/): An HTTPS request/response library, used for testing. - [tracing](https://crates.io/crates/tracing): A debugging library that works with Tokio. We start by defining the core services. In the book, they're a greeter ("Hello, World"), a greeter with a parameter ("Hello, {name}"), and a health check (returns a HTTP 200 Code, but no body). Actix-web hands a generic Request and expects a generic request, but Axum is more straightforward, providing `IntoResponse` handlers for most of the basic Rust types, as well as some for formats via Serde, Rust's standard serializing/deserializing library for converting data from one format to another. All of these go into `src/lib.rs`: ``` async fn health_check() -> impl IntoResponse { (StatusCode::OK, ()) } async fn anon_greet() -> &'static str { "Hello World!\n" } async fn greet(Path(name): Path<String>) -> impl IntoResponse { let greeting = String::from("He's dead, ") + name.as_str(); let greeting = greeting + &String::from("!\n"); (StatusCode::OK, greeting) } ``` <aside>Axum's documentation says to [avoid using `impl IntoResponse`](https://docs.rs/axum/latest/axum/response/index.html#regarding-impl-intoresponse) until you understand how it really works, as it can result in confusing issues when chaining response handlers, when a handler can return multiple types, or when a handler can return either a type or a [`Result<T, E>`](https://doc.rust-lang.org/std/result/), especially one with an error.</aside> We then define the routes that our server will recognize. 
This is straightforward and familiar territory: ``` fn app() -> Router { Router::new() .route("/", get(anon_greet)) .route("/:name", get(greet)) .route("/health_check", get(health_check)) } ``` We then define a function to *run* the core server: ``` pub async fn run() { let addr = SocketAddr::from(([127, 0, 0, 1], 3000)); tracing::info!("listening on {}", addr); axum::Server::bind(&addr) .serve(app().into_make_service()) .await .unwrap() } ``` And finally, in a file named `src/main.rs`, we instantiate the server: ``` use ztp::run; async fn main() { run().await } ``` To make this "work," we need to define what `ztp` means, and make a distinction between the library and the CLI program. In the project root's `Cargo.toml` file, the first three sections are needed to define these relationships: ``` [package] name = "ztp" version = "0.1.0" edition = "2021" [lib] path = "src/lib.rs" [[bin]] path = "src/main.rs" name = "ztp" ``` It is the `[package.name]` feature that defines how the `use` statement in `main.rs` will find the library. The `[[bin]]` clause defines the name of the binary when it is generated. <aside>The double brackets around the `[[bin]]` clauses is there to emphasize to the TOML parser that there can be more than one binary. There can be only one library per package, but it is possible for a Rust project to have more than one package, called "crates," per project. </aside> This project should now be runnable. In one window, type: ``` $ cargo run ``` And in another, type and see the replies: ``` $ curl http://localhost:3000/ Hello, World! $ curl http://localhost:3000/Jim He's dead, Jim! $ curl -v http://localhost:3000/health_check > GET /health_check HTTP/1.1 > Host: localhost:3000 > User-Agent: curl/7.81.0 > Accept: */* < HTTP/1.1 200 OK < content-length: 0 < date: Tue, 21 Mar 2023 00:16:43 GMT ``` In the last command, the *verbose* flag shows us what we sent to the server, and what came back. 
We expected a "200 OK" status and a zero-length body, and that's what we got. To unit-test a web server, we must spawn a copy of it and exercise its functions. We'll use Tokio's `spawn` function to start the server, hyper to request data from it, and Rust's own native test asserts to check that we got what we expected.

```
#[cfg(test)]
mod tests {
    use super::*;
    use axum::{
        body::Body,
        http::{Request, StatusCode},
    };
    use std::net::{SocketAddr, TcpListener};

    #[tokio::test]
    async fn the_real_deal() {
        let listener = TcpListener::bind("127.0.0.1:0".parse::<SocketAddr>().unwrap()).unwrap();
        let addr = listener.local_addr().unwrap();

        tokio::spawn(async move {
            axum::Server::from_tcp(listener)
                .unwrap()
                .serve(app().into_make_service())
                .await
                .unwrap();
        });

        let response = hyper::Client::new()
            .request(
                Request::builder()
                    .uri(format!("http://{}/", addr))
                    .body(Body::empty())
                    .unwrap(),
            )
            .await
            .unwrap();

        assert_eq!(response.status(), StatusCode::OK);
        let body = hyper::body::to_bytes(response.into_body()).await.unwrap();
        assert_eq!(&body[..], b"Hello World!\n");
    }
}
```

One interesting trick to observe in this testing is the port number specified in the `TcpListener` call: it's zero. When the port is zero, `TcpListener` asks the kernel for the first free port. Normally you'd want to know exactly which port to call the server on, but here both ends of the conversation learn the port at runtime, so it is never hard-coded and can never collide with a port inconveniently already in use by someone else.
2023-03-21 00:31:39 +00:00
]
Implemented the forms reader, config, and database migrations. This chapter introduces the Actix "Extractors" for retrieving form data. I've added tests to the `./tests` folder to attempt to interact with those extractors; as of this commit, a89cbe5b, they fail because the example code isn't there. What is there is a variant of the "Hello, World!" code from the previous exercises (section 3.5), which uses the Actix extractor:

``` rust
// Actix, *not* Axum. Does not work with the current framework.
fn index(form: web::Form<FormData>) -> String {
    format!("Welcome {}!", form.username)
}
```

Translated into Axum and polished a little, it becomes:

``` rust
pub async fn index(payload: Option<Form<FormData>>) -> impl IntoResponse {
    let username = payload.map_or("World".to_string(), |Form(form)| form.username);
    (StatusCode::OK, format!("Hello, {}!\n", &username))
}
```

The Axum version is a little smarter, providing a default "World" if you don't specify a name. That's what `.map_or` does, although the `or` part (the default) actually comes first in the argument list. So the result is:

``` sh
$ curl http://localhost:3000/
Hello, World!
$ curl 'http://localhost:3000/?username=Spock'
Hello, Spock!
```

Which is more or less the version we want. **Section 3.7.3** then goes into some detail on the implementation of a Trait. A Trait in Rust is like an Interface in other languages; it describes a collection of functions for manipulating the values found in a defined Type. **Types**: A Type is just a description of the value: `u16` is a sixteen-bit unsigned integer; `char` is a Unicode scalar value whose size is always 32 bits; but a `String` is a bunch of things: a growable, heap-allocated buffer of UTF-8 bytes (which are not the same as `char`s!), a length for that string, and a capacity. If the String is manipulated to exceed the capacity, a larger buffer is allocated and the old contents are copied into it.
A `Vec<String>` is an array of Strings; as a Type, it is considered to have a single value: whatever is in it at the moment you use it. **Trait**: A Trait defines a collection of one or more functions that can operate on a value. The truly nifty thing about Traits is that they can be implemented after the fact. By importing a Trait and an implementation of that trait specific to a type into a module containing that type, you can extend the behavior of a type in a deterministic way without having to modify or inherit from its code, as you would in an object-oriented language. Axum has a valuable trait, `FromRequest`. For any structure you can imagine passing from the client to the server, you can implement `FromRequest` for that structure, and any content in the body of the message will be transformed into it. We've seen a trait before: `IntoResponse`, written as `impl IntoResponse`, is the return type of many of the functions in our application server. That return type tells Rust that whatever concrete value the function returns must implement the `IntoResponse` trait; Axum is thereby assured that there exists a function to convert that value into something streamable and usable as an HTTP response. Fortunately for us, Axum has already implemented extractors for the native data types, as well as some common structures and collections. Even better, its `Form` extractor works with any type that implements `Deserialize` from the Serde serialization/deserialization library — the derive below is what wires our struct into that machinery. So in this example:

``` rust
#[derive(serde::Deserialize)]
pub struct FormData {
    username: String,
}

pub async fn index(payload: Option<Form<FormData>>) -> impl IntoResponse { ...
```

A `Form` (something using the `application/x-www-form-urlencoded` protocol) of `FormData` will automatically be converted into a `payload` object of `{ username: "Spock" }`, and in this case wrapped in `Some()`.
(Or `None`, if there was no form included.) <aside>So far, there's not too much bloat in this product; with all the debugging symbols, it's 60MB or so, but stripped to the bone it's only 3.1MB, tolerable for modern deployments.</aside> First, though, we must adjust our `valid_subscription` test:

``` rust
let body = "name=le%20guin&email=ursula_le_guin%40gmail.com";
let response = hyper::Client::new()
    .request(
        Request::builder()
            .method("POST")
            .header("content-type", "application/x-www-form-urlencoded")
            .uri(format!("http://{}/subscriptions", addr))
            .body(Body::from(body))
            .unwrap(),
    )
    .await
    .expect("Failed to execute request.");
```

Two updates from the book: first, we're sending it via POST instead of GET. This is the correct way to do things; a GET should never (and I mean *never*) cause a change of state on the back-end. To send something new that the server will process and store, you use a POST. (To update something, or to send something to a known *and unique* URI, PUT is better.) Secondly, since we're sending a generic form-data object, we need to set the content-type header on the client so that the server knows how to unpack the payload. The '%20' and '%40' markers in the `body` are the URL-encoded space and `@` characters, respectively. I completely ignored the advice in the book and went instead with [Dbmate](https://github.com/amacneil/dbmate). Dbmate is a bit cranky: your SQL must be nestled right up against the 'up' and 'down' markers in the migration file, and it seems to be quite opinionated about everything being lowercase.
That said, it was trivial to create a database with it:

``` sh
$ dbmate new create_subscriptions_table
```

This will create the file `db/migrations/20230322174957_create_subscriptions_table.sql` (the timestamp will be different, obviously), and in this file you put the following, as specified in the book:

``` sql
-- migrate:up
create table subscriptions (
    id uuid not null,
    primary key (id),
    email text not null unique,
    name text not null,
    subscribed_at timestamptz not null
);

-- migrate:down
drop table subscriptions;
```

To use Dbmate, you have to specify how to connect. I'm using Postgres, so let's start with creating a database and a user for it:

``` sh
$ sudo -u postgres psql
[sudo] password for user: ...................
postgres=# create database newsletter;
CREATE DATABASE
postgres=# create user newsletter with encrypted password 'redacted';
CREATE USER
postgres=# grant all privileges on database newsletter to newsletter;
GRANT
postgres=# exit
```

In your project root, create a `.env` file to specify your connection:

``` sh
DATABASE_URL="postgres://newsletter:redacted@127.0.0.1:5432/newsletter?sslmode=disable"
```

The `sslmode` flag there is necessary for localhost connections, as Dbmate assumes an encrypted connection by default, but we're isolating to a local connection that is, usually, safe. With the new entry in your `.env` file, you can now run a migration:

``` sh
$ dbmate up
Writing: ./db/schema.sql
```

Running `dbmate up` will automatically create the database for you if it hasn't already; `dbmate migrate` also performs migrations, but it will not create the database.
Now you can re-connect to Postgres as the newsletter user and see what you've got:

``` sh
$ psql --user newsletter -h localhost --password
Password:
psql (14.7 (Ubuntu 14.7-0ubuntu0.22.04.1), server 11.7 (Ubuntu 11.7-0ubuntu0.19.10.1))

newsletter=> \d
              List of relations
 Schema |       Name        | Type  |   Owner
--------+-------------------+-------+------------
 public | schema_migrations | table | newsletter
 public | subscriptions     | table | newsletter
(2 rows)

newsletter=> \d subscriptions
                     Table "public.subscriptions"
    Column     |           Type           | Collation | Nullable | Default
---------------+--------------------------+-----------+----------+---------
 id            | uuid                     |           | not null |
 email         | text                     |           | not null |
 name          | text                     |           | not null |
 subscribed_at | timestamp with time zone |           | not null |
Indexes:
    "subscriptions_pkey" PRIMARY KEY, btree (id)
    "subscriptions_email_key" UNIQUE CONSTRAINT, btree (email)
```

Note that Dbmate has allocated a table to itself, `schema_migrations`, for tracking what it's done to your system and when. Try not to conflict with it, okay? Every complex app has a configuration, and there are plenty of ways to specify one: environment variables, internal defaults, and configuration files, the last of which come in many different flavors. Rust has a well-known [config](https://docs.rs/config/latest/config/index.html) crate that supports all the most common formats: YAML, JSON, TOML; you can even add your own by writing something that implements the `config::Format` trait. Add it to Cargo.toml:

``` sh
$ cargo add config
```

For the time being, we're just going to create a new file, called `configuration.rs`, and put our configuration details in there. Right now we have a single configuration detail: the port. I'm going to go above and beyond Luca here and configure some internal defaults for my code. It will have expectations.
First, you have to tell Serde that there will be default values. The `Deserialize` derive is what `config` uses to populate the struct, and `#[serde(default)]` tells Serde to fall back to `Default` for any field the configuration file doesn't supply:

``` rust
use config::Config;
use serde::Deserialize;

#[derive(Deserialize)]
#[serde(default)]
pub struct Settings {
    pub port: u16,
}
```

Then, you have to set those default values. Fortunately, Rust provides a "set default values" trait named, sensibly enough, Default:

``` rust
impl Default for Settings {
    fn default() -> Self {
        Settings { port: 3001 }
    }
}
```

Again, exceeding the book's parameters, I'm going to say that if the file is missing, the default parameters should hold:

``` rust
pub(crate) fn get_configuration() -> Result<Settings, config::ConfigError> {
    Config::builder()
        .add_source(config::File::with_name("./ztd.config").required(false))
        .build()?
        .try_deserialize()
}
```

And since this is the first time I'm doing this, I'm going to write a test to assert that my understanding of how this all works is correct:

``` rust
#[cfg(test)]
mod tests {
    use super::*;

    #[test]
    fn test_for_defaults() {
        let maybe_config = get_configuration();
        assert!(maybe_config.is_ok());

        let config = maybe_config.unwrap();
        assert_eq!(config.port, 3001);
    }
}
```
2023-03-24 14:51:19 +00:00
[[package]]
name = "paste"
version = "1.0.12"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "9f746c4065a8fa3fe23974dd82f15431cc8d40779821001404d10d2e79ca7d79"
[[package]]
name = "pathdiff"
version = "0.2.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "8835116a5c179084a830efb3adc117ab007512b535bc1a21c991d3b32a6b44dd"
[[package]]
name = "percent-encoding"
version = "2.2.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "478c572c3d73181ff3c2539045f6eb99e5491218eae919370993b890cdbdd98e"
[[package]]
name = "pest"
version = "2.5.6"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "8cbd939b234e95d72bc393d51788aec68aeeb5d51e748ca08ff3aad58cb722f7"
dependencies = [
"thiserror",
"ucd-trie",
]
[[package]]
name = "pest_derive"
version = "2.5.6"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "a81186863f3d0a27340815be8f2078dd8050b14cd71913db9fbda795e5f707d7"
dependencies = [
"pest",
"pest_generator",
]
[[package]]
name = "pest_generator"
version = "2.5.6"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "75a1ef20bf3193c15ac345acb32e26b3dc3223aff4d77ae4fc5359567683796b"
dependencies = [
"pest",
"pest_meta",
"proc-macro2",
"quote",
"syn 1.0.109",
]
[[package]]
name = "pest_meta"
version = "2.5.6"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "5e3b284b1f13a20dc5ebc90aff59a51b8d7137c221131b52a7260c08cbc1cc80"
dependencies = [
"once_cell",
"pest",
"sha2",
]
Zero-to-Production Rust, up to Chapter 3.7. Since this book is about learning Rust, primarily in a microservices environment, this chapter focuses on installing Rust and describing the tools available to the developer. The easiest way to install Rust is to install the [Rustup](https://rustup.rs/) tool. It is one of those blind-trust-in-the-safety-of-the-toolchain things. For Linux and Mac users, the command is a shell script that installs to a user's local account: ``` $ curl --proto '=https' --tlsv1.2 -sSf https://sh.rustup.rs | sh ``` Once installed, you can install Rust itself: ``` $ rustup install toolchain stable ``` You should now have Rust compiler and the Rust build and packaging tool, known as Cargo: ``` $ rustc --version rustc 1.68.0 (2c8cc3432 2023-03-06) $ cargo --version cargo 1.68.0 (115f34552 2023-02-26) ``` I also installed the following tools: ``` $ rustup component add clippy rust-src rust-docs $ cargo install rustfmt rust-analyzer ``` - clippy: A powerful linter that provides useful advice above and beyond the compiler's basic error checking. - rustfmt: A formatting tool that provides a common format for most developers - rust-analyzer: For your IDE, rust-analyzer provides the LSP (Language Server Protocol) for Rust, giving you code completion, on-the-fly error definition, and other luxuries. Zero-to-Production's project is writing a web service that signs people up for an email newsletter. The first task in the book is to set up a "Hello World!" application server. The book uses the [Actix-web](https://actix.rs/) web framework, but I've chosen to implement it using [Axum](https://github.com/tokio-rs/axum) server, the default server provided by the [Tokio](https://github.com/tokio-rs/tokio) asynchronous runtime. Although the book is only two years old, it is already out-of-date with respect to some commands. `cargo add` is now provided by default. 
The following commands installed the crates I'll be using:

```
cargo add --features tokio/full --features hyper/full tokio hyper \
    axum tower tracing tracing-subscriber
```

- axum: The web server framework for Tokio.
- tokio: The Rust asynchronous runtime. Has single-threaded (select) and multi-threaded variants.
- [hyper](https://hyper.rs/): An HTTP request/response library, used for testing.
- [tracing](https://crates.io/crates/tracing): A debugging library that works with Tokio.

We start by defining the core services. In the book, they're a greeter ("Hello, World"), a greeter with a parameter ("Hello, {name}"), and a health check (returns an HTTP 200 code, but no body). Actix-web hands the handler a generic Request and expects a generic Response, but Axum is more straightforward, providing `IntoResponse` implementations for most of the basic Rust types, as well as some for structured formats via Serde, Rust's standard serializing/deserializing library for converting data from one format to another.

All of these go into `src/lib.rs`:

```
async fn health_check() -> impl IntoResponse {
    (StatusCode::OK, ())
}

async fn anon_greet() -> &'static str {
    "Hello World!\n"
}

async fn greet(Path(name): Path<String>) -> impl IntoResponse {
    let greeting = format!("He's dead, {}!\n", name);
    (StatusCode::OK, greeting)
}
```

<aside>Axum's documentation says to [avoid using `impl IntoResponse`](https://docs.rs/axum/latest/axum/response/index.html#regarding-impl-intoresponse) until you understand how it really works, as it can result in confusing issues when chaining response handlers, when a handler can return multiple types, or when a handler can return either a type or a [`Result<T, E>`](https://doc.rust-lang.org/std/result/), especially one with an error.</aside>

We then define the routes that our server will recognize.
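The pattern that lets handlers return several different types can be sketched in plain Rust. This is a hypothetical stand-in for Axum's `IntoResponse`, not its actual definition: a trait converts each supported return type into a common response shape.

```rust
// A hypothetical stand-in for Axum's IntoResponse: anything that can
// become a (status, body) pair can be returned from a handler.
struct Response {
    status: u16,
    body: String,
}

trait IntoResponse {
    fn into_response(self) -> Response;
}

// Plain string slices become a 200 with the string as the body.
impl IntoResponse for &'static str {
    fn into_response(self) -> Response {
        Response { status: 200, body: self.to_string() }
    }
}

// A (status, body) tuple maps across directly.
impl IntoResponse for (u16, String) {
    fn into_response(self) -> Response {
        Response { status: self.0, body: self.1 }
    }
}

fn main() {
    let a = "Hello World!\n".into_response();
    let b = (200u16, String::from("He's dead, Jim!\n")).into_response();
    assert_eq!(a.status, 200);
    assert_eq!(b.body, "He's dead, Jim!\n");
    println!("{} {}", a.status, a.body.trim_end());
}
```

The framework only needs to know that a conversion exists for the handler's return type; that is what `-> impl IntoResponse` promises.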
This is straightforward and familiar territory:

```
fn app() -> Router {
    Router::new()
        .route("/", get(anon_greet))
        .route("/:name", get(greet))
        .route("/health_check", get(health_check))
}
```

We then define a function to *run* the core server:

```
pub async fn run() {
    let addr = SocketAddr::from(([127, 0, 0, 1], 3000));
    tracing::info!("listening on {}", addr);
    axum::Server::bind(&addr)
        .serve(app().into_make_service())
        .await
        .unwrap()
}
```

And finally, in a file named `src/main.rs`, we instantiate the server:

```
use ztp::run;

#[tokio::main]
async fn main() {
    run().await
}
```

To make this "work," we need to define what `ztp` means, and make a distinction between the library and the CLI program. In the project root's `Cargo.toml` file, the first three sections are needed to define these relationships:

```
[package]
name = "ztp"
version = "0.1.0"
edition = "2021"

[lib]
path = "src/lib.rs"

[[bin]]
path = "src/main.rs"
name = "ztp"
```

It is the `[package.name]` key that defines how the `use` statement in `main.rs` will find the library. The `[[bin]]` clause defines the name of the binary when it is generated.

<aside>The double brackets around the `[[bin]]` clause are there to emphasize to the TOML parser that there can be more than one binary. There can be only one library per package, but it is possible for a Rust project to have more than one package, called "crates," per project.</aside>

This project should now be runnable. In one window, type:

```
$ cargo run
```

And in another, type and see the replies:

```
$ curl http://localhost:3000/
Hello World!
$ curl http://localhost:3000/Jim
He's dead, Jim!
$ curl -v http://localhost:3000/health_check
> GET /health_check HTTP/1.1
> Host: localhost:3000
> User-Agent: curl/7.81.0
> Accept: */*
< HTTP/1.1 200 OK
< content-length: 0
< date: Tue, 21 Mar 2023 00:16:43 GMT
```

In the last command, the *verbose* flag shows us what we sent to the server, and what came back.
We expected a "200 OK" status and a zero-length body, and that's what we got.

In order to unit-test a web server, we must spawn a copy of it in order to exercise its functions. We'll use Tokio's `spawn` function to create a new server, use hyper to request data from the server, and finally Rust's own native test asserts to check that we got what we expected.

```
mod tests {
    use super::*;
    use axum::{
        body::Body,
        http::{Request, StatusCode},
    };
    use std::net::{SocketAddr, TcpListener};

    #[tokio::test]
    async fn the_real_deal() {
        let listener =
            TcpListener::bind("127.0.0.1:0".parse::<SocketAddr>().unwrap()).unwrap();
        let addr = listener.local_addr().unwrap();
        tokio::spawn(async move {
            axum::Server::from_tcp(listener)
                .unwrap()
                .serve(app().into_make_service())
                .await
                .unwrap();
        });
        let response = hyper::Client::new()
            .request(
                Request::builder()
                    .uri(format!("http://{}/", addr))
                    .body(Body::empty())
                    .unwrap(),
            )
            .await
            .unwrap();
        let body = hyper::body::to_bytes(response.into_body()).await.unwrap();
        assert_eq!(&body[..], b"Hello World!\n");
    }
}
```

One interesting trick to observe in this test is the port number specified in the `TcpListener` call: it's zero. When the port is zero, the `TcpListener` will request the first free port from the kernel. Normally, you'd want to know exactly what port to call the server on, but in this case both ends of the communication are aware of the port to use, and we want to ensure that the port isn't hard-coded and inconveniently already in use by someone else.
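The port-zero trick is plain `std::net` behavior and can be verified without Axum at all; a small stdlib-only sketch:

```rust
use std::net::TcpListener;

fn main() {
    // Binding to port 0 asks the kernel for the first free ephemeral port.
    let listener = TcpListener::bind("127.0.0.1:0").unwrap();
    let addr = listener.local_addr().unwrap();

    // local_addr() reveals the real, nonzero port the kernel picked.
    assert_ne!(addr.port(), 0);

    // While the first listener is still bound, a second bind gets a
    // different free port, so parallel tests never collide.
    let other = TcpListener::bind("127.0.0.1:0").unwrap();
    assert_ne!(other.local_addr().unwrap().port(), addr.port());

    println!("kernel assigned port {}", addr.port());
}
```

This is exactly why the test reads the port back with `local_addr()` before spawning the server.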
2023-03-21 00:31:39 +00:00
[[package]]
name = "pin-project"
version = "1.0.12"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "ad29a609b6bcd67fee905812e544992d216af9d755757c05ed2d0e15a74c6ecc"
dependencies = [
"pin-project-internal",
]
[[package]]
name = "pin-project-internal"
version = "1.0.12"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "069bdb1e05adc7a8990dce9cc75370895fbe4e3d58b9b73bf1aee56359344a55"
dependencies = [
"proc-macro2",
"quote",
"syn 1.0.109",
]
[[package]]
name = "pin-project-lite"
version = "0.2.9"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "e0a7ae3ac2f1173085d398531c705756c94a4c56843785df85a60c1a0afac116"
[[package]]
name = "pin-utils"
version = "0.1.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "8b870d8c151b6f2fb93e84a13146138f05d02ed11c7e7c54f8826aaaf7c9f184"
Implemented the forms reader, config, and database migrations.

This chapter introduces the Actix "Extractors" for retrieving form data. I've added tests to the `./tests` folder to attempt to interact with those extractors; as of this commit, a89cbe5b, they fail because the example code isn't there. What is there is a variant of the "Hello, World!" code from the previous exercises (section 3.5), which uses the Actix extractor:

```rust
// Actix, *not* Axum. Does not work with the current framework.
fn index(form: web::Form<FormData>) -> String {
    format!("Welcome {}!", form.username)
}
```

Translated and polished into Axum, it becomes:

```rust
pub async fn index(payload: Option<Form<FormData>>) -> impl IntoResponse {
    let username = payload.map_or("World".to_string(), move |index| -> String {
        String::from(&(index.username))
    });
    (StatusCode::OK, format!("Hello, {}!\n", &username))
}
```

The Axum version is a little smarter, providing a default "World" if you don't specify a name. That's what `.map_or` does, although the `or` part actually comes first in the argument list. So the result is:

```sh
$ curl http://localhost:3000/
Hello, World!
$ curl 'http://localhost:3000/?username=Spock'
Hello, Spock!
```

Which is more or less the version we want.

**Section 3.7.3** then goes into some detail on the implementation of a Trait. A Trait in Rust is like an Interface in other languages; it describes a collection of functions for manipulating the values found in a defined Type.

**Types**: A Type is just a description of the value: `u16` is a sixteen-bit unsigned integer; `char` is a Unicode character, and its size is always 32 bits; but a `String` is a bunch of things: an array of UTF-8 bytes (whose per-character sizes are not always the same as a `char`'s!), a length for that string, and a capacity. If the String is manipulated to exceed the capacity, the array is re-allocated to a new capacity and the old array is copied into it.
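The length/capacity bookkeeping described above can be observed directly with the standard library; a minimal sketch, not from the book:

```rust
fn main() {
    // A String is a UTF-8 byte buffer plus a length and a capacity.
    let mut s = String::with_capacity(4);
    let initial = s.capacity();
    assert!(initial >= 4);

    s.push_str("ab"); // fits within the existing capacity: no reallocation
    assert_eq!(s.len(), 2);
    assert_eq!(s.capacity(), initial);

    // Growing past the capacity reallocates to a larger buffer.
    s.push_str("cdefg");
    assert_eq!(s.len(), 7);
    assert!(s.capacity() >= s.len());

    // A `char` is always four bytes, but its UTF-8 form in a String varies.
    assert_eq!(std::mem::size_of::<char>(), 4);
    assert_eq!("é".len(), 2); // two UTF-8 bytes, one char

    println!("len={}, capacity={}", s.len(), s.capacity());
}
```

Note that `capacity()` reports whatever the allocator actually reserved, which is guaranteed only to be at least the requested size.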
A `Vec<String>` is an array of Strings; as a Type, it is considered to have a single value: whatever is in it at the moment you use it.

**Trait**: A Trait defines a collection of one or more functions that can operate on a value. The truly nifty thing about Traits is that they can be implemented after the fact. By importing a Trait and an implementation of that trait specific to a type into a module containing that type, you can extend the behavior of a type in a deterministic way without having to modify or inherit the code, as you would in an object-oriented language.

Axum has a valuable trait, `FromRequest`. For any structure you can imagine passing from the client to the server, you can implement `FromRequest` for that object, and any content in the body of the message will be transformed into that structure.

We've seen a trait before: `IntoResponse`, written as `impl IntoResponse`, is the output (the return type) of many of the functions that produce return values for our application server. In this case the return type instructs Rust to look in the current lexical scope and, for the value returned by that function, determine whether an `IntoResponse` implementation has been defined for it. If it has, the value can be returned, because Axum has now been assured that there exists a function to convert that value into something streamable and usable as an HTTP response.

Fortunately for us, Axum has already implemented `FromRequest` for all the native data types, as well as some structures and arrays. Even better, its extractors work with types that implement Serde's `Deserialize`. So in this example:

```rust
#[derive(serde::Deserialize)]
pub struct FormData {
    username: String,
}

pub async fn index(payload: Option<Form<FormData>>) -> impl IntoResponse { ...
```

A `Form` (something using the `application/x-www-form-urlencoded` protocol) of `FormData` will automatically be converted into a `payload` object of `{ username: "Spock" }`, and in this case wrapped in the `Some()` variant.
(Or `None`, if there was no form included.)

<aside>So far, there's not too much bloat in this product; with all the debugging symbols, it's 60MB or so, but stripped to the bone it's only 3.1MB, tolerable for modern deployments.</aside>

First, though, we must adjust our `valid_subscription` test:

```rust
let body = "name=le%20guin&email=ursula_le_guin%40gmail.com";
let response = hyper::Client::new()
    .request(
        Request::builder()
            .method("POST")
            .header("content-type", "application/x-www-form-urlencoded")
            .uri(format!("http://{}/subscriptions", addr))
            .body(Body::from(body))
            .unwrap(),
    )
    .await
    .expect("Failed to execute request.");
```

Two updates from the book: first, we're sending the request via POST instead of GET. This is the correct way to do things; a GET should never (and I mean *never*) cause a change of state on the back-end. To send something new that the server will process and store, you use a POST. (To update something, or to send something to a known *and unique* URI, PUT is better.) Secondly, since we're using a generic form-data object, we need to set the content-type on the client so that the server is informed of how to unpack this payload. The '%20' and '%40' markers in the `body` are the URL-encoded space and `@` characters, respectively.

I completely ignored the advice in the book and went instead with [Dbmate](https://github.com/amacneil/dbmate). Dbmate is a bit cranky; your SQL must be very much nestled against the 'up' and 'down' markers in the migration file, and it seems to be quite opinionated about everything being lowercase.
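The percent-encoding in that body is mechanical enough to sketch by hand. This is an illustration of what the extractor undoes, not the implementation Axum actually uses (real servers rely on a tested crate for this):

```rust
// A tiny decoder for application/x-www-form-urlencoded values.
fn percent_decode(input: &str) -> String {
    let bytes = input.as_bytes();
    let mut out = Vec::with_capacity(bytes.len());
    let mut i = 0;
    while i < bytes.len() {
        match bytes[i] {
            // "%XX" is the byte 0xXX; e.g. %20 is a space, %40 is '@'.
            b'%' if i + 2 < bytes.len() => {
                let hex = std::str::from_utf8(&bytes[i + 1..i + 3]).unwrap();
                out.push(u8::from_str_radix(hex, 16).unwrap());
                i += 3;
            }
            // Form encoding also allows '+' for a space.
            b'+' => {
                out.push(b' ');
                i += 1;
            }
            b => {
                out.push(b);
                i += 1;
            }
        }
    }
    String::from_utf8(out).unwrap()
}

fn main() {
    assert_eq!(percent_decode("le%20guin"), "le guin");
    assert_eq!(percent_decode("ursula_le_guin%40gmail.com"), "ursula_le_guin@gmail.com");
    println!("{}", percent_decode("name=le%20guin")); // prints "name=le guin"
}
```

The `unwrap()` calls would panic on malformed input; a production decoder returns an error instead.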
That said, it was trivial to create a database with it:

```sh
$ dbmate new create_subscriptions_table
```

This will create the file `db/migrations/20230322174957_create_subscriptions_table.sql` (the timestamp will be different, obviously), and in this file you put the following, as specified in the book:

```sql
-- migrate:up
create table subscriptions (
    id uuid not null,
    primary key (id),
    email text not null unique,
    name text not null,
    subscribed_at timestamptz not null
);

-- migrate:down
drop table subscriptions;
```

To use Dbmate, you have to specify how to connect. I'm using Postgres, so let's start with creating a database and a user for it:

```sh
$ sudo -u postgres psql
[sudo] password for user: ...................
postgres=# create database newsletter;
CREATE DATABASE
postgres=# create user newsletter with encrypted password 'redacted';
CREATE USER
postgres=# grant all privileges on database newsletter to newsletter;
GRANT
postgres=# exit
```

In your project root, create a `.env` file to specify your connection:

```sh
DATABASE_URL="postgres://newsletter:redacted@127.0.0.1:5432/newsletter?sslmode=disable"
```

The `sslmode` flag there is necessary for localhost connections, as Dbmate assumes an encrypted connection by default, but we're isolating to a local connection that is, usually, safe. With the new entry in your `.env` file, you can now run a migration:

```sh
$ dbmate up
Writing: ./db/schema.sql
```

Running `dbmate up` will automatically create the database for you if it hasn't already; `dbmate migrate` also performs migrations, but it will not create the database.
Now you can re-connect to Postgres as the newsletter user and see what you've got:

```sh
$ psql --user newsletter -h localhost --password
Password:
psql (14.7 (Ubuntu 14.7-0ubuntu0.22.04.1), server 11.7 (Ubuntu 11.7-0ubuntu0.19.10.1))

newsletter=> \d
              List of relations
 Schema |       Name        | Type  |   Owner
--------+-------------------+-------+------------
 public | schema_migrations | table | newsletter
 public | subscriptions     | table | newsletter
(2 rows)

newsletter=> \d subscriptions
                     Table "public.subscriptions"
    Column     |           Type           | Collation | Nullable | Default
---------------+--------------------------+-----------+----------+---------
 id            | uuid                     |           | not null |
 email         | text                     |           | not null |
 name          | text                     |           | not null |
 subscribed_at | timestamp with time zone |           | not null |
Indexes:
    "subscriptions_pkey" PRIMARY KEY, btree (id)
    "subscriptions_email_key" UNIQUE CONSTRAINT, btree (email)
```

Note that Dbmate has allocated a table to itself, `schema_migrations`, for tracking what it's done to your system and when. Try not to conflict with it, okay?

Every complex app has a configuration, and there are plenty of different ways the configuration can be specified: environment variables, internal defaults, and configuration files, the last of which comes in many different flavors. Rust has a well-known [config](https://docs.rs/config/latest/config/index.html) crate that supports all the most common formats: YAML, JSON, TOML; you can even add your own by writing something that implements the `config::Format` trait. Add it to Cargo.toml:

```sh
$ cargo add config
```

For the meantime, we're just going to create a new file, called `configuration.rs`, and put our configuration details in there. Right now we have a single configuration detail: the port. I'm going to go above and beyond Lucas here and configure some internal defaults for my code. It will have expectations.
First, you have to tell Serde that the struct is deserializable and that there will be default values:

```rust
use config::Config;

#[derive(serde::Deserialize)]
#[serde(default)]
pub struct Settings {
    pub port: u16,
}
```

Then, you have to set those default values. Fortunately, Rust provides a "set default values" trait named, sensibly enough, Default:

```rust
impl Default for Settings {
    fn default() -> Self {
        Settings { port: 3001 }
    }
}
```

Again, exceeding the book's parameters, I'm going to say that if the file is missing, the default parameters should hold:

```rust
pub(crate) fn get_configuration() -> Result<Settings, config::ConfigError> {
    Config::builder()
        .add_source(config::File::with_name("./ztd.config").required(false))
        .build()?
        .try_deserialize()
}
```

And since this is the first time I'm doing this, I'm going to write a test to assert that my understanding of how this all works is correct:

```rust
mod tests {
    use super::*;

    #[test]
    fn test_for_defaults() {
        let maybe_config = get_configuration();
        assert!(maybe_config.is_ok());
        let config = maybe_config.unwrap();
        assert_eq!(config.port, 3001);
    }
}
```
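The defaults-when-missing pattern works the same way without the `config` crate. A stdlib-only sketch, where the `Settings` struct mirrors the one above but `load` parses a plain `"port=NNNN"` string (a stand-in for the config file) instead:

```rust
#[derive(Debug, PartialEq)]
struct Settings {
    port: u16,
}

// The Default trait supplies the values used when no source is present.
impl Default for Settings {
    fn default() -> Self {
        Settings { port: 3001 }
    }
}

// Parse "port=NNNN" if a source exists and is valid; otherwise fall
// back to Settings::default() via unwrap_or_default().
fn load(source: Option<&str>) -> Settings {
    source
        .and_then(|s| s.strip_prefix("port="))
        .and_then(|p| p.parse::<u16>().ok())
        .map(|port| Settings { port })
        .unwrap_or_default()
}

fn main() {
    assert_eq!(load(Some("port=8080")).port, 8080);
    assert_eq!(load(None), Settings::default()); // missing file: defaults hold
    println!("default port: {}", load(None).port); // prints "default port: 3001"
}
```

`unwrap_or_default()` is the stdlib hinge here: any `Option<T>` or parse failure collapses to `T::default()`, which is exactly the behavior the `required(false)` config source gives us.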
2023-03-24 14:51:19 +00:00
[[package]]
name = "pkg-config"
version = "0.3.26"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "6ac9a59f73473f1b8d852421e59e64809f025994837ef743615c6d0c5b305160"
[[package]]
name = "ppv-lite86"
version = "0.2.17"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "5b40af805b3121feab8a3c29f04d8ad262fa8e0561883e7653e024ae4479e6de"
[[package]]
name = "proc-macro2"
version = "1.0.52"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "1d0e1ae9e836cc3beddd63db0df682593d7e2d3d891ae8c9083d2113e1744224"
dependencies = [
"unicode-ident",
]
[[package]]
name = "quote"
version = "1.0.26"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "4424af4bf778aae2051a77b60283332f386554255d722233d09fbfc7e30da2fc"
dependencies = [
"proc-macro2",
]
Implemented the forms reader, config, and database migrations. This chapter introduces the Actix "Extractors" for retrieving form data. I've added tests to the `./tests` folder to attempt to interact with those extractors; as of this commit, a89cbe5b, they fail because the example code isn't there. What is there is a variant of the "Hello, World!" code from the previous exercises (section 3.5), which uses the Actix extractor: ``` rust // Actix, *not* Axum. Does not work with the current framework. fn index(form: web::Form<FormData>) -> String { format!('Welcome {}!', form.username) } ``` Translated and polished into Axum, it translates to: ``` rust pub async fn index(payload: Option<Form<FormData>>) -> impl IntoResponse { let username = payload.map_or("World".to_string(), move |index| -> String { String::from(&(index.username)) }); (StatusCode::OK, format!("Hello, {}!\n", &username)) } ``` The Axum version is a little smarter, providing a default "World!" if you don't specify a name. That's what `.map_or` does, although the `or` part actually comes first in the function. So the result is: ``` sh $ curl http://localhost:3000/ Hello, World! $ curl 'http://localhost:3000/?username=Spock' Hello, Spock! ``` Which is more or less the version we want. **Section 3.7.3** then goes into some detail on the implementation of a Trait. A Trait in Rust is like an Interface in other languages; it describes a collection of functions for manipulating the values found in a defined Type. **Types**: A Type is just a description of the value: `u16` is a sixteen-bit unsigned integer; `char` is a Unicode character, and it's size is always 32bits, but a `String` is a bunch of things: it's an array of characters (which are not always `char`!), a length for that string and a capacity. If the String is manipulated to exceed the capacity, the array is re-allocated to a new capacity and the old array copied into it. 
A `Vec<String>` is an array of Strings; as a Type, it is considered to have a single value: whatever is in it at the moment you use it. **Trait**: A Trait defines a collection of one or more functions that can operate on a value. The truly nifty thing about Traits is that they can be implemented after the fact. By importing a Trait and an implementation of that trait specific to a type into a module containing that type, you can extend the behavior of a type in a deterministic way without having to modify or inherit the code, as you would in an object-oriented language. Axum has a valuable trait, `FromRequest`. For any structure you can imagine passing from the client to the server, you can implement `FromRequest` for that object and any content in the body of the message will be transformed into that structure. We've seen a trait before: `IntoResponse`, written as `impl IntoResponse`, and is the output (the return type) of many of the functions that produce return values for our application server. In this case the return type instructs Rust to look in the current lexical scope and, for the value returned by that function, determine if an `IntoResponse` trait has been defined for it. If it has, the value will be returned because Axum has now been assured that there exists a function to convert that value into something streamable and usable as an HTTP response. Fortunately for us, Axum has already implemented `FromRequest` for all the native data types, as well as some structures and arrays. Even better, it has implemented `FromRequest` for the Serde serialization/deserialization library. So in this example: ``` rust pub struct FormData { username: String, } pub async fn index(payload: Option<Form<FormData>>) -> impl IntoResponse { ... ``` A `Form` (something using the `application/x-www-form-urlencoded` protocol) of `FormData` will automatically be converted into a `payload` object of `{ username: "Spock" )`, and in this case wrapped in a `Some()` handler. 
(Or `None`, if there was no form included.) <aside>So far, there's not too much bloat in this product; with all the debugging symbols, it's 60MB or so, but stripped to the bone it's only 3.1MB, tolerable for modern deployments.</aside> First, though, we must adjust our `valid_subscription` test: ``` rust let body = "name=le%20guin&email=ursula_le_guin%40gmail.com"; let response = hyper::Client::new() .request( Request::builder() .method("POST") .header("content-type", "application/x-www-form-urlencoded") .uri(format!("http://{}/subscriptions", addr)) .body(Body::from(body)) .unwrap(), ) .await .expect("Failed to execute request."); ``` Two updates from the book: first, we're sending it via POST instead of GET. This is the correct way to do things; a GET should never (and I mean *never*) cause a change of state on the back-end. To send something new that the server will process and store, you use a POST. (To update something, or to send something to a known *and unique* URI, PUT is better.) Secondly, since we're using a generic form-data object, we need to set the content-type on the client so that the server is informed of how to unpack this payload. The '%20' and '%40' markers in the `body` are the space and the `@` respectively. I completely ignored the advice in the book and went instead with [Dbmate](https://github.com/amacneil/dbmate); Dbmate is a bit cranky; your SQL must be very much nestled against the 'up' and 'down' markers in the migration file, and it seems to be quite opinionated about everything being lowercase. 
That said, it was trivial to create a database with it: ``` sh $ dbmate new create_subscriptions_table ``` This will create the folder `db/migrations/20230322174957_create_subscriptions_table.sql`, (The timestamp will be different, obviously), and in this file you put the following, as specified in the book: ``` sql -- migrate:up create table subscriptions ( id uuid not null, primary key (id), email text not null unique, name text not null, subscribed_at timestamptz not null ); -- migrate:down drop table subscriptions; ``` To use Dbmate, you have to specify how to connect. I'm using Postgres, so let's start with creating a database and a user for it: ``` sh $ sudo -u postgres psql [sudo] possword for user: ................... postgres=# create database newsletter; CREATE DATABASE postgres=# create user newletter with encrypted password 'redacted'; CREATE USER postgres=# grant all privileges on database newsletter to newsletter; GRANT postgres=# exit ``` In your project root, create a `.env` file to specify your connection: ``` sh DATABASE_URL="postgres://newsletter:redacted@127.0.0.1:5432/newsletter?sslmode=disable" ``` The `sslmode` flag there is necessary for localhost connections, as Dbmate assumes an encrypted connection by default, but we're isolating to a local connection that is, usually, safe. With the new entry in your `.env` file, you can now run a migration: ``` sh $ dbmate up Writing: ./db/schema.sql ``` Running `dbmate up` will automatically create the database for you if it hasn't already; `dbmate migrate` also performs migrations, but it will not create the database. 
Now you can re-connect to Postgres as the newsletter user and see what you've got: ``` sh $ psql --user newsletter -h localhost --password Password: psql (14.7 (Ubuntu 14.7-0ubuntu0.22.04.1), server 11.7 (Ubuntu 11.7-0ubuntu0.19.10.1)) newsletter=> \d List of relations Schema | Name | Type | Owner --------+-------------------+-------+------------ public | schema_migrations | table | newsletter public | subscriptions | table | newsletter (2 rows) newsletter=> \d subscriptions Table "public.subscriptions" Column | Type | Collation | Nullable | Default ---------------+--------------------------+-----------+----------+--------- id | uuid | | not null | email | text | | not null | name | text | | not null | subscribed_at | timestamp with time zone | | not null | Indexes: "subscriptions_pkey" PRIMARY KEY, btree (id) "subscriptions_email_key" UNIQUE CONSTRAINT, btree (email) ``` Note that Dbmate has allocated a table to itself, `schema_migrations`, for tracking what it's done to your system and when. Try not to conflict with it, okay? Every complex app has a configuration, and there are plenty of different ways the configuration can be specified. Environment variables, internal defaults, and configuration files-- the last of which comes in so many different flavors. Rust has a well-known [config](https://docs.rs/config/latest/config/index.html) crate that supports all the most common configurations: YAML, JSON, TOML; you can even add your own by writing something that implements the `config::Format` trait. Add it to Cargo.toml: ``` sh $ cargo add config ``` For the meantime, we're just going to create a new file, called `configuration.rs`, and put our configuration details in there. Right now we have a single configuration detail: the port. I'm going to go above and beyond Lucas here and configure some internal defaults for my code. It will have expectations. 
First, you have to tell Serde that there will be default values (the struct needs a `Deserialize` derive for `try_deserialize` to work at all, and `#[serde(default)]` makes missing fields fall back to `Default`):

``` rust
use config::Config;

#[derive(serde::Deserialize)]
#[serde(default)]
pub struct Settings {
    pub port: u16,
}
```

Then, you have to set those default values. Fortunately, Rust provides a "set default values" trait named, sensibly enough, Default:

``` rust
impl Default for Settings {
    fn default() -> Self {
        Settings { port: 3001 }
    }
}
```

Again, exceeding the book's parameters, I'm going to say that if the file is missing, the default parameters should hold:

``` rust
pub(crate) fn get_configuration() -> Result<Settings, config::ConfigError> {
    Config::builder()
        .add_source(config::File::with_name("./ztd.config").required(false))
        .build()?
        .try_deserialize()
}
```

And since this is the first time I'm doing this, I'm going to write a test to assert that my understanding of how this all works is correct:

``` rust
#[cfg(test)]
mod tests {
    use super::*;

    #[test]
    fn test_for_defaults() {
        let maybe_config = get_configuration();
        assert!(maybe_config.is_ok());
        let config = maybe_config.unwrap();
        assert_eq!(config.port, 3001);
    }
}
```
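As a stand-alone illustration of the `Default` trait (std-only, separate from the config crate), it also composes with Rust's struct-update syntax, which lets you override one field and default the rest. The `host` field here is hypothetical, added only to make the update syntax visible:

``` rust
#[derive(Debug)]
struct Settings {
    port: u16,
    host: String, // hypothetical extra field, for illustration only
}

impl Default for Settings {
    fn default() -> Self {
        Settings {
            port: 3001,
            host: "127.0.0.1".to_string(),
        }
    }
}

fn main() {
    // Taking every default:
    let defaults = Settings::default();
    assert_eq!(defaults.port, 3001);

    // Struct-update syntax: override one field, default the remainder.
    let custom = Settings {
        port: 8080,
        ..Default::default()
    };
    assert_eq!(custom.port, 8080);
    assert_eq!(custom.host, "127.0.0.1");
    println!("{:?}", custom);
}
```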
2023-03-24 14:51:19 +00:00
[[package]]
name = "rand"
version = "0.8.5"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "34af8d1a0e25924bc5b7c43c079c942339d8f0a8b57c39049bef581b46327404"
dependencies = [
"libc",
"rand_chacha",
"rand_core",
]
[[package]]
name = "rand_chacha"
version = "0.3.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "e6c10a63a0fa32252be49d21e7709d4d4baf8d231c2dbce1eaa8141b9b127d88"
dependencies = [
"ppv-lite86",
"rand_core",
]
[[package]]
name = "rand_core"
version = "0.6.4"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "ec0be4795e2f6a28069bec0b5ff3e2ac9bafc99e6a9a7dc3547996c5c816922c"
dependencies = [
"getrandom",
]
This is straightforward and familiar territory:

``` rust
fn app() -> Router {
    Router::new()
        .route("/", get(anon_greet))
        .route("/:name", get(greet))
        .route("/health_check", get(health_check))
}
```

We then define a function to *run* the core server:

``` rust
pub async fn run() {
    let addr = SocketAddr::from(([127, 0, 0, 1], 3000));
    tracing::info!("listening on {}", addr);
    axum::Server::bind(&addr)
        .serve(app().into_make_service())
        .await
        .unwrap()
}
```

And finally, in a file named `src/main.rs`, we instantiate the server (an `async fn main` needs an executor, which the `#[tokio::main]` attribute provides):

``` rust
use ztp::run;

#[tokio::main]
async fn main() {
    run().await
}
```

To make this "work," we need to define what `ztp` means, and make a distinction between the library and the CLI program. In the project root's `Cargo.toml` file, the first three sections define these relationships:

``` toml
[package]
name = "ztp"
version = "0.1.0"
edition = "2021"

[lib]
path = "src/lib.rs"

[[bin]]
path = "src/main.rs"
name = "ztp"
```

It is the `name` key in `[package]` that defines how the `use` statement in `main.rs` will find the library. The `[[bin]]` clause defines the name of the binary when it is generated. <aside>The double brackets around the `[[bin]]` clause are TOML's array-of-tables syntax, signaling that there can be more than one binary. A package can contain at most one library crate but any number of binary crates, and a Rust project (a workspace) can contain more than one package.</aside> This project should now be runnable. In one window, type:

``` sh
$ cargo run
```

And in another, type and see the replies:

``` sh
$ curl http://localhost:3000/
Hello World!
$ curl http://localhost:3000/Jim
He's dead, Jim!
$ curl -v http://localhost:3000/health_check
> GET /health_check HTTP/1.1
> Host: localhost:3000
> User-Agent: curl/7.81.0
> Accept: */*
< HTTP/1.1 200 OK
< content-length: 0
< date: Tue, 21 Mar 2023 00:16:43 GMT
```

In the last command, the *verbose* flag shows us what we sent to the server and what came back.
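Because `[[bin]]` is an array of tables, a second binary target is just another `[[bin]]` entry. The one below is hypothetical (the project has only the one binary), shown purely to illustrate the syntax; `cargo run --bin ztp-admin` would then select it:

``` toml
[[bin]]
path = "src/main.rs"
name = "ztp"

# Hypothetical second entry point, for illustration only.
[[bin]]
path = "src/admin.rs"
name = "ztp-admin"
```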
We expected a "200 OK" status and a zero-length body, and that's what we got. In order to unit-test a web server, we must spawn a copy of it in order to exercise its functions. We'll use Tokio's `spawn` function to create a new server, use hyper to request data from the server, and finally Rust's own native test asserts to check that we got what we expected.

``` rust
#[cfg(test)]
mod tests {
    use super::*;
    use axum::{body::Body, http::Request};
    use std::net::{SocketAddr, TcpListener};

    #[tokio::test]
    async fn the_real_deal() {
        let listener =
            TcpListener::bind("127.0.0.1:0".parse::<SocketAddr>().unwrap()).unwrap();
        let addr = listener.local_addr().unwrap();

        tokio::spawn(async move {
            axum::Server::from_tcp(listener)
                .unwrap()
                .serve(app().into_make_service())
                .await
                .unwrap();
        });

        let response = hyper::Client::new()
            .request(
                Request::builder()
                    .uri(format!("http://{}/", addr))
                    .body(Body::empty())
                    .unwrap(),
            )
            .await
            .unwrap();

        let body = hyper::body::to_bytes(response.into_body()).await.unwrap();
        assert_eq!(&body[..], b"Hello World!\n");
    }
}
```

One interesting trick to observe in this test is the port number specified in the `TcpListener` call: it's zero. When the port is zero, the `TcpListener` asks the kernel for the first free port. Normally you'd want to know exactly what port to call the server on, but in this case both ends of the communication learn the assigned port, and we want to ensure that the port isn't hard-coded and inconveniently already in use by someone else.
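The port-zero trick is plain `std` behavior, independent of Axum, so it can be demonstrated in a few lines; `local_addr()` is how you recover the port the kernel actually picked:

``` rust
use std::net::TcpListener;

fn main() {
    // Port 0 asks the kernel for any free port; local_addr() reveals
    // which one was actually assigned.
    let first = TcpListener::bind("127.0.0.1:0").expect("bind failed");
    let first_addr = first.local_addr().expect("no local addr");
    assert_ne!(first_addr.port(), 0);
    println!("kernel assigned port {}", first_addr.port());

    // While the first listener is alive, a second bind to port 0
    // necessarily receives a different free port.
    let second = TcpListener::bind("127.0.0.1:0").expect("bind failed");
    assert_ne!(second.local_addr().unwrap().port(), first_addr.port());
}
```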
2023-03-21 00:31:39 +00:00
[[package]]
name = "redox_syscall"
version = "0.2.16"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "fb5a58c1855b4b6819d59012155603f0b22ad30cad752600aadfcb695265519a"
dependencies = [
"bitflags",
]
[[package]]
Implemented the forms reader, config, and database migrations. This chapter introduces the Actix "Extractors" for retrieving form data. I've added tests to the `./tests` folder to attempt to interact with those extractors; as of this commit, a89cbe5b, they fail because the example code isn't there. What is there is a variant of the "Hello, World!" code from the previous exercises (section 3.5), which uses the Actix extractor:

``` rust
// Actix, *not* Axum. Does not work with the current framework.
fn index(form: web::Form<FormData>) -> String {
    format!("Welcome {}!", form.username)
}
```

Translated and polished into Axum, it becomes:

``` rust
pub async fn index(payload: Option<Form<FormData>>) -> impl IntoResponse {
    let username = payload.map_or("World".to_string(), move |index| -> String {
        String::from(&(index.username))
    });
    (StatusCode::OK, format!("Hello, {}!\n", &username))
}
```

The Axum version is a little smarter, providing a default "World" if you don't specify a name. That's what `.map_or` does, although the `or` part actually comes first in the argument list. So the result is:

``` sh
$ curl http://localhost:3000/
Hello, World!
$ curl 'http://localhost:3000/?username=Spock'
Hello, Spock!
```

Which is more or less the version we want. **Section 3.7.3** then goes into some detail on the implementation of a Trait. A Trait in Rust is like an Interface in other languages; it describes a collection of functions for manipulating the values found in a defined Type. **Types**: A Type is just a description of the value: `u16` is a sixteen-bit unsigned integer; `char` is a Unicode character, and its size is always 32 bits; but a `String` is a bunch of things: an array of bytes holding UTF-8 text (where a character is not always one byte!), a length for that string, and a capacity. If the String is manipulated to exceed the capacity, the array is re-allocated with a new capacity and the old array is copied into it.
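The `.map_or` argument order mentioned above is easy to get backwards, so here is a minimal std-only sketch of the same pattern, stripped of the Axum extractor machinery:

``` rust
fn greeting(username: Option<&str>) -> String {
    // Note the argument order: the `or` (fallback) value comes first,
    // the mapping closure second.
    let who = username.map_or("World".to_string(), |u| u.to_string());
    format!("Hello, {}!", who)
}

fn main() {
    assert_eq!(greeting(None), "Hello, World!");
    assert_eq!(greeting(Some("Spock")), "Hello, Spock!");
    println!("{}", greeting(Some("Spock")));
}
```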
A `Vec<String>` is an array of Strings; as a Type, it is considered to have a single value: whatever is in it at the moment you use it. **Trait**: A Trait defines a collection of one or more functions that can operate on a value. The truly nifty thing about Traits is that they can be implemented after the fact. By importing a Trait and an implementation of that Trait specific to a type into a module containing that type, you can extend the behavior of the type in a deterministic way, without having to modify or inherit its code as you would in an object-oriented language. Axum has a valuable trait, `FromRequest`. For any structure you can imagine passing from the client to the server, you can implement `FromRequest` for that object, and any content in the body of the message will be transformed into that structure. We've seen a trait before: `IntoResponse`, written as `impl IntoResponse`, is the return type of many of the functions that produce responses for our application server. That return type instructs Rust to look in the current lexical scope and determine whether an `IntoResponse` implementation exists for the value being returned. If it does, the value can be returned, because Axum is now assured that there exists a function to convert that value into something streamable and usable as an HTTP response. Fortunately for us, Axum has already implemented `FromRequest` for all the native data types, as well as some structures and arrays. Even better, its `Form` extractor works with Serde, so anything implementing Serde's `Deserialize` can be extracted. So in this example:

``` rust
#[derive(serde::Deserialize)]
pub struct FormData {
    username: String,
}

pub async fn index(payload: Option<Form<FormData>>) -> impl IntoResponse { ...
```

A `Form` (something using the `application/x-www-form-urlencoded` protocol) of `FormData` will automatically be converted into a `payload` object of `FormData { username: "Spock" }`, in this case wrapped in a `Some()` handler.
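The "implemented after the fact" property is easy to show in miniature. The trait and method below are invented for illustration: a trait defined locally can be implemented for a type we don't own, such as `str`, as long as either the trait or the type is local to the crate (Rust's orphan rule):

``` rust
// A locally defined trait...
trait Excited {
    fn excited(&self) -> String;
}

// ...implemented for the standard library's `str`, extending a
// foreign type without modifying or inheriting anything.
impl Excited for str {
    fn excited(&self) -> String {
        format!("{}!", self)
    }
}

fn main() {
    // The method is available wherever the trait is in scope.
    assert_eq!("hello".excited(), "hello!");
    println!("{}", "it works".excited());
}
```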
(Or `None`, if there was no form included.) <aside>So far, there's not too much bloat in this product; with all the debugging symbols, it's 60MB or so, but stripped to the bone it's only 3.1MB, tolerable for modern deployments.</aside> First, though, we must adjust our `valid_subscription` test:

``` rust
let body = "name=le%20guin&email=ursula_le_guin%40gmail.com";
let response = hyper::Client::new()
    .request(
        Request::builder()
            .method("POST")
            .header("content-type", "application/x-www-form-urlencoded")
            .uri(format!("http://{}/subscriptions", addr))
            .body(Body::from(body))
            .unwrap(),
    )
    .await
    .expect("Failed to execute request.");
```

Two updates from the book: first, we're sending it via POST instead of GET. This is the correct way to do things; a GET should never (and I mean *never*) cause a change of state on the back-end. To send something new that the server will process and store, you use a POST. (To update something, or to send something to a known *and unique* URI, PUT is better.) Secondly, since we're using a generic form-data object, we need to set the content-type on the client so that the server knows how to unpack this payload. The `%20` and `%40` markers in the `body` are the URL-encoded space and `@`, respectively. I completely ignored the advice in the book and went instead with [Dbmate](https://github.com/amacneil/dbmate). Dbmate is a bit cranky: your SQL must be nestled right up against the 'up' and 'down' markers in the migration file, and it seems to be quite opinionated about everything being lowercase.
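To make the `%20`/`%40` encoding concrete, here is a toy percent-decoder, a std-only sketch for illustration (real servers use the framework's decoder; this one assumes ASCII input and ignores the `+`-as-space convention):

``` rust
// Decode %XX escapes in a form-encoded string. Illustrative only.
fn percent_decode(s: &str) -> String {
    let bytes = s.as_bytes();
    let mut out = Vec::new();
    let mut i = 0;
    while i < bytes.len() {
        // A '%' followed by two hex digits encodes one byte.
        if bytes[i] == b'%' && i + 2 < bytes.len() {
            if let Ok(b) = u8::from_str_radix(&s[i + 1..i + 3], 16) {
                out.push(b);
                i += 3;
                continue;
            }
        }
        out.push(bytes[i]);
        i += 1;
    }
    String::from_utf8_lossy(&out).into_owned()
}

fn main() {
    assert_eq!(percent_decode("name=le%20guin"), "name=le guin");
    assert_eq!(
        percent_decode("ursula_le_guin%40gmail.com"),
        "ursula_le_guin@gmail.com"
    );
    println!("{}", percent_decode("He%27s%20dead%2C%20Jim"));
}
```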
name = "redox_users"
version = "0.4.3"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "b033d837a7cf162d7993aded9304e30a83213c648b6e389db233191f891e5c2b"
dependencies = [
"getrandom",
"redox_syscall",
"thiserror",
]
[[package]]
name = "ron"
version = "0.7.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "88073939a61e5b7680558e6be56b419e208420c2adb92be54921fa6b72283f1a"
dependencies = [
"base64",
"bitflags",
"serde",
]
[[package]]
name = "rust-ini"
version = "0.18.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "f6d5f2436026b4f6e79dc829837d467cc7e9a55ee40e750d716713540715a2df"
dependencies = [
"cfg-if",
"ordered-multimap",
]
[[package]]
name = "rustix"
version = "0.36.11"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "db4165c9963ab29e422d6c26fbc1d37f15bace6b2810221f9d925023480fcf0e"
dependencies = [
"bitflags",
"errno",
"io-lifetimes",
"libc",
"linux-raw-sys",
"windows-sys 0.45.0",
]
[[package]]
name = "rustversion"
version = "1.0.12"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "4f3208ce4d8448b3f3e7d168a73f5e0c43a61e32930de3bceeccedb388b6bf06"
[[package]]
name = "ryu"
version = "1.0.13"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "f91339c0467de62360649f8d3e185ca8de4224ff281f66000de5eb2a77a79041"
[[package]]
name = "schannel"
version = "0.1.21"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "713cfb06c7059f3588fb8044c0fad1d09e3c01d225e25b9220dbfdcf16dbb1b3"
dependencies = [
"windows-sys 0.42.0",
]
[[package]]
name = "scopeguard"
version = "1.1.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "d29ab0c6d3fc0ee92fe66e2d99f700eab17a8d57d1c1d3b748380fb20baa78cd"
[[package]]
name = "scratch"
version = "1.0.5"
source = "registry+https://github.com/rust-lang/crates.io-index"
Implemented the forms reader, config, and database migrations. This chapter introduces the Actix "Extractors" for retrieving form data. I've added tests to the `./tests` folder to attempt to interact with those extractors; as of this commit, a89cbe5b, they fail because the example code isn't there. What is there is a variant of the "Hello, World!" code from the previous exercises (section 3.5), which uses the Actix extractor: ``` rust // Actix, *not* Axum. Does not work with the current framework. fn index(form: web::Form<FormData>) -> String { format!('Welcome {}!', form.username) } ``` Translated and polished into Axum, it translates to: ``` rust pub async fn index(payload: Option<Form<FormData>>) -> impl IntoResponse { let username = payload.map_or("World".to_string(), move |index| -> String { String::from(&(index.username)) }); (StatusCode::OK, format!("Hello, {}!\n", &username)) } ``` The Axum version is a little smarter, providing a default "World!" if you don't specify a name. That's what `.map_or` does, although the `or` part actually comes first in the function. So the result is: ``` sh $ curl http://localhost:3000/ Hello, World! $ curl 'http://localhost:3000/?username=Spock' Hello, Spock! ``` Which is more or less the version we want. **Section 3.7.3** then goes into some detail on the implementation of a Trait. A Trait in Rust is like an Interface in other languages; it describes a collection of functions for manipulating the values found in a defined Type. **Types**: A Type is just a description of the value: `u16` is a sixteen-bit unsigned integer; `char` is a Unicode character, and it's size is always 32bits, but a `String` is a bunch of things: it's an array of characters (which are not always `char`!), a length for that string and a capacity. If the String is manipulated to exceed the capacity, the array is re-allocated to a new capacity and the old array copied into it. 
A `Vec<String>` is an array of Strings; as a Type, it is considered to have a single value: whatever is in it at the moment you use it. **Trait**: A Trait defines a collection of one or more functions that can operate on a value. The truly nifty thing about Traits is that they can be implemented after the fact. By importing a Trait and an implementation of that trait specific to a type into a module containing that type, you can extend the behavior of a type in a deterministic way without having to modify or inherit the code, as you would in an object-oriented language. Axum has a valuable trait, `FromRequest`. For any structure you can imagine passing from the client to the server, you can implement `FromRequest` for that object and any content in the body of the message will be transformed into that structure. We've seen a trait before: `IntoResponse`, written as `impl IntoResponse`, and is the output (the return type) of many of the functions that produce return values for our application server. In this case the return type instructs Rust to look in the current lexical scope and, for the value returned by that function, determine if an `IntoResponse` trait has been defined for it. If it has, the value will be returned because Axum has now been assured that there exists a function to convert that value into something streamable and usable as an HTTP response. Fortunately for us, Axum has already implemented `FromRequest` for all the native data types, as well as some structures and arrays. Even better, it has implemented `FromRequest` for the Serde serialization/deserialization library. So in this example: ``` rust pub struct FormData { username: String, } pub async fn index(payload: Option<Form<FormData>>) -> impl IntoResponse { ... ``` A `Form` (something using the `application/x-www-form-urlencoded` protocol) of `FormData` will automatically be converted into a `payload` object of `{ username: "Spock" )`, and in this case wrapped in a `Some()` handler. 
(Or `None`, if there was no form included.)

<aside>So far, there's not too much bloat in this product; with all the debugging symbols, it's 60MB or so, but stripped to the bone it's only 3.1MB, tolerable for modern deployments.</aside>

First, though, we must adjust our `valid_subscription` test:

``` rust
let body = "name=le%20guin&email=ursula_le_guin%40gmail.com";
let response = hyper::Client::new()
    .request(
        Request::builder()
            .method("POST")
            .header("content-type", "application/x-www-form-urlencoded")
            .uri(format!("http://{}/subscriptions", addr))
            .body(Body::from(body))
            .unwrap(),
    )
    .await
    .expect("Failed to execute request.");
```

Two updates from the book: first, we're sending it via POST instead of GET. This is the correct way to do things; a GET should never (and I mean *never*) cause a change of state on the back-end. To send something new that the server will process and store, you use a POST. (To update something, or to send something to a known *and unique* URI, PUT is better.) Secondly, since we're using a generic form-data object, we need to set the content-type header on the client so that the server knows how to unpack this payload. The '%20' and '%40' markers in the `body` are the URL-encoded space and `@` characters, respectively.

I completely ignored the advice in the book and went instead with [Dbmate](https://github.com/amacneil/dbmate). Dbmate is a bit cranky: your SQL must sit flush against the 'up' and 'down' markers in the migration file, and it seems to be quite opinionated about everything being lowercase.
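To demystify those `%XX` markers, here's a minimal percent-decoder for ASCII form bodies. This is only a sketch of the idea; real servers use a proper URL-decoding crate, and this version skips details like the `+`-as-space rule and multi-byte UTF-8:

``` rust
// Decode %XX escapes in an ASCII form-encoded string.
fn percent_decode(input: &str) -> String {
    let bytes = input.as_bytes();
    let mut out = String::new();
    let mut i = 0;
    while i < bytes.len() {
        if bytes[i] == b'%' && i + 2 < bytes.len() {
            // Try to read the two hex digits after '%'.
            if let Ok(byte) = u8::from_str_radix(&input[i + 1..i + 3], 16) {
                out.push(byte as char);
                i += 3;
                continue;
            }
        }
        out.push(bytes[i] as char);
        i += 1;
    }
    out
}

fn main() {
    assert_eq!(
        percent_decode("name=le%20guin&email=ursula_le_guin%40gmail.com"),
        "name=le guin&email=ursula_le_guin@gmail.com"
    );
}
```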
That said, it was trivial to create a database with it:

``` sh
$ dbmate new create_subscriptions_table
```

This will create the file `db/migrations/20230322174957_create_subscriptions_table.sql` (the timestamp will be different, obviously), and in this file you put the following, as specified in the book:

``` sql
-- migrate:up
create table subscriptions (
    id uuid not null,
    primary key (id),
    email text not null unique,
    name text not null,
    subscribed_at timestamptz not null
);

-- migrate:down
drop table subscriptions;
```

To use Dbmate, you have to specify how to connect. I'm using Postgres, so let's start with creating a database and a user for it:

``` sh
$ sudo -u postgres psql
[sudo] password for user: ...................
postgres=# create database newsletter;
CREATE DATABASE
postgres=# create user newsletter with encrypted password 'redacted';
CREATE USER
postgres=# grant all privileges on database newsletter to newsletter;
GRANT
postgres=# exit
```

In your project root, create a `.env` file to specify your connection:

``` sh
DATABASE_URL="postgres://newsletter:redacted@127.0.0.1:5432/newsletter?sslmode=disable"
```

The `sslmode` flag there is necessary for localhost connections: Dbmate assumes an encrypted connection by default, but we're isolating to a local connection that is, usually, safe. With the new entry in your `.env` file, you can now run a migration:

``` sh
$ dbmate up
Writing: ./db/schema.sql
```

Running `dbmate up` will automatically create the database for you if it hasn't already; `dbmate migrate` also performs migrations, but it will not create the database.
Now you can re-connect to Postgres as the newsletter user and see what you've got:

``` sh
$ psql --user newsletter -h localhost --password
Password:
psql (14.7 (Ubuntu 14.7-0ubuntu0.22.04.1), server 11.7 (Ubuntu 11.7-0ubuntu0.19.10.1))

newsletter=> \d
              List of relations
 Schema |       Name        | Type  |   Owner
--------+-------------------+-------+------------
 public | schema_migrations | table | newsletter
 public | subscriptions     | table | newsletter
(2 rows)

newsletter=> \d subscriptions
                     Table "public.subscriptions"
    Column     |           Type           | Collation | Nullable | Default
---------------+--------------------------+-----------+----------+---------
 id            | uuid                     |           | not null |
 email         | text                     |           | not null |
 name          | text                     |           | not null |
 subscribed_at | timestamp with time zone |           | not null |
Indexes:
    "subscriptions_pkey" PRIMARY KEY, btree (id)
    "subscriptions_email_key" UNIQUE CONSTRAINT, btree (email)
```

Note that Dbmate has allocated a table to itself, `schema_migrations`, for tracking what it's done to your system and when. Try not to conflict with it, okay?

Every complex app has a configuration, and there are plenty of different ways the configuration can be specified: environment variables, internal defaults, and configuration files, the last of which come in many different flavors. Rust has a well-known [config](https://docs.rs/config/latest/config/index.html) crate that supports all the most common configuration formats: YAML, JSON, TOML; you can even add your own by writing something that implements the `config::Format` trait. Add it to Cargo.toml:

``` sh
$ cargo add config
```

For now, we're just going to create a new file, called `configuration.rs`, and put our configuration details in there. Right now we have a single configuration detail: the port. I'm going to go above and beyond Luca here and configure some internal defaults for my code. It will have expectations.
First, you have to tell Serde that the struct is deserializable, and that missing fields should fall back to default values:

``` rust
use config::Config;

#[derive(serde::Deserialize)]
#[serde(default)]
pub struct Settings {
    pub port: u16,
}
```

Then, you have to set those default values. Fortunately, Rust provides a "set default values" trait named, sensibly enough, `Default`:

``` rust
impl Default for Settings {
    fn default() -> Self {
        Settings { port: 3001 }
    }
}
```

Again, exceeding the book's parameters, I'm going to say that if the file is missing, the default parameters should hold:

``` rust
pub(crate) fn get_configuration() -> Result<Settings, config::ConfigError> {
    Config::builder()
        .add_source(config::File::with_name("./ztd.config").required(false))
        .build()?
        .try_deserialize()
}
```

And since this is the first time I'm doing this, I'm going to write a test to assert that my understanding of how this all works is correct:

``` rust
#[cfg(test)]
mod tests {
    use super::*;

    #[test]
    fn test_for_defaults() {
        let maybe_config = get_configuration();
        assert!(maybe_config.is_ok());
        let config = maybe_config.unwrap();
        assert_eq!(config.port, 3001);
    }
}
```
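The `Default` trait pays off as soon as the struct grows, because struct update syntax lets you override one field and inherit the rest. A std-only sketch; the `host` field here is hypothetical, added purely for illustration:

``` rust
struct Settings {
    port: u16,
    host: String,
}

impl Default for Settings {
    fn default() -> Self {
        Settings {
            port: 3001,
            host: "127.0.0.1".to_string(),
        }
    }
}

fn main() {
    // Override just the port; every other field falls back to Default.
    let s = Settings {
        port: 8080,
        ..Default::default()
    };
    assert_eq!(s.port, 8080);
    assert_eq!(s.host, "127.0.0.1");
    assert_eq!(Settings::default().port, 3001);
}
```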
2023-03-24 14:51:19 +00:00
This is straightforward and familiar territory:

``` rust
fn app() -> Router {
    Router::new()
        .route("/", get(anon_greet))
        .route("/:name", get(greet))
        .route("/health_check", get(health_check))
}
```

We then define a function to *run* the core server:

``` rust
pub async fn run() {
    let addr = SocketAddr::from(([127, 0, 0, 1], 3000));
    tracing::info!("listening on {}", addr);
    axum::Server::bind(&addr)
        .serve(app().into_make_service())
        .await
        .unwrap()
}
```

And finally, in a file named `src/main.rs`, we instantiate the server:

``` rust
use ztp::run;

#[tokio::main]
async fn main() {
    run().await
}
```

To make this work, we need to define what `ztp` means, and make a distinction between the library and the CLI program. In the project root's `Cargo.toml` file, the first three sections are needed to define these relationships:

``` toml
[package]
name = "ztp"
version = "0.1.0"
edition = "2021"

[lib]
path = "src/lib.rs"

[[bin]]
path = "src/main.rs"
name = "ztp"
```

It is the `name` field of the `[package]` section that defines how the `use` statement in `main.rs` will find the library. The `[[bin]]` clause defines the name of the binary when it is generated.

<aside>The double brackets around the `[[bin]]` clause tell the TOML parser that there can be more than one binary. There can be only one library crate per package, but a package can contain several binary crates, and a workspace can hold more than one package.</aside>

This project should now be runnable. In one window, type:

``` sh
$ cargo run
```

And in another, type and see the replies:

``` sh
$ curl http://localhost:3000/
Hello, World!
$ curl http://localhost:3000/Jim
He's dead, Jim!
$ curl -v http://localhost:3000/health_check
> GET /health_check HTTP/1.1
> Host: localhost:3000
> User-Agent: curl/7.81.0
> Accept: */*
< HTTP/1.1 200 OK
< content-length: 0
< date: Tue, 21 Mar 2023 00:16:43 GMT
```

In the last command, the *verbose* flag shows us what we sent to the server, and what came back.
We expected a "200 OK" status and a zero-length body, and that's what we got.

To unit-test a web server, we must spawn a copy of it to exercise its functions. We'll use Tokio's `spawn` function to create a new server, use hyper to request data from the server, and finally Rust's own native test asserts to check that we got what we expected.

``` rust
mod tests {
    use super::*;
    use axum::{
        body::Body,
        http::{Request, StatusCode},
    };
    use std::net::{SocketAddr, TcpListener};

    #[tokio::test]
    async fn the_real_deal() {
        let listener = TcpListener::bind("127.0.0.1:0".parse::<SocketAddr>().unwrap()).unwrap();
        let addr = listener.local_addr().unwrap();

        tokio::spawn(async move {
            axum::Server::from_tcp(listener)
                .unwrap()
                .serve(app().into_make_service())
                .await
                .unwrap();
        });

        let response = hyper::Client::new()
            .request(
                Request::builder()
                    .uri(format!("http://{}/", addr))
                    .body(Body::empty())
                    .unwrap(),
            )
            .await
            .unwrap();

        let body = hyper::body::to_bytes(response.into_body()).await.unwrap();
        assert_eq!(&body[..], b"Hello World!\n");
    }
}
```

One interesting trick to observe in this test is the port number specified in the `TcpListener` call: it's zero. When the port is zero, the `TcpListener` asks the kernel for the first free port. Normally, you'd want to know exactly what port to call the server on, but here both ends of the communication learn the port at runtime, and we ensure the port isn't hard-coded and inconveniently already in use by someone else.
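The port-zero trick is plain std, independent of Axum; you can see the kernel's choice by reading back `local_addr` (a minimal sketch):

``` rust
use std::net::TcpListener;

fn main() {
    // Binding to port 0 asks the kernel for any free port...
    let listener = TcpListener::bind("127.0.0.1:0").unwrap();
    // ...and local_addr() reports which port it actually handed us.
    let port = listener.local_addr().unwrap().port();
    assert_ne!(port, 0);
    println!("kernel assigned port {}", port);
}
```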
2023-03-21 00:31:39 +00:00
[[package]]
Implemented the forms reader, config, and database migrations. This chapter introduces the Actix "Extractors" for retrieving form data. I've added tests to the `./tests` folder to attempt to interact with those extractors; as of this commit, a89cbe5b, they fail because the example code isn't there. What is there is a variant of the "Hello, World!" code from the previous exercises (section 3.5), which uses the Actix extractor: ``` rust // Actix, *not* Axum. Does not work with the current framework. fn index(form: web::Form<FormData>) -> String { format!('Welcome {}!', form.username) } ``` Translated and polished into Axum, it translates to: ``` rust pub async fn index(payload: Option<Form<FormData>>) -> impl IntoResponse { let username = payload.map_or("World".to_string(), move |index| -> String { String::from(&(index.username)) }); (StatusCode::OK, format!("Hello, {}!\n", &username)) } ``` The Axum version is a little smarter, providing a default "World!" if you don't specify a name. That's what `.map_or` does, although the `or` part actually comes first in the function. So the result is: ``` sh $ curl http://localhost:3000/ Hello, World! $ curl 'http://localhost:3000/?username=Spock' Hello, Spock! ``` Which is more or less the version we want. **Section 3.7.3** then goes into some detail on the implementation of a Trait. A Trait in Rust is like an Interface in other languages; it describes a collection of functions for manipulating the values found in a defined Type. **Types**: A Type is just a description of the value: `u16` is a sixteen-bit unsigned integer; `char` is a Unicode character, and it's size is always 32bits, but a `String` is a bunch of things: it's an array of characters (which are not always `char`!), a length for that string and a capacity. If the String is manipulated to exceed the capacity, the array is re-allocated to a new capacity and the old array copied into it. 
A `Vec<String>` is an array of Strings; as a Type, it is considered to have a single value: whatever is in it at the moment you use it. **Trait**: A Trait defines a collection of one or more functions that can operate on a value. The truly nifty thing about Traits is that they can be implemented after the fact. By importing a Trait and an implementation of that trait specific to a type into a module containing that type, you can extend the behavior of a type in a deterministic way without having to modify or inherit the code, as you would in an object-oriented language. Axum has a valuable trait, `FromRequest`. For any structure you can imagine passing from the client to the server, you can implement `FromRequest` for that object and any content in the body of the message will be transformed into that structure. We've seen a trait before: `IntoResponse`, written as `impl IntoResponse`, and is the output (the return type) of many of the functions that produce return values for our application server. In this case the return type instructs Rust to look in the current lexical scope and, for the value returned by that function, determine if an `IntoResponse` trait has been defined for it. If it has, the value will be returned because Axum has now been assured that there exists a function to convert that value into something streamable and usable as an HTTP response. Fortunately for us, Axum has already implemented `FromRequest` for all the native data types, as well as some structures and arrays. Even better, it has implemented `FromRequest` for the Serde serialization/deserialization library. So in this example: ``` rust pub struct FormData { username: String, } pub async fn index(payload: Option<Form<FormData>>) -> impl IntoResponse { ... ``` A `Form` (something using the `application/x-www-form-urlencoded` protocol) of `FormData` will automatically be converted into a `payload` object of `{ username: "Spock" )`, and in this case wrapped in a `Some()` handler. 
(Or `None`, if there was no form included.) <aside>So far, there's not too much bloat in this product; with all the debugging symbols, it's 60MB or so, but stripped to the bone it's only 3.1MB, tolerable for modern deployments.</aside> First, though, we must adjust our `valid_subscription` test: ``` rust let body = "name=le%20guin&email=ursula_le_guin%40gmail.com"; let response = hyper::Client::new() .request( Request::builder() .method("POST") .header("content-type", "application/x-www-form-urlencoded") .uri(format!("http://{}/subscriptions", addr)) .body(Body::from(body)) .unwrap(), ) .await .expect("Failed to execute request."); ``` Two updates from the book: first, we're sending it via POST instead of GET. This is the correct way to do things; a GET should never (and I mean *never*) cause a change of state on the back-end. To send something new that the server will process and store, you use a POST. (To update something, or to send something to a known *and unique* URI, PUT is better.) Secondly, since we're using a generic form-data object, we need to set the content-type on the client so that the server is informed of how to unpack this payload. The '%20' and '%40' markers in the `body` are the space and the `@` respectively. I completely ignored the advice in the book and went instead with [Dbmate](https://github.com/amacneil/dbmate); Dbmate is a bit cranky; your SQL must be very much nestled against the 'up' and 'down' markers in the migration file, and it seems to be quite opinionated about everything being lowercase. 
That said, it was trivial to create a database with it: ``` sh $ dbmate new create_subscriptions_table ``` This will create the folder `db/migrations/20230322174957_create_subscriptions_table.sql`, (The timestamp will be different, obviously), and in this file you put the following, as specified in the book: ``` sql -- migrate:up create table subscriptions ( id uuid not null, primary key (id), email text not null unique, name text not null, subscribed_at timestamptz not null ); -- migrate:down drop table subscriptions; ``` To use Dbmate, you have to specify how to connect. I'm using Postgres, so let's start with creating a database and a user for it: ``` sh $ sudo -u postgres psql [sudo] possword for user: ................... postgres=# create database newsletter; CREATE DATABASE postgres=# create user newletter with encrypted password 'redacted'; CREATE USER postgres=# grant all privileges on database newsletter to newsletter; GRANT postgres=# exit ``` In your project root, create a `.env` file to specify your connection: ``` sh DATABASE_URL="postgres://newsletter:redacted@127.0.0.1:5432/newsletter?sslmode=disable" ``` The `sslmode` flag there is necessary for localhost connections, as Dbmate assumes an encrypted connection by default, but we're isolating to a local connection that is, usually, safe. With the new entry in your `.env` file, you can now run a migration: ``` sh $ dbmate up Writing: ./db/schema.sql ``` Running `dbmate up` will automatically create the database for you if it hasn't already; `dbmate migrate` also performs migrations, but it will not create the database. 
Now you can re-connect to Postgres as the newsletter user and see what you've got: ``` sh $ psql --user newsletter -h localhost --password Password: psql (14.7 (Ubuntu 14.7-0ubuntu0.22.04.1), server 11.7 (Ubuntu 11.7-0ubuntu0.19.10.1)) newsletter=> \d List of relations Schema | Name | Type | Owner --------+-------------------+-------+------------ public | schema_migrations | table | newsletter public | subscriptions | table | newsletter (2 rows) newsletter=> \d subscriptions Table "public.subscriptions" Column | Type | Collation | Nullable | Default ---------------+--------------------------+-----------+----------+--------- id | uuid | | not null | email | text | | not null | name | text | | not null | subscribed_at | timestamp with time zone | | not null | Indexes: "subscriptions_pkey" PRIMARY KEY, btree (id) "subscriptions_email_key" UNIQUE CONSTRAINT, btree (email) ``` Note that Dbmate has allocated a table to itself, `schema_migrations`, for tracking what it's done to your system and when. Try not to conflict with it, okay? Every complex app has a configuration, and there are plenty of different ways the configuration can be specified. Environment variables, internal defaults, and configuration files-- the last of which comes in so many different flavors. Rust has a well-known [config](https://docs.rs/config/latest/config/index.html) crate that supports all the most common configurations: YAML, JSON, TOML; you can even add your own by writing something that implements the `config::Format` trait. Add it to Cargo.toml: ``` sh $ cargo add config ``` For the meantime, we're just going to create a new file, called `configuration.rs`, and put our configuration details in there. Right now we have a single configuration detail: the port. I'm going to go above and beyond Lucas here and configure some internal defaults for my code. It will have expectations. 
First, you have to tell Serde that there will be default values: ``` rust use config::Config; pub struct Settings { pub port: u16, } ``` Then, you have to set those default values. Fortunately, Rust provides a "set default values" trait named, sensibly enough, Default: ``` rust impl Default for Settings { fn default() -> Self { Settings { port: 3001 } } } ``` Again, exceeding the book's parameters, I'm going to say that if the file is missing the default parameters should hold: ``` rust pub(crate) fn get_configuration() -> Result<Settings, config::ConfigError> { Config::builder() .add_source(config::File::with_name("./ztd.config").required(false)) .build()? .try_deserialize() } ``` And since this is the first time I'm doing this, I'm going to write a test to assert that my understanding of how this all works is correct: ``` rust mod tests { use super::*; #[test] fn test_for_defaults() { let maybe_config = get_configuration(); assert!(!maybe_config.is_err()); let config = maybe_config.unwrap(); assert_eq!(config.port, 3001); } } ```
2023-03-24 14:51:19 +00:00
name = "security-framework"
version = "2.8.2"
Zero-to-Production Rust, up to Chapter 3.7. Since this book is about learning Rust, primarily in a microservices environment, this chapter focuses on installing Rust and describing the tools available to the developer. The easiest way to install Rust is to install the [Rustup](https://rustup.rs/) tool. It is one of those blind-trust-in-the-safety-of-the-toolchain things. For Linux and Mac users, the command is a shell script that installs to a user's local account: ``` $ curl --proto '=https' --tlsv1.2 -sSf https://sh.rustup.rs | sh ``` Once installed, you can install Rust itself: ``` $ rustup install toolchain stable ``` You should now have Rust compiler and the Rust build and packaging tool, known as Cargo: ``` $ rustc --version rustc 1.68.0 (2c8cc3432 2023-03-06) $ cargo --version cargo 1.68.0 (115f34552 2023-02-26) ``` I also installed the following tools: ``` $ rustup component add clippy rust-src rust-docs $ cargo install rustfmt rust-analyzer ``` - clippy: A powerful linter that provides useful advice above and beyond the compiler's basic error checking. - rustfmt: A formatting tool that provides a common format for most developers - rust-analyzer: For your IDE, rust-analyzer provides the LSP (Language Server Protocol) for Rust, giving you code completion, on-the-fly error definition, and other luxuries. Zero-to-Production's project is writing a web service that signs people up for an email newsletter. The first task in the book is to set up a "Hello World!" application server. The book uses the [Actix-web](https://actix.rs/) web framework, but I've chosen to implement it using [Axum](https://github.com/tokio-rs/axum) server, the default server provided by the [Tokio](https://github.com/tokio-rs/tokio) asynchronous runtime. Although the book is only two years old, it is already out-of-date with respect to some commands. `cargo add` is now provided by default. 
The following commands installed the tools I'll be using: ``` cargo add --features tokio/full --features hyper/full tokio hyper \ axum tower tracing tracing-subscriber ``` - axum: The web server framework for Tokio. - tokio: The Rust asynchronous runtime. Has single-threaded (select) and multi-threaded variants. - [hyper](https://hyper.rs/): An HTTPS request/response library, used for testing. - [tracing](https://crates.io/crates/tracing): A debugging library that works with Tokio. We start by defining the core services. In the book, they're a greeter ("Hello, World"), a greeter with a parameter ("Hello, {name}"), and a health check (returns a HTTP 200 Code, but no body). Actix-web hands a generic Request and expects a generic request, but Axum is more straightforward, providing `IntoResponse` handlers for most of the basic Rust types, as well as some for formats via Serde, Rust's standard serializing/deserializing library for converting data from one format to another. All of these go into `src/lib.rs`: ``` async fn health_check() -> impl IntoResponse { (StatusCode::OK, ()) } async fn anon_greet() -> &'static str { "Hello World!\n" } async fn greet(Path(name): Path<String>) -> impl IntoResponse { let greeting = String::from("He's dead, ") + name.as_str(); let greeting = greeting + &String::from("!\n"); (StatusCode::OK, greeting) } ``` <aside>Axum's documentation says to [avoid using `impl IntoResponse`](https://docs.rs/axum/latest/axum/response/index.html#regarding-impl-intoresponse) until you understand how it really works, as it can result in confusing issues when chaining response handlers, when a handler can return multiple types, or when a handler can return either a type or a [`Result<T, E>`](https://doc.rust-lang.org/std/result/), especially one with an error.</aside> We then define the routes that our server will recognize. 
This is straightforward and familiar territory: ``` fn app() -> Router { Router::new() .route("/", get(anon_greet)) .route("/:name", get(greet)) .route("/health_check", get(health_check)) } ``` We then define a function to *run* the core server: ``` pub async fn run() { let addr = SocketAddr::from(([127, 0, 0, 1], 3000)); tracing::info!("listening on {}", addr); axum::Server::bind(&addr) .serve(app().into_make_service()) .await .unwrap() } ``` And finally, in a file named `src/main.rs`, we instantiate the server: ``` use ztp::run; async fn main() { run().await } ``` To make this "work," we need to define what `ztp` means, and make a distinction between the library and the CLI program. In the project root's `Cargo.toml` file, the first three sections are needed to define these relationships: ``` [package] name = "ztp" version = "0.1.0" edition = "2021" [lib] path = "src/lib.rs" [[bin]] path = "src/main.rs" name = "ztp" ``` It is the `[package.name]` feature that defines how the `use` statement in `main.rs` will find the library. The `[[bin]]` clause defines the name of the binary when it is generated. <aside>The double brackets around the `[[bin]]` clauses is there to emphasize to the TOML parser that there can be more than one binary. There can be only one library per package, but it is possible for a Rust project to have more than one package, called "crates," per project. </aside> This project should now be runnable. In one window, type: ``` $ cargo run ``` And in another, type and see the replies: ``` $ curl http://localhost:3000/ Hello, World! $ curl http://localhost:3000/Jim He's dead, Jim! $ curl -v http://localhost:3000/health_check > GET /health_check HTTP/1.1 > Host: localhost:3000 > User-Agent: curl/7.81.0 > Accept: */* < HTTP/1.1 200 OK < content-length: 0 < date: Tue, 21 Mar 2023 00:16:43 GMT ``` In the last command, the *verbose* flag shows us what we sent to the server, and what came back. 
We expected a "200 OK" flag and a zero-length body, and that's what we got. In order to unit-test a web server, we must spawn a copy of it in order to exercise its functions. We'll use Tokio's `spawn` function to create a new server, use hyper to request data from the server, and finally Rust's own native test asserts to check that we got what we expected. ``` mod tests { use super::*; use axum::{ body::Body, http::{Request, StatusCode}, }; use std::net::{SocketAddr, TcpListener}; #[tokio::test] async fn the_real_deal() { let listener = TcpListener::bind("127.0.0.1:0".parse::<SocketAddr>() .unwrap()).unwrap(); let addr = listener.local_addr().unwrap(); tokio::spawn(async move { axum::Server::from_tcp(listener) .unwrap()serve(app().into_make_service()).await.unwrap(); }); let response = hyper::Client::new() .request( Request::builder().uri(format!("http://{}/", addr)) .body(Body::empty()).unwrap(), ) .await .unwrap(); let body = hyper::body::to_bytes(response.into_body()).await.unwrap(); assert_eq!(&body[..], b"Hello World!\n"); } } ``` One interesting trick to observe in this testing is the port number specified in the `TcpListener` call. It's zero. When the port is zero, the `TcpListener` will request from the kernel the first-free-port. Normally, you'd want to know exactly what port to call the server on, but in this case both ends of the communication are aware of the port to use and we want to ensure that port isn't hard-coded and inconveniently already in-use by someone else.
2023-03-21 00:31:39 +00:00
source = "registry+https://github.com/rust-lang/crates.io-index"
Implemented the forms reader, config, and database migrations. This chapter introduces the Actix "Extractors" for retrieving form data. I've added tests to the `./tests` folder to attempt to interact with those extractors; as of this commit, a89cbe5b, they fail because the example code isn't there. What is there is a variant of the "Hello, World!" code from the previous exercises (section 3.5), which uses the Actix extractor: ``` rust // Actix, *not* Axum. Does not work with the current framework. fn index(form: web::Form<FormData>) -> String { format!('Welcome {}!', form.username) } ``` Translated and polished into Axum, it translates to: ``` rust pub async fn index(payload: Option<Form<FormData>>) -> impl IntoResponse { let username = payload.map_or("World".to_string(), move |index| -> String { String::from(&(index.username)) }); (StatusCode::OK, format!("Hello, {}!\n", &username)) } ``` The Axum version is a little smarter, providing a default "World!" if you don't specify a name. That's what `.map_or` does, although the `or` part actually comes first in the function. So the result is: ``` sh $ curl http://localhost:3000/ Hello, World! $ curl 'http://localhost:3000/?username=Spock' Hello, Spock! ``` Which is more or less the version we want. **Section 3.7.3** then goes into some detail on the implementation of a Trait. A Trait in Rust is like an Interface in other languages; it describes a collection of functions for manipulating the values found in a defined Type. **Types**: A Type is just a description of the value: `u16` is a sixteen-bit unsigned integer; `char` is a Unicode character, and it's size is always 32bits, but a `String` is a bunch of things: it's an array of characters (which are not always `char`!), a length for that string and a capacity. If the String is manipulated to exceed the capacity, the array is re-allocated to a new capacity and the old array copied into it. 
A `Vec<String>` is an array of Strings; as a Type, it is considered to have a single value: whatever is in it at the moment you use it.

**Trait**: A Trait defines a collection of one or more functions that can operate on a value. The truly nifty thing about Traits is that they can be implemented after the fact. By importing a Trait and an implementation of that trait specific to a type into a module containing that type, you can extend the behavior of a type in a deterministic way, without having to modify or inherit the code as you would in an object-oriented language.

Axum has a valuable trait, `FromRequest`. For any structure you can imagine passing from the client to the server, you can implement `FromRequest` for that structure, and any content in the body of the message will be transformed into it.

We've seen a trait before: `IntoResponse`, written as `impl IntoResponse`, is the return type of many of the functions that produce responses for our application server. That return type instructs Rust to look in the current lexical scope and determine whether an `IntoResponse` implementation has been defined for the value being returned. If it has, the value can be returned, because Axum is now assured that there exists a function to convert that value into something streamable and usable as an HTTP response.

Fortunately for us, Axum has already implemented `FromRequest` for all the native data types, as well as for some structures and arrays. Even better, it has implemented `FromRequest` for types that implement Serde's `Deserialize`. So in this example:

```rust
#[derive(serde::Deserialize)]
pub struct FormData {
    username: String,
}

pub async fn index(payload: Option<Form<FormData>>) -> impl IntoResponse {
    ...
```

A `Form` (a body using the `application/x-www-form-urlencoded` encoding) matching `FormData` will automatically be converted into a `payload` object of `{ username: "Spock" }`, in this case wrapped in `Some()`.
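Since the payload arrives wrapped in an `Option`, the handler's `.map_or` is doing all the defaulting work. A minimal sketch of that logic in plain Rust, with a `String` standing in for the extracted form (the `greet` name is invented here):

```rust
// Stand-in for the handler: in the real code, Axum fills in (or
// omits) an Option<Form<FormData>>; here a plain Option<String>
// shows the same Some/None branching.
fn greet(payload: Option<String>) -> String {
    // map_or: the default value comes FIRST, the mapping closure second.
    let username = payload.map_or("World".to_string(), |name| name);
    format!("Hello, {}!\n", username)
}

fn main() {
    assert_eq!(greet(Some("Spock".to_string())), "Hello, Spock!\n");
    assert_eq!(greet(None), "Hello, World!\n");
    print!("{}", greet(None));
}
```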
(Or `None`, if there was no form included.)

<aside>So far, there's not too much bloat in this product; with all the debugging symbols it's 60MB or so, but stripped to the bone it's only 3.1MB, tolerable for modern deployments.</aside>

First, though, we must adjust our `valid_subscription` test:

```rust
let body = "name=le%20guin&email=ursula_le_guin%40gmail.com";
let response = hyper::Client::new()
    .request(
        Request::builder()
            .method("POST")
            .header("content-type", "application/x-www-form-urlencoded")
            .uri(format!("http://{}/subscriptions", addr))
            .body(Body::from(body))
            .unwrap(),
    )
    .await
    .expect("Failed to execute request.");
```

Two updates from the book: first, we're sending via POST instead of GET. This is the correct way to do things; a GET should never (and I mean *never*) cause a change of state on the back-end. To send something new that the server will process and store, you use a POST. (To update something, or to send something to a known *and unique* URI, PUT is better.) Secondly, since we're sending a generic form-data payload, we need to set the content-type header on the client so that the server knows how to unpack it. The `%20` and `%40` markers in the `body` are the URL-encoded space and `@`, respectively.

For migrations, I completely ignored the advice in the book and went instead with [Dbmate](https://github.com/amacneil/dbmate). Dbmate is a bit cranky: your SQL must be nestled right up against the 'up' and 'down' markers in the migration file, and it seems to be quite opinionated about everything being lowercase.
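Percent-encoding is simple enough to sketch by hand. This toy decoder (for ASCII input only, and emphatically not what hyper or Axum actually use) shows what `%20` and `%40` turn back into:

```rust
// Toy percent-decoder, for illustration only: %XX is two hex digits
// encoding one byte. Real servers use a proper urlencoded parser.
fn percent_decode(s: &str) -> String {
    let bytes = s.as_bytes();
    let mut out = String::new();
    let mut i = 0;
    while i < bytes.len() {
        if bytes[i] == b'%' && i + 2 < bytes.len() {
            if let Ok(b) = u8::from_str_radix(&s[i + 1..i + 3], 16) {
                out.push(b as char);
                i += 3;
                continue;
            }
        }
        out.push(bytes[i] as char);
        i += 1;
    }
    out
}

fn main() {
    assert_eq!(percent_decode("le%20guin"), "le guin");
    assert_eq!(percent_decode("ursula_le_guin%40gmail.com"), "ursula_le_guin@gmail.com");
    println!("{}", percent_decode("name=le%20guin"));
}
```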
That said, it was trivial to create a database with it:

```sh
$ dbmate new create_subscriptions_table
```

This will create the file `db/migrations/20230322174957_create_subscriptions_table.sql` (the timestamp will be different, obviously), and in this file you put the following, as specified in the book:

```sql
-- migrate:up
create table subscriptions (
    id uuid not null,
    primary key (id),
    email text not null unique,
    name text not null,
    subscribed_at timestamptz not null
);

-- migrate:down
drop table subscriptions;
```

To use Dbmate, you have to specify how to connect. I'm using Postgres, so let's start with creating a database and a user for it:

```sh
$ sudo -u postgres psql
[sudo] password for user: ...................
postgres=# create database newsletter;
CREATE DATABASE
postgres=# create user newsletter with encrypted password 'redacted';
CREATE USER
postgres=# grant all privileges on database newsletter to newsletter;
GRANT
postgres=# exit
```

In your project root, create a `.env` file to specify your connection:

```sh
DATABASE_URL="postgres://newsletter:redacted@127.0.0.1:5432/newsletter?sslmode=disable"
```

The `sslmode` flag is necessary for localhost connections: Dbmate assumes an encrypted connection by default, but we're isolating ourselves to a local connection that is, usually, safe. With the new entry in your `.env` file, you can now run a migration:

```sh
$ dbmate up
Writing: ./db/schema.sql
```

Running `dbmate up` will automatically create the database for you if it doesn't already exist; `dbmate migrate` also performs migrations, but it will not create the database.
Now you can re-connect to Postgres as the newsletter user and see what you've got:

```sh
$ psql --user newsletter -h localhost --password
Password:
psql (14.7 (Ubuntu 14.7-0ubuntu0.22.04.1), server 11.7 (Ubuntu 11.7-0ubuntu0.19.10.1))

newsletter=> \d
              List of relations
 Schema |       Name        | Type  |   Owner
--------+-------------------+-------+------------
 public | schema_migrations | table | newsletter
 public | subscriptions     | table | newsletter
(2 rows)

newsletter=> \d subscriptions
                     Table "public.subscriptions"
    Column     |           Type           | Collation | Nullable | Default
---------------+--------------------------+-----------+----------+---------
 id            | uuid                     |           | not null |
 email         | text                     |           | not null |
 name          | text                     |           | not null |
 subscribed_at | timestamp with time zone |           | not null |
Indexes:
    "subscriptions_pkey" PRIMARY KEY, btree (id)
    "subscriptions_email_key" UNIQUE CONSTRAINT, btree (email)
```

Note that Dbmate has allocated a table to itself, `schema_migrations`, for tracking what it's done to your system and when. Try not to conflict with it, okay?

Every complex app has a configuration, and there are plenty of ways a configuration can be specified: environment variables, internal defaults, and configuration files, the last of which comes in many different flavors. Rust has a well-known [config](https://docs.rs/config/latest/config/index.html) crate that supports all the most common formats (YAML, JSON, TOML); you can even add your own by writing something that implements the `config::Format` trait. Add it to Cargo.toml:

```sh
$ cargo add config
```

For the meantime, we're just going to create a new file, called `configuration.rs`, and put our configuration details in there. Right now we have a single configuration detail: the port. I'm going to go above and beyond Lucas here and configure some internal defaults for my code. It will have expectations.
First, you have to tell Serde that there will be default values:

```rust
use config::Config;

#[derive(serde::Deserialize)]
#[serde(default)]
pub struct Settings {
    pub port: u16,
}
```

Then you have to set those default values. Fortunately, Rust provides a "set default values" trait named, sensibly enough, `Default`:

```rust
impl Default for Settings {
    fn default() -> Self {
        Settings { port: 3001 }
    }
}
```

Again exceeding the book's parameters, I'm going to say that if the file is missing, the default parameters should hold:

```rust
pub(crate) fn get_configuration() -> Result<Settings, config::ConfigError> {
    Config::builder()
        .add_source(config::File::with_name("./ztd.config").required(false))
        .build()?
        .try_deserialize()
}
```

And since this is the first time I'm doing this, I'm going to write a test to assert that my understanding of how this all works is correct:

```rust
#[cfg(test)]
mod tests {
    use super::*;

    #[test]
    fn test_for_defaults() {
        let maybe_config = get_configuration();
        assert!(maybe_config.is_ok());
        let config = maybe_config.unwrap();
        assert_eq!(config.port, 3001);
    }
}
```
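The `Default` trait itself is plain Rust, no config crate required. A minimal sketch of the same pattern (the `host` field here is invented for illustration, and is not part of the project's Settings):

```rust
// Mirrors the Settings pattern above, without the config crate.
#[derive(Debug)]
struct Settings {
    port: u16,
    host: String, // invented field, purely illustrative
}

impl Default for Settings {
    fn default() -> Self {
        Settings { port: 3001, host: "127.0.0.1".to_string() }
    }
}

fn main() {
    // Plain defaults: this is what #[serde(default)] falls back to.
    assert_eq!(Settings::default().port, 3001);

    // Struct-update syntax: override one field, default the rest.
    let s = Settings { port: 8080, ..Default::default() };
    assert_eq!(s.port, 8080);
    assert_eq!(s.host, "127.0.0.1");
    println!("{:?}", s);
}
```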
2023-03-24 14:51:19 +00:00
checksum = "a332be01508d814fed64bf28f798a146d73792121129962fdf335bb3c49a4254"
dependencies = [
"bitflags",
"core-foundation",
"core-foundation-sys",
"libc",
"security-framework-sys",
]
This is straightforward and familiar territory:

```
fn app() -> Router {
    Router::new()
        .route("/", get(anon_greet))
        .route("/:name", get(greet))
        .route("/health_check", get(health_check))
}
```

We then define a function to *run* the core server:

```
pub async fn run() {
    let addr = SocketAddr::from(([127, 0, 0, 1], 3000));
    tracing::info!("listening on {}", addr);
    axum::Server::bind(&addr)
        .serve(app().into_make_service())
        .await
        .unwrap()
}
```

And finally, in a file named `src/main.rs`, we instantiate the server:

```
use ztp::run;

#[tokio::main]
async fn main() {
    run().await
}
```

To make this "work," we need to define what `ztp` means, and make a distinction between the library and the CLI program. In the project root's `Cargo.toml` file, the first three sections are needed to define these relationships:

```
[package]
name = "ztp"
version = "0.1.0"
edition = "2021"

[lib]
path = "src/lib.rs"

[[bin]]
path = "src/main.rs"
name = "ztp"
```

It is the `name` key of the `[package]` section that defines how the `use` statement in `main.rs` will find the library. The `[[bin]]` clause defines the name of the binary when it is generated.

<aside>The double brackets around `[[bin]]` are TOML's array-of-tables syntax, signaling that there can be more than one binary. There can be only one library per package, but a package can produce multiple crates: the library plus one crate per binary.</aside>

This project should now be runnable. In one window, type:

```
$ cargo run
```

And in another, type and see the replies:

```
$ curl http://localhost:3000/
Hello, World!
$ curl http://localhost:3000/Jim
He's dead, Jim!
$ curl -v http://localhost:3000/health_check
> GET /health_check HTTP/1.1
> Host: localhost:3000
> User-Agent: curl/7.81.0
> Accept: */*
< HTTP/1.1 200 OK
< content-length: 0
< date: Tue, 21 Mar 2023 00:16:43 GMT
```

In the last command, the *verbose* flag shows us what we sent to the server, and what came back.
We expected a "200 OK" status and a zero-length body, and that's what we got.

To unit-test a web server, we must spawn a copy of it so we can exercise its functions. We'll use Tokio's `spawn` function to create a new server, use hyper to request data from the server, and finally use Rust's own native test asserts to check that we got what we expected.

```
mod tests {
    use super::*;
    use axum::{
        body::Body,
        http::{Request, StatusCode},
    };
    use std::net::{SocketAddr, TcpListener};

    #[tokio::test]
    async fn the_real_deal() {
        let listener = TcpListener::bind("127.0.0.1:0".parse::<SocketAddr>().unwrap()).unwrap();
        let addr = listener.local_addr().unwrap();

        tokio::spawn(async move {
            axum::Server::from_tcp(listener)
                .unwrap()
                .serve(app().into_make_service())
                .await
                .unwrap();
        });

        let response = hyper::Client::new()
            .request(
                Request::builder()
                    .uri(format!("http://{}/", addr))
                    .body(Body::empty())
                    .unwrap(),
            )
            .await
            .unwrap();

        let body = hyper::body::to_bytes(response.into_body()).await.unwrap();
        assert_eq!(&body[..], b"Hello World!\n");
    }
}
```

One interesting trick to observe in this test is the port number specified in the `TcpListener` call: it's zero. When the port is zero, the `TcpListener` will request the first free port from the kernel. Normally you'd want to know exactly what port to call the server on, but in this case both ends of the communication learn the port at runtime, and we want to ensure that the port isn't hard-coded and inconveniently already in use by someone else.
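The port-zero trick needs nothing beyond the standard library; a minimal sketch:

```rust
use std::net::TcpListener;

fn main() {
    // Binding to port 0 asks the kernel for any free port;
    // local_addr() then reports the port actually assigned.
    let listener = TcpListener::bind("127.0.0.1:0").unwrap();
    let port = listener.local_addr().unwrap().port();
    assert_ne!(port, 0);
    println!("kernel assigned port {}", port);

    // While the first listener is alive, a second bind to port 0
    // gets a different free port, so concurrent tests won't collide.
    let second = TcpListener::bind("127.0.0.1:0").unwrap();
    assert_ne!(second.local_addr().unwrap().port(), port);
}
```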
[[package]]
name = "security-framework-sys"
version = "2.8.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
Implemented the forms reader, config, and database migrations.

This chapter introduces the Actix "Extractors" for retrieving form data. I've added tests to the `./tests` folder to attempt to interact with those extractors; as of this commit, a89cbe5b, they fail because the example code isn't there. What is there is a variant of the "Hello, World!" code from the previous exercises (section 3.5), which uses the Actix extractor:

``` rust
// Actix, *not* Axum. Does not work with the current framework.
fn index(form: web::Form<FormData>) -> String {
    format!("Welcome {}!", form.username)
}
```

Translated and polished into Axum, it becomes:

``` rust
pub async fn index(payload: Option<Form<FormData>>) -> impl IntoResponse {
    let username = payload.map_or("World".to_string(), move |index| -> String {
        String::from(&(index.username))
    });
    (StatusCode::OK, format!("Hello, {}!\n", &username))
}
```

The Axum version is a little smarter, providing a default "World" if you don't specify a name. That's what `.map_or` does, although the `or` part (the default) actually comes first in the argument list. So the result is:

``` sh
$ curl http://localhost:3000/
Hello, World!
$ curl 'http://localhost:3000/?username=Spock'
Hello, Spock!
```

Which is more or less the version we want.

**Section 3.7.3** then goes into some detail on the implementation of a Trait. A Trait in Rust is like an Interface in other languages: it describes a collection of functions for manipulating the values of a defined Type.

**Types**: A Type is just a description of a value. A `u16` is a sixteen-bit unsigned integer; a `char` is a Unicode scalar value, and its size is always 32 bits. A `String`, though, is a bundle of things: a pointer to a heap-allocated buffer of UTF-8 bytes (which are not individually `char`s!), a length, and a capacity. If the String is manipulated to exceed its capacity, a larger buffer is allocated and the old contents are copied into it.
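That length/capacity behavior can be watched directly with stdlib calls:

```rust
fn main() {
    // A String is a pointer to a heap buffer of UTF-8 bytes,
    // plus a length and a capacity.
    let mut s = String::with_capacity(4);
    s.push_str("abcd");
    assert_eq!(s.len(), 4);
    assert!(s.capacity() >= 4);

    // Exceeding the capacity forces a reallocation and a copy.
    s.push_str("e");
    assert_eq!(s.len(), 5);
    assert!(s.capacity() >= 5);

    // The bytes inside are not `char`s: 'é' occupies 2 bytes in a
    // String, while the `char` type itself is always 4 bytes.
    assert_eq!("é".len(), 2);
    assert_eq!(std::mem::size_of::<char>(), 4);
    println!("len {}, capacity {}", s.len(), s.capacity());
}
```

(The capacity asserts use `>=` deliberately: the standard library only guarantees *at least* the requested capacity, and growth strategy is an implementation detail.)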
A `Vec<String>` is an array of Strings; as a Type, it is considered to have a single value: whatever is in it at the moment you use it.

**Trait**: A Trait defines a collection of one or more functions that can operate on a value. The truly nifty thing about Traits is that they can be implemented after the fact. By importing a Trait, and an implementation of that Trait specific to a type, into a module containing that type, you can extend the behavior of the type in a deterministic way, without having to modify or inherit its code as you would in an object-oriented language.

Axum has a valuable trait, `FromRequest`. For any structure you can imagine passing from the client to the server, you can implement `FromRequest` for that structure, and any content in the body of the message will be transformed into it.

We've seen a trait before: `IntoResponse`, written as `impl IntoResponse`, is the return type of many of the functions that produce responses for our application server. That return type instructs Rust to look in the current lexical scope and determine whether an `IntoResponse` implementation exists for the value the function returns. If it does, the value can be returned, because Axum is now assured that there exists a function to convert that value into something streamable and usable as an HTTP response.

Fortunately for us, Axum has already implemented `FromRequest` for all the native data types, as well as some structures and arrays. Even better, it has wired `FromRequest` up to Serde, the serialization/deserialization library. So in this example:

``` rust
pub struct FormData {
    username: String,
}

pub async fn index(payload: Option<Form<FormData>>) -> impl IntoResponse {
    ...
```

A `Form` (a body using the `application/x-www-form-urlencoded` encoding) of `FormData` will automatically be converted into a `payload` object of `{ username: "Spock" }`, in this case wrapped in `Some()`.
(Or `None`, if there was no form included.)

<aside>So far, there's not too much bloat in this product: with all the debugging symbols it's 60MB or so, but stripped to the bone it's only 3.1MB, tolerable for modern deployments.</aside>

First, though, we must adjust our `valid_subscription` test:

``` rust
let body = "name=le%20guin&email=ursula_le_guin%40gmail.com";
let response = hyper::Client::new()
    .request(
        Request::builder()
            .method("POST")
            .header("content-type", "application/x-www-form-urlencoded")
            .uri(format!("http://{}/subscriptions", addr))
            .body(Body::from(body))
            .unwrap(),
    )
    .await
    .expect("Failed to execute request.");
```

Two updates from the book: first, we're sending it via POST instead of GET. This is the correct way to do things; a GET should never (and I mean *never*) cause a change of state on the back-end. To send something new that the server will process and store, you use POST. (To update something, or to send something to a known *and unique* URI, PUT is better.) Secondly, since we're sending a generic form-data body, we need to set the content-type header on the client so that the server knows how to unpack this payload. The `%20` and `%40` markers in the `body` are the URL-encoded space and `@` respectively.

I completely ignored the advice in the book and went instead with [Dbmate](https://github.com/amacneil/dbmate). Dbmate is a bit cranky: your SQL must be nestled right up against the `migrate:up` and `migrate:down` markers in the migration file, and it seems to be quite opinionated about everything being lowercase.
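Backing up to those `%20` and `%40` markers for a moment: percent-encoding just replaces a byte with `%` plus two hex digits. A minimal decoder sketch makes that concrete; real code should lean on a crate such as `percent-encoding` rather than this hand-rolled version, which assumes ASCII input and well-formed `%XX` escapes:

```rust
// Toy percent-decoder: '%XX' becomes the byte 0xXX; everything else
// passes through. Assumes ASCII input; illustration only.
fn percent_decode(s: &str) -> String {
    let bytes = s.as_bytes();
    let mut out = Vec::new();
    let mut i = 0;
    while i < bytes.len() {
        if bytes[i] == b'%' && i + 2 < bytes.len() {
            if let Ok(byte) = u8::from_str_radix(&s[i + 1..i + 3], 16) {
                out.push(byte);
                i += 3;
                continue;
            }
        }
        out.push(bytes[i]);
        i += 1;
    }
    String::from_utf8(out).unwrap()
}

fn main() {
    assert_eq!(percent_decode("le%20guin"), "le guin");
    assert_eq!(percent_decode("ursula_le_guin%40gmail.com"), "ursula_le_guin@gmail.com");
    println!("decoded ok");
}
```

This is the transformation the server performs on the test body before Serde ever sees the field values.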
That said, it was trivial to create a database with it:

``` sh
$ dbmate new create_subscriptions_table
```

This creates the file `db/migrations/20230322174957_create_subscriptions_table.sql` (the timestamp will be different, obviously), and in this file you put the following, as specified in the book:

``` sql
-- migrate:up
create table subscriptions (
    id uuid not null,
    primary key (id),
    email text not null unique,
    name text not null,
    subscribed_at timestamptz not null
);

-- migrate:down
drop table subscriptions;
```

To use Dbmate, you have to specify how to connect. I'm using Postgres, so let's start with creating a database and a user for it:

``` sh
$ sudo -u postgres psql
[sudo] password for user: ...................
postgres=# create database newsletter;
CREATE DATABASE
postgres=# create user newsletter with encrypted password 'redacted';
CREATE USER
postgres=# grant all privileges on database newsletter to newsletter;
GRANT
postgres=# exit
```

In your project root, create a `.env` file to specify your connection:

``` sh
DATABASE_URL="postgres://newsletter:redacted@127.0.0.1:5432/newsletter?sslmode=disable"
```

The `sslmode` flag there is necessary for localhost connections: Dbmate assumes an encrypted connection by default, but we're isolating to a local connection that is, usually, safe. With the new entry in your `.env` file, you can now run a migration:

``` sh
$ dbmate up
Writing: ./db/schema.sql
```

Running `dbmate up` will automatically create the database for you if it doesn't already exist; `dbmate migrate` also performs migrations, but it will not create the database.
Now you can re-connect to Postgres as the newsletter user and see what you've got:

``` sh
$ psql --user newsletter -h localhost --password
Password:
psql (14.7 (Ubuntu 14.7-0ubuntu0.22.04.1), server 11.7 (Ubuntu 11.7-0ubuntu0.19.10.1))

newsletter=> \d
              List of relations
 Schema |       Name        | Type  |   Owner
--------+-------------------+-------+------------
 public | schema_migrations | table | newsletter
 public | subscriptions     | table | newsletter
(2 rows)

newsletter=> \d subscriptions
                     Table "public.subscriptions"
    Column     |           Type           | Collation | Nullable | Default
---------------+--------------------------+-----------+----------+---------
 id            | uuid                     |           | not null |
 email         | text                     |           | not null |
 name          | text                     |           | not null |
 subscribed_at | timestamp with time zone |           | not null |
Indexes:
    "subscriptions_pkey" PRIMARY KEY, btree (id)
    "subscriptions_email_key" UNIQUE CONSTRAINT, btree (email)
```

Note that Dbmate has allocated a table to itself, `schema_migrations`, for tracking what it has done to your system and when. Try not to conflict with it, okay?

Every complex app has a configuration, and there are plenty of ways a configuration can be specified: environment variables, internal defaults, and configuration files, the last of which come in many different flavors. Rust has a well-known [config](https://docs.rs/config/latest/config/index.html) crate that supports all the most common formats (YAML, JSON, TOML); you can even add your own by implementing the `config::Format` trait. Add it to Cargo.toml:

``` sh
$ cargo add config
```

For now, we're just going to create a new file, `configuration.rs`, and put our configuration details there. Right now we have a single configuration detail: the port. I'm going to go above and beyond Lucas here and configure some internal defaults for my code. It will have expectations.
First, you have to tell Serde that there will be default values. (The original draft of this struct lacked the derive; `try_deserialize` needs `Deserialize`, and the container-level `#[serde(default)]` is what tells Serde to fall back to `Default` for missing fields. This requires `serde` with its `derive` feature in Cargo.toml.)

``` rust
use config::Config;

#[derive(serde::Deserialize)]
#[serde(default)]
pub struct Settings {
    pub port: u16,
}
```

Then, you have to set those default values. Fortunately, Rust provides a "set default values" trait named, sensibly enough, `Default`:

``` rust
impl Default for Settings {
    fn default() -> Self {
        Settings { port: 3001 }
    }
}
```

Again exceeding the book's parameters, I'm going to say that if the file is missing, the default parameters should hold:

``` rust
pub(crate) fn get_configuration() -> Result<Settings, config::ConfigError> {
    Config::builder()
        .add_source(config::File::with_name("./ztd.config").required(false))
        .build()?
        .try_deserialize()
}
```

And since this is the first time I'm doing this, I'm going to write a test to assert that my understanding of how this all works is correct:

``` rust
#[cfg(test)]
mod tests {
    use super::*;

    #[test]
    fn test_for_defaults() {
        let maybe_config = get_configuration();
        assert!(maybe_config.is_ok());
        let config = maybe_config.unwrap();
        assert_eq!(config.port, 3001);
    }
}
```
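The `Default` fallback can be exercised without the `config` crate at all. A stdlib-only sketch of the same pattern (the `load_settings` helper is my own stand-in for "the config file was missing"):

```rust
// Stdlib-only sketch of the defaults pattern used above.
#[derive(Debug, PartialEq)]
struct Settings {
    port: u16,
}

impl Default for Settings {
    fn default() -> Self {
        Settings { port: 3001 }
    }
}

// Hypothetical loader: None models a missing config file,
// in which case we fall back to Settings::default().
fn load_settings(from_file: Option<Settings>) -> Settings {
    from_file.unwrap_or_default()
}

fn main() {
    assert_eq!(load_settings(None).port, 3001);
    assert_eq!(load_settings(Some(Settings { port: 8080 })).port, 8080);
    println!("defaults hold");
}
```

`unwrap_or_default()` is the same move `#[serde(default)]` makes on our behalf: if nothing was deserialized, ask the `Default` impl.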
2023-03-24 14:51:19 +00:00
checksum = "31c9bb296072e961fcbd8853511dd39c2d8be2deb1e17c6860b1d30732b323b4"
dependencies = [
"core-foundation-sys",
"libc",
]
[[package]]
name = "serde"
version = "1.0.158"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "771d4d9c4163ee138805e12c710dd365e4f44be8be0503cb1bb9eb989425d9c9"
Pre-commit checks and test refactorings.

Re-reading the text, I made a number of changes. The first: while it is nice that Rust allows us to have unit tests in the file whose functionality we're testing, it's also nice to have the tests somewhere separate, and to have them be a little more modular. In the `./tests` folder, you can now see the same `health_check` test as the original, but in an isolated and cleaned-up form. Most importantly, the server startup code is now in its own function, with a correct return type that includes a handle to the spawned task and the address on which that server is listening; tests can be run in parallel on many different ports and a lot of code duplication is eliminated.

``` rust
type NullHandle = JoinHandle<()>;

async fn spawn_server() -> (SocketAddr, NullHandle) {
    let listener = TcpListener::bind("127.0.0.1:0".parse::<SocketAddr>().unwrap()).unwrap();
    let addr = listener.local_addr().unwrap();
    let handle: NullHandle = tokio::spawn(async move {
        axum::Server::from_tcp(listener)
            .unwrap()
            .serve(app().into_make_service())
            .await
            .unwrap();
    });
    (addr, handle)
}
```

It is also possible now to add new tests in a straightforward manner. The Hyper API is not that much different from the Actix request API, and the Axum extractors seem to be straightforward. I suspect that what I'm looking at here with the handle is the idea that, when it goes out of scope, it calls a destructor.

In the introduction I said I was going to be neglecting CI/CD, since I'm a solo developer. That's true, but I do like my guardrails. I like not being able to commit garbage to the repository. So I'm going to add some checks, using [Pre-Commit](https://pre-commit.com/). Pre-Commit is a Python program, so we'll start by installing it. I'm using a local Python environment kickstarted with [Pyenv](https://github.com/pyenv/pyenv).
``` sh
$ pip install pre-commit
```

And inside your project, in the project root, you hook it up with the following commands:

``` sh
$ pre-commit install
$ pre-commit sample-config > .pre-commit-config.yaml
```

I'm going with the defaults from the Rust pre-commit collection, so my `.pre-commit-config.yaml` file looks like this:

``` yaml
repos:
  - repo: https://github.com/pre-commit/pre-commit-hooks
    rev: v3.1.0
    hooks:
      - id: check-byte-order-marker
      - id: check-case-conflict
      - id: check-merge-conflict
      - id: check-symlinks
      - id: check-yaml
      - id: end-of-file-fixer
      - id: mixed-line-ending
      - id: trailing-whitespace
  - repo: https://github.com/pre-commit/pre-commit
    rev: v2.5.1
    hooks:
      - id: validate_manifest
  - repo: https://github.com/doublify/pre-commit-rust
    rev: master
    hooks:
      - id: fmt
      - id: cargo-check
      - id: clippy
```

... and with that, every time I try to commit my code, it will not let me until these checks pass. And I *like* that level of discipline. This is low-level validation; it won't catch it if I put addition where I meant subtraction, or if I have a comparison going in the wrong direction, but at least the basics are handled and, more importantly, the formatting and styling are consistent throughout all of my code.
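An aside on the destructor suspicion from the test refactoring above: Rust's scope-exit cleanup hook is the `Drop` trait. Here's a minimal stdlib sketch; the `Guard` type is my own illustration, not tokio's `JoinHandle` (whose drop, as far as I know, merely detaches the spawned task rather than aborting it):

```rust
use std::cell::Cell;

// A toy guard that records, via a shared flag, when it is dropped.
struct Guard<'a> {
    dropped: &'a Cell<bool>,
}

impl<'a> Drop for Guard<'a> {
    fn drop(&mut self) {
        // Cleanup runs automatically when the value leaves scope.
        self.dropped.set(true);
    }
}

fn main() {
    let dropped = Cell::new(false);
    {
        let _g = Guard { dropped: &dropped };
        assert!(!dropped.get()); // still alive inside the scope
    } // _g goes out of scope here; Drop::drop fires
    assert!(dropped.get());
    println!("destructor ran");
}
```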
2023-03-22 00:52:44 +00:00
dependencies = [
"serde_derive",
]
[[package]]
name = "serde_derive"
version = "1.0.158"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "e801c1712f48475582b7696ac71e0ca34ebb30e09338425384269d9717c62cad"
dependencies = [
"proc-macro2",
"quote",
"syn 2.0.4",
]
[[package]]
name = "serde_json"
version = "1.0.94"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "1c533a59c9d8a93a09c6ab31f0fd5e5f4dd1b8fc9434804029839884765d04ea"
dependencies = [
"itoa",
"ryu",
"serde",
]
[[package]]
name = "serde_path_to_error"
version = "0.1.10"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "db0969fff533976baadd92e08b1d102c5a3d8a8049eadfd69d4d1e3c5b2ed189"
dependencies = [
"serde",
]
[[package]]
name = "serde_urlencoded"
version = "0.7.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "d3491c14715ca2294c4d6a88f15e84739788c1d030eed8c110436aafdaa2f3fd"
dependencies = [
"form_urlencoded",
"itoa",
"ryu",
"serde",
]
Implemented the forms reader, config, and database migrations. This chapter introduces the Actix "Extractors" for retrieving form data. I've added tests to the `./tests` folder to attempt to interact with those extractors; as of this commit, a89cbe5b, they fail because the example code isn't there. What is there is a variant of the "Hello, World!" code from the previous exercises (section 3.5), which uses the Actix extractor: ``` rust // Actix, *not* Axum. Does not work with the current framework. fn index(form: web::Form<FormData>) -> String { format!('Welcome {}!', form.username) } ``` Translated and polished into Axum, it translates to: ``` rust pub async fn index(payload: Option<Form<FormData>>) -> impl IntoResponse { let username = payload.map_or("World".to_string(), move |index| -> String { String::from(&(index.username)) }); (StatusCode::OK, format!("Hello, {}!\n", &username)) } ``` The Axum version is a little smarter, providing a default "World!" if you don't specify a name. That's what `.map_or` does, although the `or` part actually comes first in the function. So the result is: ``` sh $ curl http://localhost:3000/ Hello, World! $ curl 'http://localhost:3000/?username=Spock' Hello, Spock! ``` Which is more or less the version we want. **Section 3.7.3** then goes into some detail on the implementation of a Trait. A Trait in Rust is like an Interface in other languages; it describes a collection of functions for manipulating the values found in a defined Type. **Types**: A Type is just a description of the value: `u16` is a sixteen-bit unsigned integer; `char` is a Unicode character, and it's size is always 32bits, but a `String` is a bunch of things: it's an array of characters (which are not always `char`!), a length for that string and a capacity. If the String is manipulated to exceed the capacity, the array is re-allocated to a new capacity and the old array copied into it. 
A `Vec<String>` is an array of Strings; as a Type, it is considered to have a single value: whatever is in it at the moment you use it. **Trait**: A Trait defines a collection of one or more functions that can operate on a value. The truly nifty thing about Traits is that they can be implemented after the fact. By importing a Trait and an implementation of that trait specific to a type into a module containing that type, you can extend the behavior of a type in a deterministic way without having to modify or inherit the code, as you would in an object-oriented language. Axum has a valuable trait, `FromRequest`. For any structure you can imagine passing from the client to the server, you can implement `FromRequest` for that object and any content in the body of the message will be transformed into that structure. We've seen a trait before: `IntoResponse`, written as `impl IntoResponse`, and is the output (the return type) of many of the functions that produce return values for our application server. In this case the return type instructs Rust to look in the current lexical scope and, for the value returned by that function, determine if an `IntoResponse` trait has been defined for it. If it has, the value will be returned because Axum has now been assured that there exists a function to convert that value into something streamable and usable as an HTTP response. Fortunately for us, Axum has already implemented `FromRequest` for all the native data types, as well as some structures and arrays. Even better, it has implemented `FromRequest` for the Serde serialization/deserialization library. So in this example: ``` rust pub struct FormData { username: String, } pub async fn index(payload: Option<Form<FormData>>) -> impl IntoResponse { ... ``` A `Form` (something using the `application/x-www-form-urlencoded` protocol) of `FormData` will automatically be converted into a `payload` object of `{ username: "Spock" )`, and in this case wrapped in a `Some()` handler. 
(Or `None`, if there was no form included.)

<aside>So far, there's not too much bloat in this product; with all the debugging symbols, it's 60MB or so, but stripped to the bone it's only 3.1MB, tolerable for modern deployments.</aside>

First, though, we must adjust our `valid_subscription` test:

```rust
let body = "name=le%20guin&email=ursula_le_guin%40gmail.com";
let response = hyper::Client::new()
    .request(
        Request::builder()
            .method("POST")
            .header("content-type", "application/x-www-form-urlencoded")
            .uri(format!("http://{}/subscriptions", addr))
            .body(Body::from(body))
            .unwrap(),
    )
    .await
    .expect("Failed to execute request.");
```

Two updates from the book: first, we're sending it via POST instead of GET. This is the correct way to do things; a GET should never (and I mean *never*) cause a change of state on the back-end. To send something new that the server will process and store, you use a POST. (To update something, or to send something to a known *and unique* URI, PUT is better.) Secondly, since we're using a generic form-data object, we need to set the content-type on the client so that the server is informed of how to unpack this payload.

The `%20` and `%40` markers in the `body` are the percent-encoded space and `@` characters, respectively.

I completely ignored the advice in the book and went instead with [Dbmate](https://github.com/amacneil/dbmate). Dbmate is a bit cranky: your SQL must be nestled right up against the 'up' and 'down' markers in the migration file, and it seems to be quite opinionated about everything being lowercase.
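Percent-encoding is simple enough to decode by hand, which makes the `%20`/`%40` markers less magical. A stdlib-only sketch (real code should lean on the framework or a crate such as `percent-encoding` rather than this):

```rust
// Decode %XX escapes in an ASCII form-urlencoded value.
fn percent_decode(s: &str) -> String {
    let bytes = s.as_bytes();
    let mut out = Vec::new();
    let mut i = 0;
    while i < bytes.len() {
        // A '%' followed by two hex digits is one encoded byte.
        if bytes[i] == b'%' && i + 2 < bytes.len() {
            if let Ok(b) = u8::from_str_radix(&s[i + 1..i + 3], 16) {
                out.push(b);
                i += 3;
                continue;
            }
        }
        out.push(bytes[i]);
        i += 1;
    }
    String::from_utf8(out).unwrap()
}

fn main() {
    assert_eq!(percent_decode("le%20guin"), "le guin");
    assert_eq!(percent_decode("ursula_le_guin%40gmail.com"), "ursula_le_guin@gmail.com");
}
```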
That said, it was trivial to create a database with it:

```sh
$ dbmate new create_subscriptions_table
```

This will create the file `db/migrations/20230322174957_create_subscriptions_table.sql` (the timestamp will be different, obviously), and in this file you put the following, as specified in the book:

```sql
-- migrate:up
create table subscriptions (
    id uuid not null,
    primary key (id),
    email text not null unique,
    name text not null,
    subscribed_at timestamptz not null
);

-- migrate:down
drop table subscriptions;
```

To use Dbmate, you have to specify how to connect. I'm using Postgres, so let's start with creating a database and a user for it:

```sh
$ sudo -u postgres psql
[sudo] password for user: ...................
postgres=# create database newsletter;
CREATE DATABASE
postgres=# create user newsletter with encrypted password 'redacted';
CREATE USER
postgres=# grant all privileges on database newsletter to newsletter;
GRANT
postgres=# exit
```

In your project root, create a `.env` file to specify your connection:

```sh
DATABASE_URL="postgres://newsletter:redacted@127.0.0.1:5432/newsletter?sslmode=disable"
```

The `sslmode` flag there is necessary for localhost connections, as Dbmate assumes an encrypted connection by default, but we're isolating to a local connection that is, usually, safe.

With the new entry in your `.env` file, you can now run a migration:

```sh
$ dbmate up
Writing: ./db/schema.sql
```

Running `dbmate up` will automatically create the database for you if it hasn't already; `dbmate migrate` also performs migrations, but it will not create the database.
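The same `DATABASE_URL` is what the application itself will eventually read. As a hypothetical sketch (in practice a crate like dotenvy loads `.env` for you), a minimal sanity check on the URL before handing it to a connection pool might look like:

```rust
fn main() {
    // Hypothetical: this is the value dotenvy would load from .env at startup.
    let url = "postgres://newsletter:redacted@127.0.0.1:5432/newsletter?sslmode=disable";

    // Fail fast if the URL is obviously malformed.
    assert!(url.starts_with("postgres://"));
    assert!(url.contains("sslmode=disable"));

    // Pull the database name out of the URL path, e.g. for logging.
    let db_name = url
        .rsplit('/')
        .next()
        .and_then(|tail| tail.split('?').next())
        .unwrap_or("unknown");
    assert_eq!(db_name, "newsletter");
}
```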
Now you can re-connect to Postgres as the newsletter user and see what you've got:

```sh
$ psql --user newsletter -h localhost --password
Password:
psql (14.7 (Ubuntu 14.7-0ubuntu0.22.04.1), server 11.7 (Ubuntu 11.7-0ubuntu0.19.10.1))

newsletter=> \d
              List of relations
 Schema |       Name        | Type  |   Owner
--------+-------------------+-------+------------
 public | schema_migrations | table | newsletter
 public | subscriptions     | table | newsletter
(2 rows)

newsletter=> \d subscriptions
                     Table "public.subscriptions"
    Column     |           Type           | Collation | Nullable | Default
---------------+--------------------------+-----------+----------+---------
 id            | uuid                     |           | not null |
 email         | text                     |           | not null |
 name          | text                     |           | not null |
 subscribed_at | timestamp with time zone |           | not null |
Indexes:
    "subscriptions_pkey" PRIMARY KEY, btree (id)
    "subscriptions_email_key" UNIQUE CONSTRAINT, btree (email)
```

Note that Dbmate has allocated a table to itself, `schema_migrations`, for tracking what it's done to your system and when. Try not to conflict with it, okay?

Every complex app has a configuration, and there are plenty of different ways the configuration can be specified: environment variables, internal defaults, and configuration files, the last of which come in so many different flavors. Rust has a well-known [config](https://docs.rs/config/latest/config/index.html) crate that supports all the most common formats: YAML, JSON, TOML; you can even add your own by writing something that implements the `config::Format` trait. Add it to Cargo.toml:

```sh
$ cargo add config
```

For the meantime, we're just going to create a new file, called `configuration.rs`, and put our configuration details in there. Right now we have a single configuration detail: the port. I'm going to go above and beyond Lucas here and configure some internal defaults for my code. It will have expectations.
First, you have to tell Serde that there will be default values:

```rust
use config::Config;

#[derive(serde::Deserialize)]
#[serde(default)]
pub struct Settings {
    pub port: u16,
}
```

Then, you have to set those default values. Fortunately, Rust provides a "set default values" trait named, sensibly enough, Default:

```rust
impl Default for Settings {
    fn default() -> Self {
        Settings { port: 3001 }
    }
}
```

Again, exceeding the book's parameters, I'm going to say that if the file is missing, the default parameters should hold:

```rust
pub(crate) fn get_configuration() -> Result<Settings, config::ConfigError> {
    Config::builder()
        .add_source(config::File::with_name("./ztd.config").required(false))
        .build()?
        .try_deserialize()
}
```

And since this is the first time I'm doing this, I'm going to write a test to assert that my understanding of how this all works is correct:

```rust
#[cfg(test)]
mod tests {
    use super::*;

    #[test]
    fn test_for_defaults() {
        let maybe_config = get_configuration();
        assert!(!maybe_config.is_err());
        let config = maybe_config.unwrap();
        assert_eq!(config.port, 3001);
    }
}
```
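The Default trait on its own is plain Rust and easy to poke at in isolation. A tiny sketch of the pattern used above (the `host` field is hypothetical, added only to show partial overrides):

```rust
#[derive(Debug, PartialEq)]
struct Settings {
    port: u16,
    host: String, // hypothetical second field, for illustration
}

impl Default for Settings {
    fn default() -> Self {
        Settings { port: 3001, host: "127.0.0.1".to_string() }
    }
}

fn main() {
    // No config file in sight: the defaults hold.
    let settings = Settings::default();
    assert_eq!(settings.port, 3001);

    // Struct-update syntax overrides only the fields you name.
    let overridden = Settings { port: 8080, ..Default::default() };
    assert_eq!(overridden.port, 8080);
    assert_eq!(overridden.host, "127.0.0.1");
}
```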
2023-03-24 14:51:19 +00:00
[[package]]
name = "sha1"
version = "0.10.5"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "f04293dc80c3993519f2d7f6f511707ee7094fe0c6d3406feb330cdb3540eba3"
dependencies = [
"cfg-if",
"cpufeatures",
"digest",
]
[[package]]
name = "sha2"
version = "0.10.6"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "82e6b795fe2e3b1e845bafcb27aa35405c4d47cdfc92af5fc8d3002f76cebdc0"
dependencies = [
"cfg-if",
"cpufeatures",
"digest",
]
This is straightforward and familiar territory:

```rust
fn app() -> Router {
    Router::new()
        .route("/", get(anon_greet))
        .route("/:name", get(greet))
        .route("/health_check", get(health_check))
}
```

We then define a function to *run* the core server:

```rust
pub async fn run() {
    let addr = SocketAddr::from(([127, 0, 0, 1], 3000));
    tracing::info!("listening on {}", addr);
    axum::Server::bind(&addr)
        .serve(app().into_make_service())
        .await
        .unwrap()
}
```

And finally, in a file named `src/main.rs`, we instantiate the server:

```rust
use ztp::run;

#[tokio::main]
async fn main() {
    run().await
}
```

To make this "work," we need to define what `ztp` means, and make a distinction between the library and the CLI program. In the project root's `Cargo.toml` file, the first three sections are needed to define these relationships:

```toml
[package]
name = "ztp"
version = "0.1.0"
edition = "2021"

[lib]
path = "src/lib.rs"

[[bin]]
path = "src/main.rs"
name = "ztp"
```

It is the `package.name` key that defines how the `use` statement in `main.rs` will find the library. The `[[bin]]` clause defines the name of the binary when it is generated.

<aside>The double brackets around the `[[bin]]` clause are there to tell the TOML parser that there can be more than one binary. A package can have only one library crate, but it can contain several binary crates, and a project (a Cargo workspace) can contain more than one package.</aside>

This project should now be runnable. In one window, type:

```sh
$ cargo run
```

And in another, type and see the replies:

```sh
$ curl http://localhost:3000/
Hello, World!
$ curl http://localhost:3000/Jim
He's dead, Jim!
$ curl -v http://localhost:3000/health_check
> GET /health_check HTTP/1.1
> Host: localhost:3000
> User-Agent: curl/7.81.0
> Accept: */*
< HTTP/1.1 200 OK
< content-length: 0
< date: Tue, 21 Mar 2023 00:16:43 GMT
```

In the last command, the *verbose* flag shows us what we sent to the server, and what came back.
We expected a "200 OK" status and a zero-length body, and that's what we got.

In order to unit-test a web server, we must spawn a copy of it in order to exercise its functions. We'll use Tokio's `spawn` function to create a new server, use hyper to request data from the server, and finally Rust's own native test asserts to check that we got what we expected.

```rust
mod tests {
    use super::*;
    use axum::{
        body::Body,
        http::{Request, StatusCode},
    };
    use std::net::{SocketAddr, TcpListener};

    #[tokio::test]
    async fn the_real_deal() {
        let listener = TcpListener::bind("127.0.0.1:0".parse::<SocketAddr>().unwrap()).unwrap();
        let addr = listener.local_addr().unwrap();

        tokio::spawn(async move {
            axum::Server::from_tcp(listener)
                .unwrap()
                .serve(app().into_make_service())
                .await
                .unwrap();
        });

        let response = hyper::Client::new()
            .request(
                Request::builder()
                    .uri(format!("http://{}/", addr))
                    .body(Body::empty())
                    .unwrap(),
            )
            .await
            .unwrap();

        let body = hyper::body::to_bytes(response.into_body()).await.unwrap();
        assert_eq!(&body[..], b"Hello World!\n");
    }
}
```

One interesting trick to observe in this test is the port number specified in the `TcpListener` call: it's zero. When the port is zero, the `TcpListener` will ask the kernel for the first free port. Normally, you'd want to know exactly what port to call the server on, but in this case both ends of the communication are aware of the port to use, and we want to ensure that the port isn't hard-coded and inconveniently already in use by someone else.
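The port-zero trick works with nothing but the standard library, so it's easy to verify in isolation:

```rust
use std::net::TcpListener;

fn main() {
    // Port 0 tells the kernel: pick any free port for me.
    let listener = TcpListener::bind("127.0.0.1:0").unwrap();
    let addr = listener.local_addr().unwrap();

    // The kernel substituted a real, nonzero port we can hand to a client.
    assert_ne!(addr.port(), 0);

    // Two such listeners never collide, even when bound back-to-back.
    let second = TcpListener::bind("127.0.0.1:0").unwrap();
    assert_ne!(addr.port(), second.local_addr().unwrap().port());
}
```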
2023-03-21 00:31:39 +00:00
[[package]]
name = "sharded-slab"
version = "0.1.4"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "900fba806f70c630b0a382d0d825e17a0f19fcd059a2ade1ff237bcddf446b31"
dependencies = [
"lazy_static",
]
[[package]]
name = "signal-hook-registry"
version = "1.4.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "d8229b473baa5980ac72ef434c4415e70c4b5e71b423043adb4ba059f89c99a1"
dependencies = [
"libc",
]
[[package]]
name = "slab"
version = "0.4.8"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "6528351c9bc8ab22353f9d776db39a20288e8d6c37ef8cfe3317cf875eecfc2d"
dependencies = [
"autocfg",
]
[[package]]
name = "smallvec"
version = "1.10.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "a507befe795404456341dfab10cef66ead4c041f62b8b11bbb92bffe5d0953e0"
[[package]]
name = "socket2"
version = "0.4.9"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "64a4a911eed85daf18834cfaa86a79b7d266ff93ff5ba14005426219480ed662"
dependencies = [
"libc",
"winapi",
]
[[package]]
name = "sqlformat"
version = "0.2.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "0c12bc9199d1db8234678b7051747c07f517cdcf019262d1847b94ec8b1aee3e"
dependencies = [
"itertools",
"nom",
"unicode_categories",
]
[[package]]
name = "sqlx"
version = "0.6.3"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "f8de3b03a925878ed54a954f621e64bf55a3c1bd29652d0d1a17830405350188"
dependencies = [
"sqlx-core",
"sqlx-macros",
]
[[package]]
name = "sqlx-core"
version = "0.6.3"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "fa8241483a83a3f33aa5fff7e7d9def398ff9990b2752b6c6112b83c6d246029"
dependencies = [
"ahash",
"atoi",
"base64",
"bitflags",
"byteorder",
"bytes",
"chrono",
"crc",
"crossbeam-queue",
"dirs",
"dotenvy",
"either",
"event-listener",
"futures-channel",
"futures-core",
"futures-intrusive",
"futures-util",
"hashlink",
"hex",
"hkdf",
"hmac",
"indexmap",
"itoa",
"libc",
"log",
"md-5",
"memchr",
"once_cell",
"paste",
"percent-encoding",
"rand",
"serde",
"serde_json",
"sha1",
"sha2",
"smallvec",
"sqlformat",
"sqlx-rt",
"stringprep",
"thiserror",
"tokio-stream",
"url",
"uuid",
"whoami",
]
[[package]]
name = "sqlx-macros"
version = "0.6.3"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "9966e64ae989e7e575b19d7265cb79d7fc3cbbdf179835cb0d716f294c2049c9"
dependencies = [
"dotenvy",
"either",
"heck",
"once_cell",
"proc-macro2",
"quote",
"sha2",
"sqlx-core",
"sqlx-rt",
"syn 1.0.109",
"url",
]
[[package]]
name = "sqlx-rt"
version = "0.6.3"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "804d3f245f894e61b1e6263c84b23ca675d96753b5abfd5cc8597d86806e8024"
dependencies = [
"native-tls",
"once_cell",
"tokio",
"tokio-native-tls",
]
[[package]]
name = "stringprep"
version = "0.1.2"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "8ee348cb74b87454fff4b551cbf727025810a004f88aeacae7f85b87f4e9a1c1"
dependencies = [
"unicode-bidi",
"unicode-normalization",
]
[[package]]
name = "subtle"
version = "2.4.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "6bdef32e8150c2a081110b42772ffe7d7c9032b606bc226c8260fd97e0976601"
[[package]]
name = "syn"
version = "1.0.109"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "72b64191b275b66ffe2469e8af2c1cfe3bafa67b529ead792a6d0160888b4237"
dependencies = [
"proc-macro2",
"quote",
"unicode-ident",
]
[[package]]
name = "syn"
version = "2.0.4"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "2c622ae390c9302e214c31013517c2061ecb2699935882c60a9b37f82f8625ae"
dependencies = [
"proc-macro2",
"quote",
"unicode-ident",
]
[[package]]
name = "sync_wrapper"
version = "0.1.2"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "2047c6ded9c721764247e62cd3b03c09ffc529b2ba5b10ec482ae507a4a70160"
[[package]]
name = "tempfile"
version = "3.4.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "af18f7ae1acd354b992402e9ec5864359d693cd8a79dcbef59f76891701c1e95"
dependencies = [
"cfg-if",
"fastrand",
"redox_syscall",
"rustix",
"windows-sys 0.42.0",
]
[[package]]
name = "termcolor"
version = "1.2.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "be55cf8942feac5c765c2c993422806843c9a9a45d4d5c407ad6dd2ea95eb9b6"
dependencies = [
"winapi-util",
]
[[package]]
name = "thiserror"
version = "1.0.40"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "978c9a314bd8dc99be594bc3c175faaa9794be04a5a5e153caba6915336cebac"
dependencies = [
"thiserror-impl",
]
[[package]]
name = "thiserror-impl"
version = "1.0.40"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "f9456a42c5b0d803c8cd86e73dd7cc9edd429499f37a3550d286d5e86720569f"
dependencies = [
"proc-macro2",
"quote",
"syn 2.0.4",
]
[[package]]
name = "thread_local"
version = "1.1.7"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "3fdd6f064ccff2d6567adcb3873ca630700f00b5ad3f060c25b5dcfd9a4ce152"
dependencies = [
"cfg-if",
"once_cell",
]
[[package]]
name = "time"
version = "0.1.45"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "1b797afad3f312d1c66a56d11d0316f916356d11bd158fbc6ca6389ff6bf805a"
dependencies = [
"libc",
"wasi 0.10.0+wasi-snapshot-preview1",
"winapi",
]
[[package]]
name = "tinyvec"
version = "1.6.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "87cc5ceb3875bb20c2890005a4e226a4651264a5c75edb2421b52861a0a0cb50"
dependencies = [
"tinyvec_macros",
]
[[package]]
name = "tinyvec_macros"
version = "0.1.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "1f3ccbac311fea05f86f61904b462b55fb3df8837a366dfc601a0161d0532f20"
This is straightforward and familiar territory: ``` fn app() -> Router { Router::new() .route("/", get(anon_greet)) .route("/:name", get(greet)) .route("/health_check", get(health_check)) } ``` We then define a function to *run* the core server: ``` pub async fn run() { let addr = SocketAddr::from(([127, 0, 0, 1], 3000)); tracing::info!("listening on {}", addr); axum::Server::bind(&addr) .serve(app().into_make_service()) .await .unwrap() } ``` And finally, in a file named `src/main.rs`, we instantiate the server: ``` use ztp::run; #[tokio::main] async fn main() { run().await } ``` To make this work, we need to define what `ztp` means and make a distinction between the library and the CLI program. In the project root's `Cargo.toml` file, the first three sections define these relationships: ``` [package] name = "ztp" version = "0.1.0" edition = "2021" [lib] path = "src/lib.rs" [[bin]] path = "src/main.rs" name = "ztp" ``` It is the `name` key in the `[package]` section that defines how the `use` statement in `main.rs` finds the library. The `[[bin]]` clause defines the name of the generated binary. <aside>The double brackets around the `[[bin]]` clause signal to the TOML parser that there can be more than one binary. A package can have only one library crate, but it may contain several binary crates, and a workspace may contain several packages.</aside> This project should now be runnable. In one window, type: ``` $ cargo run ``` And in another, type and see the replies: ``` $ curl http://localhost:3000/ Hello World! $ curl http://localhost:3000/Jim He's dead, Jim! $ curl -v http://localhost:3000/health_check > GET /health_check HTTP/1.1 > Host: localhost:3000 > User-Agent: curl/7.81.0 > Accept: */* < HTTP/1.1 200 OK < content-length: 0 < date: Tue, 21 Mar 2023 00:16:43 GMT ``` In the last command, the *verbose* flag shows us what we sent to the server and what came back. 
We expected a "200 OK" status and a zero-length body, and that's what we got. To unit-test a web server, we must spawn a copy of it so we can exercise its functions. We'll use Tokio's `spawn` function to create a new server, use hyper to request data from it, and finally Rust's own native test asserts to check that we got what we expected. ``` #[cfg(test)] mod tests { use super::*; use axum::{ body::Body, http::{Request, StatusCode}, }; use std::net::{SocketAddr, TcpListener}; #[tokio::test] async fn the_real_deal() { let listener = TcpListener::bind("127.0.0.1:0".parse::<SocketAddr>() .unwrap()).unwrap(); let addr = listener.local_addr().unwrap(); tokio::spawn(async move { axum::Server::from_tcp(listener) .unwrap().serve(app().into_make_service()).await.unwrap(); }); let response = hyper::Client::new() .request( Request::builder().uri(format!("http://{}/", addr)) .body(Body::empty()).unwrap(), ) .await .unwrap(); let body = hyper::body::to_bytes(response.into_body()).await.unwrap(); assert_eq!(&body[..], b"Hello World!\n"); } } ``` One interesting trick to observe in this test is the port number specified in the `TcpListener` call: it's zero. When the port is zero, `TcpListener` asks the kernel for the first free port. Normally you'd want to know exactly which port to call the server on, but here both ends of the conversation learn the port at runtime, and we want to be sure it isn't hard-coded and inconveniently already in use by someone else.
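The port-zero trick is easy to demonstrate in isolation; here is a minimal sketch using only the standard library (no Tokio or Axum required):

```rust
use std::net::TcpListener;

fn main() {
    // Binding to port 0 asks the kernel for the first free port.
    let listener = TcpListener::bind("127.0.0.1:0").unwrap();
    let addr = listener.local_addr().unwrap();

    // The kernel never hands back port 0 itself; it picks a real,
    // currently unused port, which we can read via local_addr().
    assert_ne!(addr.port(), 0);
    println!("kernel assigned port {}", addr.port());
}
```

Because the listener is created before the server is spawned, the test can read the assigned port and aim its hyper client at it with no race and no hard-coded port.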
2023-03-21 00:31:39 +00:00
[[package]]
name = "tokio"
version = "1.26.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "03201d01c3c27a29c8a5cee5b55a93ddae1ccf6f08f65365c2c918f8c1b76f64"
dependencies = [
"autocfg",
"bytes",
"libc",
"memchr",
"mio",
"num_cpus",
Implemented the forms reader, config, and database migrations. This chapter introduces the Actix "Extractors" for retrieving form data. I've added tests to the `./tests` folder to attempt to interact with those extractors; as of this commit, a89cbe5b, they fail because the example code isn't there. What is there is a variant of the "Hello, World!" code from the previous exercises (section 3.5), which uses the Actix extractor: ``` rust // Actix, *not* Axum. Does not work with the current framework. fn index(form: web::Form<FormData>) -> String { format!("Welcome {}!", form.username) } ``` Translated and polished into Axum, it becomes: ``` rust pub async fn index(payload: Option<Form<FormData>>) -> impl IntoResponse { let username = payload.map_or("World".to_string(), move |index| -> String { String::from(&(index.username)) }); (StatusCode::OK, format!("Hello, {}!\n", &username)) } ``` The Axum version is a little smarter, providing a default "World" if you don't specify a name. That's what `.map_or` does, although the `or` part actually comes first in the argument list. So the result is: ``` sh $ curl http://localhost:3000/ Hello, World! $ curl 'http://localhost:3000/?username=Spock' Hello, Spock! ``` Which is more or less the version we want. **Section 3.7.3** then goes into some detail on the implementation of a Trait. A Trait in Rust is like an Interface in other languages; it describes a collection of functions for manipulating the values found in a defined Type. **Types**: A Type is just a description of a value: `u16` is a sixteen-bit unsigned integer; `char` is a Unicode character, and its size is always 32 bits; but a `String` is several things at once: a heap-allocated buffer of UTF-8 bytes (which are not the same as `char`s!), a length, and a capacity. If the String is manipulated to exceed its capacity, a larger buffer is allocated and the old contents are copied into it. 
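That capacity behavior is easy to watch in action; a small sketch using only the standard library:

```rust
fn main() {
    // Reserve a deliberately tiny buffer: 4 bytes.
    let mut s = String::with_capacity(4);
    let before = s.capacity();

    // Pushing more than 4 bytes forces a re-allocation: a larger
    // buffer is allocated and the old contents are copied over.
    s.push_str("a string longer than four bytes");
    assert!(s.capacity() > before);

    // len() counts bytes in the UTF-8 buffer, not characters.
    assert_eq!(s.len(), 31);
}
```

The growth is amortized: the new capacity is typically larger than strictly required, so repeated pushes don't re-allocate every time.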
A `Vec<String>` is an array of Strings; as a Type, it is considered to have a single value: whatever is in it at the moment you use it. **Trait**: A Trait defines a collection of one or more functions that can operate on a value. The truly nifty thing about Traits is that they can be implemented after the fact. By importing a Trait and an implementation of that Trait specific to a type into a module containing that type, you can extend the behavior of the type in a deterministic way, without having to modify or inherit code as you would in an object-oriented language. Axum has a valuable trait, `FromRequest`. For any structure you can imagine passing from the client to the server, you can implement `FromRequest` for that object, and any content in the body of the message will be transformed into that structure. We've seen a trait before: `IntoResponse`. Written as `impl IntoResponse`, it is the return type of many of the functions that produce responses for our application server. That return type instructs Rust to look in the current lexical scope and determine whether an `IntoResponse` implementation has been defined for the value the function returns. If it has, the value can be returned, because Axum is now assured that there exists a function to convert that value into something streamable and usable as an HTTP response. Fortunately for us, Axum has already implemented `FromRequest` for all the native data types, as well as some structures and arrays. Even better, its `Form` extractor works with any type that implements `Deserialize` from the Serde serialization/deserialization library. So in this example: ``` rust #[derive(serde::Deserialize)] pub struct FormData { username: String, } pub async fn index(payload: Option<Form<FormData>>) -> impl IntoResponse { ... ``` A `Form` (a body using the `application/x-www-form-urlencoded` encoding) of `FormData` will automatically be converted into a `payload` object of `{ username: "Spock" }`, in this case wrapped in `Some()`. 
(Or `None`, if there was no form included.) <aside>So far, there's not too much bloat in this product; with all the debugging symbols it's 60MB or so, but stripped to the bone it's only 3.1MB, tolerable for modern deployments.</aside> First, though, we must adjust our `valid_subscription` test: ``` rust let body = "name=le%20guin&email=ursula_le_guin%40gmail.com"; let response = hyper::Client::new() .request( Request::builder() .method("POST") .header("content-type", "application/x-www-form-urlencoded") .uri(format!("http://{}/subscriptions", addr)) .body(Body::from(body)) .unwrap(), ) .await .expect("Failed to execute request."); ``` Two updates from the book: first, we're sending via POST instead of GET. This is the correct way to do things; a GET should never (and I mean *never*) cause a change of state on the back end. To send something new for the server to process and store, you use a POST. (To update something, or to send something to a known *and unique* URI, PUT is better.) Secondly, since we're sending a generic form-data body, we need to set the content-type header on the client so that the server knows how to unpack the payload. The '%20' and '%40' markers in the `body` are the URL-encoded space and `@` characters, respectively. I completely ignored the advice in the book and went instead with [Dbmate](https://github.com/amacneil/dbmate). Dbmate is a bit cranky: your SQL must sit right up against the 'up' and 'down' markers in the migration file, and it seems to be quite opinionated about everything being lowercase. 
That said, it was trivial to create a database with it: ``` sh $ dbmate new create_subscriptions_table ``` This will create the file `db/migrations/20230322174957_create_subscriptions_table.sql` (the timestamp will be different, obviously), and in this file you put the following, as specified in the book: ``` sql -- migrate:up create table subscriptions ( id uuid not null, primary key (id), email text not null unique, name text not null, subscribed_at timestamptz not null ); -- migrate:down drop table subscriptions; ``` To use Dbmate, you have to specify how to connect. I'm using Postgres, so let's start with creating a database and a user for it: ``` sh $ sudo -u postgres psql [sudo] password for user: ................... postgres=# create database newsletter; CREATE DATABASE postgres=# create user newsletter with encrypted password 'redacted'; CREATE USER postgres=# grant all privileges on database newsletter to newsletter; GRANT postgres=# exit ``` In your project root, create a `.env` file to specify your connection: ``` sh DATABASE_URL="postgres://newsletter:redacted@127.0.0.1:5432/newsletter?sslmode=disable" ``` The `sslmode` flag there is necessary for localhost connections, as Dbmate assumes an encrypted connection by default, but we're isolating to a local connection that is, usually, safe. With the new entry in your `.env` file, you can now run a migration: ``` sh $ dbmate up Writing: ./db/schema.sql ``` Running `dbmate up` will automatically create the database for you if it hasn't already; `dbmate migrate` also performs migrations, but it will not create the database. 
Now you can re-connect to Postgres as the newsletter user and see what you've got: ``` sh $ psql --user newsletter -h localhost --password Password: psql (14.7 (Ubuntu 14.7-0ubuntu0.22.04.1), server 11.7 (Ubuntu 11.7-0ubuntu0.19.10.1)) newsletter=> \d List of relations Schema | Name | Type | Owner --------+-------------------+-------+------------ public | schema_migrations | table | newsletter public | subscriptions | table | newsletter (2 rows) newsletter=> \d subscriptions Table "public.subscriptions" Column | Type | Collation | Nullable | Default ---------------+--------------------------+-----------+----------+--------- id | uuid | | not null | email | text | | not null | name | text | | not null | subscribed_at | timestamp with time zone | | not null | Indexes: "subscriptions_pkey" PRIMARY KEY, btree (id) "subscriptions_email_key" UNIQUE CONSTRAINT, btree (email) ``` Note that Dbmate has allocated a table to itself, `schema_migrations`, for tracking what it's done to your system and when. Try not to conflict with it, okay? Every complex app has a configuration, and there are plenty of ways the configuration can be specified: environment variables, internal defaults, and configuration files, the last of which come in many different flavors. Rust has a well-known [config](https://docs.rs/config/latest/config/index.html) crate that supports all the most common formats: YAML, JSON, TOML; you can even add your own by writing something that implements the `config::Format` trait. Add it to Cargo.toml: ``` sh $ cargo add config ``` For now, we're just going to create a new file, called `configuration.rs`, and put our configuration details in there. Right now we have a single configuration detail: the port. I'm going to go above and beyond Luca here and configure some internal defaults for my code. It will have expectations. 
First, you have to tell Serde that there will be default values: ``` rust use config::Config; use serde::Deserialize; #[derive(Deserialize)] #[serde(default)] pub struct Settings { pub port: u16, } ``` Then, you have to set those default values. Fortunately, Rust provides a "set default values" trait named, sensibly enough, Default: ``` rust impl Default for Settings { fn default() -> Self { Settings { port: 3001 } } } ``` Again, exceeding the book's parameters, I'm going to say that if the file is missing, the default parameters should hold: ``` rust pub(crate) fn get_configuration() -> Result<Settings, config::ConfigError> { Config::builder() .add_source(config::File::with_name("./ztd.config").required(false)) .build()? .try_deserialize() } ``` And since this is the first time I'm doing this, I'm going to write a test to assert that my understanding of how this all works is correct: ``` rust #[cfg(test)] mod tests { use super::*; #[test] fn test_for_defaults() { let maybe_config = get_configuration(); assert!(maybe_config.is_ok()); let config = maybe_config.unwrap(); assert_eq!(config.port, 3001); } } ```
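The merge-with-defaults idea can be sketched without the config crate at all. This is not how config resolves values internally, just the shape of the logic; `PartialSettings` and `resolve` are hypothetical names for illustration:

```rust
#[derive(Debug, PartialEq)]
struct Settings {
    port: u16,
}

impl Default for Settings {
    fn default() -> Self {
        Settings { port: 3001 }
    }
}

// A hypothetical partial result, as a file-based source might
// produce it: any field the file omits comes back as None.
struct PartialSettings {
    port: Option<u16>,
}

fn resolve(partial: PartialSettings) -> Settings {
    let defaults = Settings::default();
    Settings {
        // A value from the file wins; otherwise the default holds.
        port: partial.port.unwrap_or(defaults.port),
    }
}

fn main() {
    // No file at all: every field falls back to the default.
    assert_eq!(resolve(PartialSettings { port: None }), Settings { port: 3001 });
    // A file that sets the port overrides the default.
    assert_eq!(resolve(PartialSettings { port: Some(8080) }), Settings { port: 8080 });
    println!("defaults hold when the source is empty");
}
```

The `#[serde(default)]` attribute automates exactly this fallback during deserialization, using the `Default` implementation for any missing field.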
2023-03-24 14:51:19 +00:00
"parking_lot 0.12.1",
"pin-project-lite",
"signal-hook-registry",
"socket2",
"tokio-macros",
"windows-sys 0.45.0",
This is straightforward and familiar territory:

```
fn app() -> Router {
    Router::new()
        .route("/", get(anon_greet))
        .route("/:name", get(greet))
        .route("/health_check", get(health_check))
}
```

We then define a function to *run* the core server:

```
pub async fn run() {
    let addr = SocketAddr::from(([127, 0, 0, 1], 3000));
    tracing::info!("listening on {}", addr);
    axum::Server::bind(&addr)
        .serve(app().into_make_service())
        .await
        .unwrap()
}
```

And finally, in a file named `src/main.rs`, we instantiate the server. Note the `#[tokio::main]` attribute: an `async fn main` won't compile without a runtime to drive it.

```
use ztp::run;

#[tokio::main]
async fn main() {
    run().await
}
```

To make this "work," we need to define what `ztp` means, and make a distinction between the library and the CLI program. In the project root's `Cargo.toml` file, the first three sections define these relationships:

```
[package]
name = "ztp"
version = "0.1.0"
edition = "2021"

[lib]
path = "src/lib.rs"

[[bin]]
path = "src/main.rs"
name = "ztp"
```

It is the `package.name` key that defines how the `use` statement in `main.rs` will find the library. The `[[bin]]` clause defines the name of the binary when it is generated. <aside>The double brackets around each `[[bin]]` clause tell the TOML parser that there can be more than one binary. A package can have only one library crate, but it can have several binary crates, and a Rust project (a workspace) can contain more than one package.</aside>

This project should now be runnable. In one window, type:

```
$ cargo run
```

And in another, type and see the replies:

```
$ curl http://localhost:3000/
Hello World!
$ curl http://localhost:3000/Jim
He's dead, Jim!
$ curl -v http://localhost:3000/health_check
> GET /health_check HTTP/1.1
> Host: localhost:3000
> User-Agent: curl/7.81.0
> Accept: */*
< HTTP/1.1 200 OK
< content-length: 0
< date: Tue, 21 Mar 2023 00:16:43 GMT
```

In the last command, the *verbose* flag shows us what we sent to the server, and what came back.
We expected a "200 OK" status and a zero-length body, and that's what we got.

In order to unit-test a web server, we must spawn a copy of it to exercise its functions. We'll use Tokio's `spawn` function to create a new server, use hyper to request data from the server, and finally Rust's own native test asserts to check that we got what we expected.

```
mod tests {
    use super::*;
    use axum::{
        body::Body,
        http::{Request, StatusCode},
    };
    use std::net::{SocketAddr, TcpListener};

    #[tokio::test]
    async fn the_real_deal() {
        let listener =
            TcpListener::bind("127.0.0.1:0".parse::<SocketAddr>().unwrap()).unwrap();
        let addr = listener.local_addr().unwrap();

        tokio::spawn(async move {
            axum::Server::from_tcp(listener)
                .unwrap()
                .serve(app().into_make_service())
                .await
                .unwrap();
        });

        let response = hyper::Client::new()
            .request(
                Request::builder()
                    .uri(format!("http://{}/", addr))
                    .body(Body::empty())
                    .unwrap(),
            )
            .await
            .unwrap();

        assert_eq!(response.status(), StatusCode::OK);
        let body = hyper::body::to_bytes(response.into_body()).await.unwrap();
        assert_eq!(&body[..], b"Hello World!\n");
    }
}
```

One interesting trick to observe in this test is the port number specified in the `TcpListener` call: it's zero. When the port is zero, the `TcpListener` requests the first free port from the kernel. Normally you'd want to know exactly what port to call the server on, but here both ends of the communication learn the port at runtime, and we want to ensure the port isn't hard-coded and inconveniently already in use by someone else.
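The port-zero trick can be seen in isolation with nothing but the standard library:

```rust
use std::net::TcpListener;

fn main() {
    // Port 0 asks the kernel to pick the first free port.
    let listener = TcpListener::bind("127.0.0.1:0").unwrap();
    // The real port shows up in local_addr(); the kernel never hands back 0.
    let port = listener.local_addr().unwrap().port();
    assert_ne!(port, 0);
    println!("kernel chose port {}", port);
}
```

Run it twice and you'll usually see two different ports, which is exactly why the test reads the port back instead of assuming one.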
2023-03-21 00:31:39 +00:00
]
[[package]]
name = "tokio-macros"
version = "1.8.2"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "d266c00fde287f55d3f1c3e96c500c362a2b8c695076ec180f27918820bc6df8"
dependencies = [
"proc-macro2",
"quote",
"syn 1.0.109",
]
Implemented the forms reader, config, and database migrations.

This chapter introduces the Actix "Extractors" for retrieving form data. I've added tests to the `./tests` folder to attempt to interact with those extractors; as of this commit, a89cbe5b, they fail because the example code isn't there. What is there is a variant of the "Hello, World!" code from the previous exercises (section 3.5), which uses the Actix extractor:

``` rust
// Actix, *not* Axum. Does not work with the current framework.
fn index(form: web::Form<FormData>) -> String {
    format!("Welcome {}!", form.username)
}
```

Translated and polished into Axum, it becomes:

``` rust
pub async fn index(payload: Option<Form<FormData>>) -> impl IntoResponse {
    let username = payload.map_or("World".to_string(), move |index| -> String {
        String::from(&(index.username))
    });
    (StatusCode::OK, format!("Hello, {}!\n", &username))
}
```

The Axum version is a little smarter, providing a default "World" if you don't specify a name. That's what `.map_or` does, although the `or` part actually comes first in the argument list. So the result is:

``` sh
$ curl http://localhost:3000/
Hello, World!
$ curl 'http://localhost:3000/?username=Spock'
Hello, Spock!
```

Which is more or less the version we want.

**Section 3.7.3** then goes into some detail on the implementation of a Trait. A Trait in Rust is like an Interface in other languages; it describes a collection of functions for manipulating the values found in a defined Type.

**Types**: A Type is just a description of the value: `u16` is a sixteen-bit unsigned integer; `char` is a Unicode scalar value whose size is always 32 bits; but a `String` is a bunch of things: a buffer of UTF-8 bytes (not an array of `char`!), a length for that string, and a capacity. If the String is manipulated to exceed the capacity, the buffer is re-allocated with a new capacity and the old contents copied into it.
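Both of those mechanics, `map_or` taking its default first, and `String` growing past its capacity, are easy to verify with plain Rust, no framework involved:

```rust
fn main() {
    // Option::map_or takes the default ("or") value FIRST, the mapping closure second.
    let named: Option<&str> = Some("Spock");
    let anon: Option<&str> = None;
    assert_eq!(named.map_or("World".to_string(), |n| n.to_string()), "Spock");
    assert_eq!(anon.map_or("World".to_string(), |n| n.to_string()), "World");

    // A String carries a length and a capacity; growing past the capacity reallocates.
    let mut s = String::with_capacity(4);
    let before = s.capacity();
    s.push_str("exceeds four bytes"); // 18 bytes, more than we reserved
    assert!(s.capacity() > before); // the buffer grew
    println!("capacity went from {} to {}", before, s.capacity());
}
```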
A `Vec<String>` is an array of Strings; as a Type, it is considered to have a single value: whatever is in it at the moment you use it.

**Trait**: A Trait defines a collection of one or more functions that can operate on a value. The truly nifty thing about Traits is that they can be implemented after the fact. By importing a Trait and an implementation of that trait specific to a type into a module containing that type, you can extend the behavior of a type in a deterministic way without having to modify or inherit its code, as you would in an object-oriented language.

Axum has a valuable trait, `FromRequest`. For any structure you can imagine passing from the client to the server, you can implement `FromRequest` for that structure, and any content in the body of the message will be transformed into it.

We've seen a trait before: `IntoResponse`, written as `impl IntoResponse`, is the return type of many of the functions that produce responses for our application server. That return type instructs Rust to look in the current lexical scope and determine whether an `IntoResponse` implementation exists for the value the function returns. If it does, the value can be returned, because Axum is now assured that there exists a function to convert that value into something streamable and usable as an HTTP response.

Fortunately for us, Axum has already implemented `FromRequest` for all the native data types, as well as some structures and arrays. Even better, it has implemented `FromRequest` for anything Serde, the serialization/deserialization library, can deserialize; our struct just has to derive `Deserialize`. So in this example:

``` rust
#[derive(serde::Deserialize)]
pub struct FormData {
    username: String,
}

pub async fn index(payload: Option<Form<FormData>>) -> impl IntoResponse { ...
```

A `Form` (something using the `application/x-www-form-urlencoded` protocol) carrying `FormData` will automatically be converted into a `payload` object of `FormData { username: "Spock" }`, in this case wrapped in `Some()`.
(Or `None`, if there was no form included.)

<aside>So far, there's not too much bloat in this product; with all the debugging symbols it's 60MB or so, but stripped to the bone it's only 3.1MB, tolerable for modern deployments.</aside>

First, though, we must adjust our `valid_subscription` test:

``` rust
let body = "name=le%20guin&email=ursula_le_guin%40gmail.com";
let response = hyper::Client::new()
    .request(
        Request::builder()
            .method("POST")
            .header("content-type", "application/x-www-form-urlencoded")
            .uri(format!("http://{}/subscriptions", addr))
            .body(Body::from(body))
            .unwrap(),
    )
    .await
    .expect("Failed to execute request.");
```

Two updates relative to the book: first, we're sending the request via POST instead of GET. This is the correct way to do things; a GET should never (and I mean *never*) cause a change of state on the back-end. To send something new that the server will process and store, you use a POST. (To update something, or to send something to a known *and unique* URI, PUT is better.) Secondly, since we're using a generic form-data object, we need to set the content-type header on the client so that the server knows how to unpack the payload. The `%20` and `%40` markers in the `body` are the URL-encoded space and `@` respectively.

I completely ignored the advice in the book and went instead with [Dbmate](https://github.com/amacneil/dbmate). Dbmate is a bit cranky: your SQL must be nestled right up against the 'up' and 'down' markers in the migration file, and it seems to be quite opinionated about everything being lowercase.
That said, it was trivial to create a database with it:

``` sh
$ dbmate new create_subscriptions_table
```

This will create the file `db/migrations/20230322174957_create_subscriptions_table.sql` (the timestamp will be different, obviously), and in this file you put the following, as specified in the book:

``` sql
-- migrate:up
create table subscriptions (
    id uuid not null,
    primary key (id),
    email text not null unique,
    name text not null,
    subscribed_at timestamptz not null
);

-- migrate:down
drop table subscriptions;
```

To use Dbmate, you have to specify how to connect. I'm using Postgres, so let's start with creating a database and a user for it:

``` sh
$ sudo -u postgres psql
[sudo] password for user: ...................
postgres=# create database newsletter;
CREATE DATABASE
postgres=# create user newsletter with encrypted password 'redacted';
CREATE USER
postgres=# grant all privileges on database newsletter to newsletter;
GRANT
postgres=# exit
```

In your project root, create a `.env` file to specify your connection:

``` sh
DATABASE_URL="postgres://newsletter:redacted@127.0.0.1:5432/newsletter?sslmode=disable"
```

The `sslmode` flag there is necessary for localhost connections: Dbmate assumes an encrypted connection by default, but we're isolating to a local connection that is, usually, safe. With the new entry in your `.env` file, you can now run a migration:

``` sh
$ dbmate up
Writing: ./db/schema.sql
```

Running `dbmate up` will automatically create the database for you if it hasn't already been created; `dbmate migrate` also performs migrations, but it will not create the database.
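That `DATABASE_URL` has several moving parts. A tiny helper of my own devising (hypothetical; not part of Dbmate or the book's code) shows how the pieces fit together:

```rust
// Hypothetical helper: assemble a Postgres connection URL from its parts,
// so user/password/host/port/database aren't duplicated across tooling.
fn database_url(user: &str, password: &str, host: &str, port: u16, db: &str) -> String {
    // sslmode=disable matches the local, unencrypted connection used above.
    format!("postgres://{user}:{password}@{host}:{port}/{db}?sslmode=disable")
}

fn main() {
    let url = database_url("newsletter", "redacted", "127.0.0.1", 5432, "newsletter");
    assert_eq!(
        url,
        "postgres://newsletter:redacted@127.0.0.1:5432/newsletter?sslmode=disable"
    );
    println!("{url}");
}
```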
2023-03-24 14:51:19 +00:00
[[package]]
name = "tokio-native-tls"
version = "0.3.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "bbae76ab933c85776efabc971569dd6119c580d8f5d448769dec1764bf796ef2"
dependencies = [
"native-tls",
"tokio",
]
[[package]]
name = "tokio-stream"
version = "0.1.12"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "8fb52b74f05dbf495a8fba459fdc331812b96aa086d9eb78101fa0d4569c3313"
dependencies = [
"futures-core",
"pin-project-lite",
"tokio",
]
[[package]]
name = "tokio-util"
version = "0.7.7"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "5427d89453009325de0d8f342c9490009f76e999cb7672d77e46267448f7e6b2"
dependencies = [
"bytes",
"futures-core",
"futures-sink",
"pin-project-lite",
"tokio",
"tracing",
]
Implemented the forms reader, config, and database migrations. This chapter introduces the Actix "Extractors" for retrieving form data. I've added tests to the `./tests` folder to attempt to interact with those extractors; as of this commit, a89cbe5b, they fail because the example code isn't there. What is there is a variant of the "Hello, World!" code from the previous exercises (section 3.5), which uses the Actix extractor: ``` rust // Actix, *not* Axum. Does not work with the current framework. fn index(form: web::Form<FormData>) -> String { format!('Welcome {}!', form.username) } ``` Translated and polished into Axum, it translates to: ``` rust pub async fn index(payload: Option<Form<FormData>>) -> impl IntoResponse { let username = payload.map_or("World".to_string(), move |index| -> String { String::from(&(index.username)) }); (StatusCode::OK, format!("Hello, {}!\n", &username)) } ``` The Axum version is a little smarter, providing a default "World!" if you don't specify a name. That's what `.map_or` does, although the `or` part actually comes first in the function. So the result is: ``` sh $ curl http://localhost:3000/ Hello, World! $ curl 'http://localhost:3000/?username=Spock' Hello, Spock! ``` Which is more or less the version we want. **Section 3.7.3** then goes into some detail on the implementation of a Trait. A Trait in Rust is like an Interface in other languages; it describes a collection of functions for manipulating the values found in a defined Type. **Types**: A Type is just a description of the value: `u16` is a sixteen-bit unsigned integer; `char` is a Unicode character, and it's size is always 32bits, but a `String` is a bunch of things: it's an array of characters (which are not always `char`!), a length for that string and a capacity. If the String is manipulated to exceed the capacity, the array is re-allocated to a new capacity and the old array copied into it. 
A `Vec<String>` is an array of Strings; as a Type, it is considered to have a single value: whatever is in it at the moment you use it. **Trait**: A Trait defines a collection of one or more functions that can operate on a value. The truly nifty thing about Traits is that they can be implemented after the fact. By importing a Trait and an implementation of that trait specific to a type into a module containing that type, you can extend the behavior of a type in a deterministic way without having to modify or inherit the code, as you would in an object-oriented language. Axum has a valuable trait, `FromRequest`. For any structure you can imagine passing from the client to the server, you can implement `FromRequest` for that object and any content in the body of the message will be transformed into that structure. We've seen a trait before: `IntoResponse`, written as `impl IntoResponse`, and is the output (the return type) of many of the functions that produce return values for our application server. In this case the return type instructs Rust to look in the current lexical scope and, for the value returned by that function, determine if an `IntoResponse` trait has been defined for it. If it has, the value will be returned because Axum has now been assured that there exists a function to convert that value into something streamable and usable as an HTTP response. Fortunately for us, Axum has already implemented `FromRequest` for all the native data types, as well as some structures and arrays. Even better, it has implemented `FromRequest` for the Serde serialization/deserialization library. So in this example: ``` rust pub struct FormData { username: String, } pub async fn index(payload: Option<Form<FormData>>) -> impl IntoResponse { ... ``` A `Form` (something using the `application/x-www-form-urlencoded` protocol) of `FormData` will automatically be converted into a `payload` object of `{ username: "Spock" )`, and in this case wrapped in a `Some()` handler. 
(Or `None`, if there was no form included.) <aside>So far, there's not too much bloat in this product; with all the debugging symbols, it's 60MB or so, but stripped to the bone it's only 3.1MB, tolerable for modern deployments.</aside> First, though, we must adjust our `valid_subscription` test: ``` rust let body = "name=le%20guin&email=ursula_le_guin%40gmail.com"; let response = hyper::Client::new() .request( Request::builder() .method("POST") .header("content-type", "application/x-www-form-urlencoded") .uri(format!("http://{}/subscriptions", addr)) .body(Body::from(body)) .unwrap(), ) .await .expect("Failed to execute request."); ``` Two updates from the book: first, we're sending it via POST instead of GET. This is the correct way to do things; a GET should never (and I mean *never*) cause a change of state on the back-end. To send something new that the server will process and store, you use a POST. (To update something, or to send something to a known *and unique* URI, PUT is better.) Secondly, since we're using a generic form-data object, we need to set the content-type on the client so that the server is informed of how to unpack this payload. The '%20' and '%40' markers in the `body` are the space and the `@` respectively. I completely ignored the advice in the book and went instead with [Dbmate](https://github.com/amacneil/dbmate); Dbmate is a bit cranky; your SQL must be very much nestled against the 'up' and 'down' markers in the migration file, and it seems to be quite opinionated about everything being lowercase. 
That said, it was trivial to create a database with it: ``` sh $ dbmate new create_subscriptions_table ``` This will create the folder `db/migrations/20230322174957_create_subscriptions_table.sql`, (The timestamp will be different, obviously), and in this file you put the following, as specified in the book: ``` sql -- migrate:up create table subscriptions ( id uuid not null, primary key (id), email text not null unique, name text not null, subscribed_at timestamptz not null ); -- migrate:down drop table subscriptions; ``` To use Dbmate, you have to specify how to connect. I'm using Postgres, so let's start with creating a database and a user for it: ``` sh $ sudo -u postgres psql [sudo] possword for user: ................... postgres=# create database newsletter; CREATE DATABASE postgres=# create user newletter with encrypted password 'redacted'; CREATE USER postgres=# grant all privileges on database newsletter to newsletter; GRANT postgres=# exit ``` In your project root, create a `.env` file to specify your connection: ``` sh DATABASE_URL="postgres://newsletter:redacted@127.0.0.1:5432/newsletter?sslmode=disable" ``` The `sslmode` flag there is necessary for localhost connections, as Dbmate assumes an encrypted connection by default, but we're isolating to a local connection that is, usually, safe. With the new entry in your `.env` file, you can now run a migration: ``` sh $ dbmate up Writing: ./db/schema.sql ``` Running `dbmate up` will automatically create the database for you if it hasn't already; `dbmate migrate` also performs migrations, but it will not create the database. 
Now you can re-connect to Postgres as the newsletter user and see what you've got:

```sh
$ psql --user newsletter -h localhost --password
Password:
psql (14.7 (Ubuntu 14.7-0ubuntu0.22.04.1), server 11.7 (Ubuntu 11.7-0ubuntu0.19.10.1))

newsletter=> \d
              List of relations
 Schema |       Name        | Type  |   Owner
--------+-------------------+-------+------------
 public | schema_migrations | table | newsletter
 public | subscriptions     | table | newsletter
(2 rows)

newsletter=> \d subscriptions
                    Table "public.subscriptions"
    Column     |           Type           | Collation | Nullable | Default
---------------+--------------------------+-----------+----------+---------
 id            | uuid                     |           | not null |
 email         | text                     |           | not null |
 name          | text                     |           | not null |
 subscribed_at | timestamp with time zone |           | not null |
Indexes:
    "subscriptions_pkey" PRIMARY KEY, btree (id)
    "subscriptions_email_key" UNIQUE CONSTRAINT, btree (email)
```

Note that Dbmate has allocated a table to itself, `schema_migrations`, for tracking what it's done to your system and when. Try not to conflict with it, okay?

Every complex app has a configuration, and there are plenty of different ways the configuration can be specified: environment variables, internal defaults, and configuration files, the last of which come in so many different flavors. Rust has a well-known [config](https://docs.rs/config/latest/config/index.html) crate that supports all the most common formats: YAML, JSON, TOML; you can even add your own by writing something that implements the `config::Format` trait. Add it to Cargo.toml:

```sh
$ cargo add config
```

For the meantime, we're just going to create a new file, called `configuration.rs`, and put our configuration details in there. Right now we have a single configuration detail: the port. I'm going to go above and beyond Lucas here and configure some internal defaults for my code. It will have expectations.
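The layering idea the config crate implements — a built-in default, overridden by an external source when one is present — can be sketched with nothing but std. The environment-variable name and helper below are invented for illustration, not from the config crate:

```rust
use std::env;

/// A std-only stand-in for configuration layering: take the externally
/// supplied value if it parses, otherwise fall back to the default.
fn resolve_port(env_value: Option<&str>, default_port: u16) -> u16 {
    env_value
        .and_then(|v| v.parse::<u16>().ok())
        .unwrap_or(default_port)
}

fn main() {
    // "ZTP_PORT" is a hypothetical variable name for this sketch.
    let port = resolve_port(env::var("ZTP_PORT").ok().as_deref(), 3001);
    println!("port = {}", port);
}
```

The config crate generalizes this pattern across files, environment variables, and programmatic defaults, with the same "later sources win" semantics.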
First, you have to tell Serde that there will be default values (the struct needs `Deserialize` for the config crate to fill it, and `serde(default)` to fall back when a field is absent):

```rust
use config::Config;
use serde::Deserialize;

#[derive(Deserialize)]
#[serde(default)] // missing fields come from Default::default()
pub struct Settings {
    pub port: u16,
}
```

Then, you have to set those default values. Fortunately, Rust provides a "set default values" trait named, sensibly enough, Default:

```rust
impl Default for Settings {
    fn default() -> Self {
        Settings { port: 3001 }
    }
}
```

Again, exceeding the book's parameters, I'm going to say that if the file is missing, the default parameters should hold:

```rust
pub(crate) fn get_configuration() -> Result<Settings, config::ConfigError> {
    Config::builder()
        .add_source(config::File::with_name("./ztd.config").required(false))
        .build()?
        .try_deserialize()
}
```

And since this is the first time I'm doing this, I'm going to write a test to assert that my understanding of how this all works is correct:

```rust
#[cfg(test)]
mod tests {
    use super::*;

    #[test]
    fn test_for_defaults() {
        let maybe_config = get_configuration();
        assert!(maybe_config.is_ok());
        let config = maybe_config.unwrap();
        assert_eq!(config.port, 3001);
    }
}
```
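The fall-back-to-`Default` behavior can be isolated from the config and serde machinery entirely; this std-only sketch of the same pattern uses a hypothetical `load` helper standing in for `get_configuration`:

```rust
#[derive(Debug, PartialEq)]
struct Settings {
    port: u16,
}

impl Default for Settings {
    fn default() -> Self {
        Settings { port: 3001 }
    }
}

/// Stand-in for get_configuration(): when no config source was found,
/// the Default impl holds. (The file/serde wiring is elided.)
fn load(source: Option<u16>) -> Settings {
    match source {
        Some(port) => Settings { port },
        None => Settings::default(),
    }
}

fn main() {
    println!("{:?}", load(None));
}
```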
2023-03-24 14:51:19 +00:00
[[package]]
name = "toml"
version = "0.5.11"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "f4f7f0dd8d50a853a531c426359045b1998f04219d88799810762cd4ad314234"
dependencies = [
"serde",
]
This is straightforward and familiar territory:

```rust
fn app() -> Router {
    Router::new()
        .route("/", get(anon_greet))
        .route("/:name", get(greet))
        .route("/health_check", get(health_check))
}
```

We then define a function to *run* the core server:

```rust
pub async fn run() {
    let addr = SocketAddr::from(([127, 0, 0, 1], 3000));
    tracing::info!("listening on {}", addr);
    axum::Server::bind(&addr)
        .serve(app().into_make_service())
        .await
        .unwrap()
}
```

And finally, in a file named `src/main.rs`, we instantiate the server:

```rust
use ztp::run;

#[tokio::main] // an async main needs a runtime; Tokio's macro supplies one
async fn main() {
    run().await
}
```

To make this "work," we need to define what `ztp` means, and make a distinction between the library and the CLI program. In the project root's `Cargo.toml` file, the first three sections are needed to define these relationships:

```toml
[package]
name = "ztp"
version = "0.1.0"
edition = "2021"

[lib]
path = "src/lib.rs"

[[bin]]
path = "src/main.rs"
name = "ztp"
```

It is the `[package.name]` entry that defines how the `use` statement in `main.rs` will find the library. The `[[bin]]` clause defines the name of the binary when it is generated. <aside>The double brackets around the `[[bin]]` clause are there to emphasize to the TOML parser that there can be more than one binary. There can be only one library per package, but it is possible for a Rust project to have more than one package, called "crates," per project.</aside>

This project should now be runnable. In one window, type:

```sh
$ cargo run
```

And in another, type and see the replies:

```sh
$ curl http://localhost:3000/
Hello World!
$ curl http://localhost:3000/Jim
He's dead, Jim!
$ curl -v http://localhost:3000/health_check
> GET /health_check HTTP/1.1
> Host: localhost:3000
> User-Agent: curl/7.81.0
> Accept: */*
>
< HTTP/1.1 200 OK
< content-length: 0
< date: Tue, 21 Mar 2023 00:16:43 GMT
```

In the last command, the *verbose* flag shows us what we sent to the server, and what came back.
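The dispatch the router performs can be mimicked with a plain function to build intuition: exact routes match first, then the `/:name` capture takes whatever is left. A toy stand-in, nothing like axum's internals:

```rust
/// A toy router mirroring the three routes above: exact paths first,
/// then a single "/:name"-style catch-all segment.
fn route(path: &str) -> String {
    match path {
        "/" => "Hello World!\n".to_string(),
        "/health_check" => String::new(), // 200 OK with an empty body
        _ => {
            let name = path.trim_start_matches('/');
            format!("He's dead, {}!\n", name)
        }
    }
}

fn main() {
    print!("{}", route("/Jim"));
}
```

Real routers keep a trie or table of patterns rather than a match, but the priority order (literal route before parameterized route) is the same idea.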
We expected a "200 OK" flag and a zero-length body, and that's what we got.

In order to unit-test a web server, we must spawn a copy of it in order to exercise its functions. We'll use Tokio's `spawn` function to create a new server, use hyper to request data from the server, and finally Rust's own native test asserts to check that we got what we expected.

```rust
#[cfg(test)]
mod tests {
    use super::*;
    use axum::{
        body::Body,
        http::{Request, StatusCode},
    };
    use std::net::{SocketAddr, TcpListener};

    #[tokio::test]
    async fn the_real_deal() {
        let listener = TcpListener::bind("127.0.0.1:0".parse::<SocketAddr>().unwrap()).unwrap();
        let addr = listener.local_addr().unwrap();

        tokio::spawn(async move {
            axum::Server::from_tcp(listener)
                .unwrap()
                .serve(app().into_make_service())
                .await
                .unwrap();
        });

        let response = hyper::Client::new()
            .request(
                Request::builder()
                    .uri(format!("http://{}/", addr))
                    .body(Body::empty())
                    .unwrap(),
            )
            .await
            .unwrap();

        let body = hyper::body::to_bytes(response.into_body()).await.unwrap();
        assert_eq!(&body[..], b"Hello World!\n");
    }
}
```

One interesting trick to observe in this test is the port number specified in the `TcpListener` call: it's zero. When the port is zero, the `TcpListener` will request the first free port from the kernel. Normally, you'd want to know exactly what port to call the server on, but in this case both ends of the communication are aware of the port to use, and we want to ensure that port isn't hard-coded and inconveniently already in use by someone else.
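The port-0 trick is pure std and easy to see in isolation (the `ephemeral_port` helper name is mine):

```rust
use std::net::TcpListener;

/// Bind to port 0 and report the port the kernel actually assigned.
fn ephemeral_port() -> u16 {
    let listener = TcpListener::bind("127.0.0.1:0").expect("bind failed");
    listener.local_addr().expect("no local addr").port()
}

fn main() {
    // Two simultaneously live port-0 listeners always get distinct ports.
    let a = TcpListener::bind("127.0.0.1:0").unwrap();
    let b = TcpListener::bind("127.0.0.1:0").unwrap();
    println!(
        "kernel chose {} and {}",
        a.local_addr().unwrap().port(),
        b.local_addr().unwrap().port()
    );
}
```

This is why the test can run in parallel with other tests, or with a dev server already occupying port 3000, without ever colliding.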
2023-03-21 00:31:39 +00:00
[[package]]
name = "tower"
version = "0.4.13"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "b8fa9be0de6cf49e536ce1851f987bd21a43b771b09473c3549a6c853db37c1c"
dependencies = [
"futures-core",
"futures-util",
"pin-project",
"pin-project-lite",
"tokio",
"tower-layer",
"tower-service",
"tracing",
]
Added Telemetry: Logging and Analytics

In Chapter 4, Palmieri focuses on logging and telemetry. Axum is very different from Actix, and my first foray into trying to understand it led me to [Tower](https://docs.rs/tower/latest/tower/), the Rust community's de facto standard for modular networking development and design. I completely short-circuited much of what the book recommended and, instead, just went with the most basic implementation possible. I added the tracing libraries as recommended by the Axum developers, and then implemented the first level of tracing as recommended by Tower:

```sh
$ cargo add --features tower-http/trace,tracing tower tower-http tracing tracing-subscriber
```

And then I updated the app startup code to include it:

```rust
pub async fn app(configuration: &Settings) -> Router {
    tracing_subscriber::fmt::init();

    let pool = PgPoolOptions::new()
        .max_connections(50)
        .connect(&configuration.database.url())
        .await
        .expect("could not connect to database_url");
    routes().layer(Extension(pool)).layer(TraceLayer::new_for_http())
}
```

That is literally all that was needed. And the output is:

```plaintext
2023-03-25T16:49:06.385563Z DEBUG request{method=GET uri=/ version=HTTP/1.1}: tower_http::trace::on_request: started processing request
2023-03-25T16:49:06.386270Z DEBUG request{method=GET uri=/ version=HTTP/1.1}: tower_http::trace::on_response: finished processing request latency=0 ms status=200
```

That's not great logging, but it's a start. As I understand it, `tracing_subscriber::fmt::init()` builds the formatter and installs it as the process-wide default subscriber, a global static; that's why it doesn't appear to be saved or stored anywhere locally. The deeper Rust gets, the wilder it seems.

What I did manage was to create, [as recommended by Chris Allen](https://bitemyapp.com/blog/notes-on-zero2prod-rust/), a very simple Layer that shoves a new object into the collection of data being passed around by the request. That object contains a unique UUID for the session being processed.
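Tower's layering model can be sketched with plain closures: a "service" maps a request to a response, and a "layer" wraps one service to produce another, adding behavior before and after the inner call. A hand-rolled illustration of that shape — emphatically not Tower's actual `Service`/`Layer` traits:

```rust
// A service maps a request to a response; boxed so layers can nest freely.
type Service = Box<dyn Fn(&str) -> String>;

fn hello_service() -> Service {
    Box::new(|req| format!("handled {}", req))
}

/// A trace-like layer: wraps an inner service and logs around the call,
/// the way TraceLayer's on_request/on_response hooks do.
fn trace_layer(inner: Service) -> Service {
    Box::new(move |req| {
        eprintln!("started processing request: {}", req);
        let resp = inner(req);
        eprintln!("finished processing request: {} -> {} bytes", req, resp.len());
        resp
    })
}

fn main() {
    // Layering leaves the response unchanged; only the side effects differ.
    let app = trace_layer(hello_service());
    println!("{}", app("GET /"));
}
```

Stacking more layers is just more wrapping, which is why the order of `.layer(...)` calls matters: the last wrap runs first on the way in.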
Since Tokio is a multi-threaded system, having a UUID allows us to trace each individual request from beginning to end... provided I've hooked up my handlers just right. I learned most of this by reading the [Axum Session source code](https://docs.rs/axum-sessions/latest/src/axum_sessions/session.rs.html), which implements something much more complex.

Since we're at a deeper level of the service handling, I need a function that takes a Request and returns a Response, and in the middle inserts a SessionId into the Request passed in; by giving the type a name, any handler can now find and use that SessionId:

```rust
/// In file `session_id.rs`

pub struct SessionId(pub Uuid);

pub async fn session_id<B>(mut req: Request<B>, next: Next<B>) -> Result<Response, StatusCode> {
    req.extensions_mut().insert(SessionId(Uuid::new_v4()));
    Ok(next.run(req).await)
}
```

With that, I now need to add it to the layers initialized with the app object:

```rust
// In lib.rs, `pub async fn app()`:
routes()
    .layer(Extension(pool))
    .layer(TraceLayer::new_for_http())
    .layer(middleware::from_fn(session_id::session_id))
```

And with that, the SessionId is available. Since it's the outermost layer, it can now be used by anything deeper in. Let's add it to the `subscribe` function:

```rust
// In routes/subscribe.rs, `subscribe()`:
pub(crate) async fn subscribe(
    Extension(session): Extension<SessionId>,
    Extension(pool): Extension<PgPool>,
    payload: Option<Form<NewSubscription>>,
) -> Result<(StatusCode, ()), ZTPError> {
    if let Some(payload) = payload {
        // Multi-line strings in Rust. Ugly. Would have preferred a macro.
        let sql = r#"INSERT INTO subscriptions
            (id, email, name, subscribed_at)
            VALUES ($1, $2, $3, $4);"#.to_string();
        let subscription: Subscription = (&(payload.0)).into();
        tracing::info!(
            "request_id {} - Adding '{}' as a new subscriber.",
            session.0.to_string(),
            subscription.name
        );
        // ...
```

And with that, every Request now has a strong ID associated with it:

```plaintext
2023-03-26T22:19:23.305421Z INFO ztp::routes::subscribe: request_id d0f4a6e7-de0d-48bc-902b-713901c1d63b - Adding 'Elf M. Sternberg' as a new subscriber.
```

That's a very noisy trace; I'd like to start knocking it down to something more like a responsible log, or figure out how to format it the way I like. I'm also getting incredibly noisy messages from the `sqlx::query` call, including the text of the SQL template (the `let sql = ...` line above), which I really don't need every time someone makes a request, and which is horribly formatted for principled analytics.

Configuring it to return JSON turned out to be easy, although my first pass puzzled me. I had to turn `json` formatting on as a feature:

```sh
$ cargo add --features=json tracing-subscriber
```

And then it was possible to configure the format:

```rust
// in lib.rs:app()
// ...
let format = tracing_subscriber::fmt::format()
    .with_level(false)        // don't include levels in formatted output
    .with_thread_names(true)  // include the name of the current thread
    .json();
tracing_subscriber::fmt().event_format(format).init();
// ...
```

```json
{
  "timestamp": "2023-03-26T22:53:13.091366Z",
  "fields": {
    "message": "request_id 479014e2-5f13-4e12-8401-34d8f8bf1a18 - Adding 'Elf M. Sternberg' as a new subscriber."
  },
  "target": "ztp::routes::subscribe",
  "threadName": "tokio-runtime-worker"
}
```

This pretty much concludes my week-long foray into Palmieri's book; I'm not going to worry too much about the deployment stuff, since that's part of my daytime job and I'm not interested in going over it again. Overall, this was an excellent book for teaching me many of the basics, and it provides a really good introduction into the way application servers can be written in Rust.
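What the JSON formatter is doing under the hood is mostly string escaping and field nesting. A minimal, std-only sketch of building such a log line by hand (helper names mine; `tracing_subscriber`'s `.json()` does this properly, with full escaping and many more fields):

```rust
/// Escape a string for embedding in a JSON value.
fn json_escape(s: &str) -> String {
    let mut out = String::new();
    for c in s.chars() {
        match c {
            '"' => out.push_str("\\\""),
            '\\' => out.push_str("\\\\"),
            '\n' => out.push_str("\\n"),
            c if (c as u32) < 0x20 => out.push_str(&format!("\\u{:04x}", c as u32)),
            c => out.push(c),
        }
    }
    out
}

/// Build a tiny structured log line with a target and a message field.
fn log_line(target: &str, message: &str) -> String {
    format!(
        "{{\"target\":\"{}\",\"fields\":{{\"message\":\"{}\"}}}}",
        json_escape(target),
        json_escape(message)
    )
}

fn main() {
    println!(
        "{}",
        log_line("ztp::routes::subscribe", "Adding 'Elf M. Sternberg' as a new subscriber.")
    );
}
```

The value of emitting one JSON object per line is that downstream analytics can parse fields instead of regex-scraping free text.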
I disagree with the premise that "the language doesn't mean anything to the outcome," as I've heard some people say, nor do I think using Rust is some kind of badge of honor. Instead, I think it's a mark of a responsible developer, one who can produce code that works well the first time, and with some hard thinking about how types work (and some heavy-duty exposure to Haskell), Rust development can be your first thought, not your "I need speed!" thought, when developing HTTP-based application servers.
2023-03-26 23:03:24 +00:00
[[package]]
name = "tower-http"
version = "0.4.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "5d1d42a9b3f3ec46ba828e8d376aec14592ea199f70a06a548587ecd1c4ab658"
dependencies = [
"bitflags",
"bytes",
"futures-core",
"futures-util",
"http",
"http-body",
"http-range-header",
"pin-project-lite",
"tower-layer",
"tower-service",
"tracing",
]
[[package]]
name = "tower-layer"
version = "0.3.2"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "c20c8dbed6283a09604c3e69b4b7eeb54e298b8a600d4d5ecb5ad39de609f1d0"
[[package]]
name = "tower-service"
version = "0.3.2"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "b6bc1c9ce2b5135ac7f93c72918fc37feb872bdc6a5533a8b85eb4b86bfdae52"
[[package]]
name = "tracing"
version = "0.1.37"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "8ce8c33a8d48bd45d624a6e523445fd21ec13d3653cd51f681abf67418f54eb8"
dependencies = [
"cfg-if",
"log",
"pin-project-lite",
"tracing-attributes",
"tracing-core",
]
[[package]]
name = "tracing-attributes"
version = "0.1.23"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "4017f8f45139870ca7e672686113917c71c7a6e02d4924eda67186083c03081a"
dependencies = [
"proc-macro2",
"quote",
"syn 1.0.109",
]
[[package]]
name = "tracing-core"
version = "0.1.30"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "24eb03ba0eab1fd845050058ce5e616558e8f8d8fca633e6b163fe25c797213a"
dependencies = [
"once_cell",
"valuable",
]
[[package]]
name = "tracing-log"
version = "0.1.3"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "78ddad33d2d10b1ed7eb9d1f518a5674713876e97e5bb9b7345a7984fbb4f922"
dependencies = [
"lazy_static",
"log",
"tracing-core",
]
[[package]]
name = "tracing-serde"
version = "0.1.3"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "bc6b213177105856957181934e4920de57730fc69bf42c37ee5bb664d406d9e1"
dependencies = [
"serde",
"tracing-core",
]
This is straightforward and familiar territory:

```
fn app() -> Router {
    Router::new()
        .route("/", get(anon_greet))
        .route("/:name", get(greet))
        .route("/health_check", get(health_check))
}
```

We then define a function to *run* the core server:

```
pub async fn run() {
    let addr = SocketAddr::from(([127, 0, 0, 1], 3000));
    tracing::info!("listening on {}", addr);
    axum::Server::bind(&addr)
        .serve(app().into_make_service())
        .await
        .unwrap()
}
```

And finally, in a file named `src/main.rs`, we instantiate the server. Note the `#[tokio::main]` attribute: without it, `async fn main` will not compile:

```
use ztp::run;

#[tokio::main]
async fn main() {
    run().await
}
```

To make this "work," we need to define what `ztp` means, and make a distinction between the library and the CLI program. In the project root's `Cargo.toml` file, the first three sections define these relationships:

```
[package]
name = "ztp"
version = "0.1.0"
edition = "2021"

[lib]
path = "src/lib.rs"

[[bin]]
path = "src/main.rs"
name = "ztp"
```

It is the `package.name` key that defines how the `use` statement in `main.rs` will find the library. The `[[bin]]` clause defines the name of the binary when it is generated.

<aside>The double brackets around the `[[bin]]` clause tell the TOML parser that this is an array of tables: there can be more than one binary. There can be only one library crate per package, though a Rust workspace may contain more than one package.</aside>

This project should now be runnable. In one window, type:

```
$ cargo run
```

And in another, type and see the replies:

```
$ curl http://localhost:3000/
Hello World!
$ curl http://localhost:3000/Jim
He's dead, Jim!
$ curl -v http://localhost:3000/health_check
> GET /health_check HTTP/1.1
> Host: localhost:3000
> User-Agent: curl/7.81.0
> Accept: */*
< HTTP/1.1 200 OK
< content-length: 0
< date: Tue, 21 Mar 2023 00:16:43 GMT
```

In the last command, the *verbose* flag shows us what we sent to the server, and what came back.
We expected a "200 OK" status and a zero-length body, and that's what we got.

In order to unit-test a web server, we must spawn a copy of it to exercise its functions. We'll use Tokio's `spawn` function to create a new server, use hyper to request data from the server, and finally Rust's own native test asserts to check that we got what we expected:

```
mod tests {
    use super::*;
    use axum::{
        body::Body,
        http::{Request, StatusCode},
    };
    use std::net::{SocketAddr, TcpListener};

    #[tokio::test]
    async fn the_real_deal() {
        let listener = TcpListener::bind("127.0.0.1:0".parse::<SocketAddr>()
            .unwrap()).unwrap();
        let addr = listener.local_addr().unwrap();
        tokio::spawn(async move {
            axum::Server::from_tcp(listener)
                .unwrap().serve(app().into_make_service()).await.unwrap();
        });
        let response = hyper::Client::new()
            .request(
                Request::builder().uri(format!("http://{}/", addr))
                    .body(Body::empty()).unwrap(),
            )
            .await
            .unwrap();
        let body = hyper::body::to_bytes(response.into_body()).await.unwrap();
        assert_eq!(&body[..], b"Hello World!\n");
    }
}
```

One interesting trick to observe in this test is the port number specified in the `TcpListener` call. It's zero. When the port is zero, the `TcpListener` asks the kernel for the first free port. Normally, you'd want to know exactly what port to call the server on, but in this case both ends of the communication learn the assigned port at runtime, and we want to ensure that the port isn't hard-coded and inconveniently already in use by someone else.
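The ephemeral-port trick stands on its own, independent of axum; a minimal std-only sketch:

``` rust
use std::net::TcpListener;

fn main() {
    // Binding to port 0 asks the kernel for the first free port;
    // local_addr() then reveals which port was actually assigned.
    let listener = TcpListener::bind("127.0.0.1:0").unwrap();
    let port = listener.local_addr().unwrap().port();
    assert_ne!(port, 0, "the kernel replaced 0 with a real port");
    println!("listening on 127.0.0.1:{port}");
}
```

Two test runs will usually print different ports, which is exactly what we want: no collisions between concurrently running tests.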
2023-03-21 00:31:39 +00:00
[[package]]
name = "tracing-subscriber"
version = "0.3.16"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "a6176eae26dd70d0c919749377897b54a9276bd7061339665dd68777926b5a70"
dependencies = [
"nu-ansi-term",
Added Telemetry: Logging and Analytics

In Chapter 4, Palmieri focuses on logging and telemetry. Axum is very different from Actix, and my first foray into trying to understand it led me to [Tower](https://docs.rs/tower/latest/tower/), the Rust community's de-facto standard for modular networking development and design. I completely short-circuited much of what the book recommended and, instead, just went with the most basic implementation possible. I added the tracing libraries as recommended by the Axum developers, and then implemented the first level of tracing as recommended by Tower:

``` sh
$ cargo add tower tower-http tracing tracing-subscriber --features tower-http/trace,tracing
```

And then I updated the app startup code to include it:

``` rust
pub async fn app(configuration: &Settings) -> Router {
    tracing_subscriber::fmt::init();

    let pool = PgPoolOptions::new()
        .max_connections(50)
        .connect(&configuration.database.url())
        .await
        .expect("could not connect to database_url");

    routes().layer(Extension(pool)).layer(TraceLayer::new_for_http())
}
```

That is literally all that was needed. And the output is:

``` plaintext
2023-03-25T16:49:06.385563Z DEBUG request{method=GET uri=/ version=HTTP/1.1}: tower_http::trace::on_request: started processing request
2023-03-25T16:49:06.386270Z DEBUG request{method=GET uri=/ version=HTTP/1.1}: tower_http::trace::on_response: finished processing request latency=0 ms status=200
```

That's not great logging, but it's a start. As I understand it, `tracing_subscriber::fmt::init()` initializes the formatter and installs it as the process-wide global default subscriber, which is why it doesn't appear to be saved or stored anywhere locally. The deeper Rust gets, the wilder it seems.

What I did manage was to create, [as recommended by Chris Allen](https://bitemyapp.com/blog/notes-on-zero2prod-rust/), a very simple Layer that shoves a new object into the collection of data being passed around by the request. That object contains a unique UUID for the session being processed.

Since Tokio is a multi-threaded system, having a UUID allows us to trace each individual request from beginning to end... provided I've hooked up my handlers just right. I learned most of this by reading the [Axum Session source code](https://docs.rs/axum-sessions/latest/src/axum_sessions/session.rs.html), which implements something much more complex. Since we're at a deeper level of the service handling, I need a function that takes a Request and returns a Response, and in the middle inserts a SessionId into the Request passed in; by giving the type a name, any handler can now find and use that SessionId:

``` rust
/// In file `session_id.rs`
#[derive(Clone)] // Extension extraction clones the value, so Clone is required.
pub struct SessionId(pub Uuid);

pub async fn session_id<B>(mut req: Request<B>, next: Next<B>) -> Result<Response, StatusCode> {
    req.extensions_mut().insert(SessionId(Uuid::new_v4()));
    Ok(next.run(req).await)
}
```

With that, I now need to add it to the layers initialized with the app object:

```rust
/// In lib.rs, `pub async fn app()`:
routes()
    .layer(Extension(pool))
    .layer(TraceLayer::new_for_http())
    .layer(middleware::from_fn(session_id::session_id))
```

And with that, the SessionId is available. Since it's the outermost layer, it can now be used by anything deeper in. Let's add it to the `subscribe` function:

``` rust
/// In routes/subscribe.rs/subscribe()
pub(crate) async fn subscribe(
    Extension(session): Extension<SessionId>,
    Extension(pool): Extension<PgPool>,
    payload: Option<Form<NewSubscription>>,
) -> Result<(StatusCode, ()), ZTPError> {
    if let Some(payload) = payload {
        // Multi-line strings in Rust. Ugly. Would have preferred a macro.
        let sql = r#"INSERT INTO subscriptions
            (id, email, name, subscribed_at)
            VALUES ($1, $2, $3, $4);"#.to_string();
        let subscription: Subscription = (&(payload.0)).into();
        tracing::info!(
            "request_id {} - Adding '{}' as a new subscriber.",
            session.0.to_string(),
            subscription.name
        );
        // ...
```
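The `extensions` map this middleware leans on is, at heart, a type-keyed map: values are stored and retrieved by their Rust type, which is why wrapping the Uuid in a named `SessionId` newtype is what makes it findable. A std-only sketch of the idea (my own illustration, not axum's actual implementation):

``` rust
use std::any::{Any, TypeId};
use std::collections::HashMap;

// A minimal type-keyed map, in the spirit of the http crate's `Extensions`.
struct Extensions {
    map: HashMap<TypeId, Box<dyn Any>>,
}

impl Extensions {
    fn new() -> Self {
        Extensions { map: HashMap::new() }
    }

    // Store a value under its own type; replaces any earlier value of that type.
    fn insert<T: 'static>(&mut self, value: T) {
        self.map.insert(TypeId::of::<T>(), Box::new(value));
    }

    // Retrieve by type: this is what `Extension<SessionId>` does for a handler.
    fn get<T: 'static>(&self) -> Option<&T> {
        self.map.get(&TypeId::of::<T>()).and_then(|b| b.downcast_ref::<T>())
    }
}

struct SessionId(u128); // stand-in for the Uuid newtype

fn main() {
    let mut ext = Extensions::new();
    ext.insert(SessionId(42));
    let sid = ext.get::<SessionId>().expect("SessionId was inserted above");
    assert_eq!(sid.0, 42);
    println!("found session id {}", sid.0);
}
```

One consequence of type-keying: inserting a second `SessionId` silently replaces the first, so each distinct piece of request state needs its own newtype.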
"serde",
"serde_json",
"sharded-slab",
"smallvec",
"thread_local",
"tracing-core",
"tracing-log",
"tracing-serde",
Zero-to-Production Rust, up to Chapter 3.7. Since this book is about learning Rust, primarily in a microservices environment, this chapter focuses on installing Rust and describing the tools available to the developer. The easiest way to install Rust is to install the [Rustup](https://rustup.rs/) tool. It is one of those blind-trust-in-the-safety-of-the-toolchain things. For Linux and Mac users, the command is a shell script that installs to a user's local account: ``` $ curl --proto '=https' --tlsv1.2 -sSf https://sh.rustup.rs | sh ``` Once installed, you can install Rust itself: ``` $ rustup install toolchain stable ``` You should now have Rust compiler and the Rust build and packaging tool, known as Cargo: ``` $ rustc --version rustc 1.68.0 (2c8cc3432 2023-03-06) $ cargo --version cargo 1.68.0 (115f34552 2023-02-26) ``` I also installed the following tools: ``` $ rustup component add clippy rust-src rust-docs $ cargo install rustfmt rust-analyzer ``` - clippy: A powerful linter that provides useful advice above and beyond the compiler's basic error checking. - rustfmt: A formatting tool that provides a common format for most developers - rust-analyzer: For your IDE, rust-analyzer provides the LSP (Language Server Protocol) for Rust, giving you code completion, on-the-fly error definition, and other luxuries. Zero-to-Production's project is writing a web service that signs people up for an email newsletter. The first task in the book is to set up a "Hello World!" application server. The book uses the [Actix-web](https://actix.rs/) web framework, but I've chosen to implement it using [Axum](https://github.com/tokio-rs/axum) server, the default server provided by the [Tokio](https://github.com/tokio-rs/tokio) asynchronous runtime. Although the book is only two years old, it is already out-of-date with respect to some commands. `cargo add` is now provided by default. 
The following commands installed the tools I'll be using: ``` cargo add --features tokio/full --features hyper/full tokio hyper \ axum tower tracing tracing-subscriber ``` - axum: The web server framework for Tokio. - tokio: The Rust asynchronous runtime. Has single-threaded (select) and multi-threaded variants. - [hyper](https://hyper.rs/): An HTTPS request/response library, used for testing. - [tracing](https://crates.io/crates/tracing): A debugging library that works with Tokio. We start by defining the core services. In the book, they're a greeter ("Hello, World"), a greeter with a parameter ("Hello, {name}"), and a health check (returns a HTTP 200 Code, but no body). Actix-web hands a generic Request and expects a generic request, but Axum is more straightforward, providing `IntoResponse` handlers for most of the basic Rust types, as well as some for formats via Serde, Rust's standard serializing/deserializing library for converting data from one format to another. All of these go into `src/lib.rs`: ``` async fn health_check() -> impl IntoResponse { (StatusCode::OK, ()) } async fn anon_greet() -> &'static str { "Hello World!\n" } async fn greet(Path(name): Path<String>) -> impl IntoResponse { let greeting = String::from("He's dead, ") + name.as_str(); let greeting = greeting + &String::from("!\n"); (StatusCode::OK, greeting) } ``` <aside>Axum's documentation says to [avoid using `impl IntoResponse`](https://docs.rs/axum/latest/axum/response/index.html#regarding-impl-intoresponse) until you understand how it really works, as it can result in confusing issues when chaining response handlers, when a handler can return multiple types, or when a handler can return either a type or a [`Result<T, E>`](https://doc.rust-lang.org/std/result/), especially one with an error.</aside> We then define the routes that our server will recognize. 
This is straightforward and familiar territory: ``` fn app() -> Router { Router::new() .route("/", get(anon_greet)) .route("/:name", get(greet)) .route("/health_check", get(health_check)) } ``` We then define a function to *run* the core server: ``` pub async fn run() { let addr = SocketAddr::from(([127, 0, 0, 1], 3000)); tracing::info!("listening on {}", addr); axum::Server::bind(&addr) .serve(app().into_make_service()) .await .unwrap() } ``` And finally, in a file named `src/main.rs`, we instantiate the server: ``` use ztp::run; async fn main() { run().await } ``` To make this "work," we need to define what `ztp` means, and make a distinction between the library and the CLI program. In the project root's `Cargo.toml` file, the first three sections are needed to define these relationships: ``` [package] name = "ztp" version = "0.1.0" edition = "2021" [lib] path = "src/lib.rs" [[bin]] path = "src/main.rs" name = "ztp" ``` It is the `[package.name]` feature that defines how the `use` statement in `main.rs` will find the library. The `[[bin]]` clause defines the name of the binary when it is generated. <aside>The double brackets around the `[[bin]]` clauses is there to emphasize to the TOML parser that there can be more than one binary. There can be only one library per package, but it is possible for a Rust project to have more than one package, called "crates," per project. </aside> This project should now be runnable. In one window, type: ``` $ cargo run ``` And in another, type and see the replies: ``` $ curl http://localhost:3000/ Hello, World! $ curl http://localhost:3000/Jim He's dead, Jim! $ curl -v http://localhost:3000/health_check > GET /health_check HTTP/1.1 > Host: localhost:3000 > User-Agent: curl/7.81.0 > Accept: */* < HTTP/1.1 200 OK < content-length: 0 < date: Tue, 21 Mar 2023 00:16:43 GMT ``` In the last command, the *verbose* flag shows us what we sent to the server, and what came back. 
We expected a "200 OK" status and a zero-length body, and that's what we got.

To unit-test a web server, we must spawn a copy of it in order to exercise its functions. We'll use Tokio's `spawn` function to create a new server, use hyper to request data from the server, and finally Rust's own native test asserts to check that we got what we expected.

```rust
#[cfg(test)]
mod tests {
    use super::*;
    use axum::{
        body::Body,
        http::{Request, StatusCode},
    };
    use std::net::{SocketAddr, TcpListener};

    #[tokio::test]
    async fn the_real_deal() {
        let listener = TcpListener::bind("127.0.0.1:0".parse::<SocketAddr>().unwrap()).unwrap();
        let addr = listener.local_addr().unwrap();

        tokio::spawn(async move {
            axum::Server::from_tcp(listener)
                .unwrap()
                .serve(app().into_make_service())
                .await
                .unwrap();
        });

        let response = hyper::Client::new()
            .request(
                Request::builder()
                    .uri(format!("http://{}/", addr))
                    .body(Body::empty())
                    .unwrap(),
            )
            .await
            .unwrap();

        let body = hyper::body::to_bytes(response.into_body()).await.unwrap();
        assert_eq!(&body[..], b"Hello World!\n");
    }
}
```

One interesting trick to observe in this test is the port number specified in the `TcpListener` call: it's zero. When the port is zero, the `TcpListener` asks the kernel for the first free port. Normally, you'd want to know exactly what port to call the server on, but in this case both ends of the communication learn the port at runtime, and we want to ensure that port isn't hard-coded and inconveniently already in use by someone else.
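The port-zero trick isn't specific to Axum or Tokio; it works with plain `std::net`. A minimal sketch (the `ephemeral_port` helper is invented for illustration):

```rust
use std::net::TcpListener;

// Bind to port 0: the kernel picks the first free port,
// which we can then read back from the listener itself.
fn ephemeral_port() -> u16 {
    let listener = TcpListener::bind("127.0.0.1:0").expect("bind failed");
    listener.local_addr().expect("no local address").port()
}

fn main() {
    println!("kernel assigned port {}", ephemeral_port());
}
```

Two successive binds will usually hand back different ports, which is exactly what keeps parallel test servers from colliding.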
]
[[package]]
name = "try-lock"
version = "0.2.4"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "3528ecfd12c466c6f163363caf2d02a71161dd5e1cc6ae7b34207ea2d42d81ed"
Implemented the forms reader, config, and database migrations.

This chapter introduces the Actix "Extractors" for retrieving form data. I've added tests to the `./tests` folder to attempt to interact with those extractors; as of this commit, a89cbe5b, they fail because the example code isn't there. What is there is a variant of the "Hello, World!" code from the previous exercises (section 3.5), which uses the Actix extractor:

```rust
// Actix, *not* Axum. Does not work with the current framework.
fn index(form: web::Form<FormData>) -> String {
    format!("Welcome {}!", form.username)
}
```

Translated and polished into Axum, it becomes:

```rust
pub async fn index(payload: Option<Form<FormData>>) -> impl IntoResponse {
    let username = payload.map_or("World".to_string(), |Form(data)| data.username);
    (StatusCode::OK, format!("Hello, {}!\n", &username))
}
```

The Axum version is a little smarter, providing a default "World" if you don't specify a name. That's what `.map_or` does, although the `or` part actually comes first in the function. So the result is:

```sh
$ curl http://localhost:3000/
Hello, World!

$ curl 'http://localhost:3000/?username=Spock'
Hello, Spock!
```

Which is more or less the version we want.

**Section 3.7.3** then goes into some detail on the implementation of a Trait. A Trait in Rust is like an Interface in other languages; it describes a collection of functions for manipulating the values found in a defined Type.

**Types**: A Type is just a description of the value: `u16` is a sixteen-bit unsigned integer; `char` is a Unicode character, and its size is always 32 bits; but a `String` is a bunch of things: a buffer of UTF-8 bytes (which are not fixed-size `char`s!), a length for that string, and a capacity. If the String is manipulated to exceed the capacity, the buffer is re-allocated with a new capacity and the old contents copied into it.
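The argument-order quirk of `map_or` is easier to see in isolation with a plain `Option`; here's a small sketch (the `greet` helper is hypothetical, stripped of all the Axum machinery):

```rust
// Option::map_or takes the default value FIRST, then the mapping
// closure; the default is used when the Option is None.
fn greet(username: Option<&str>) -> String {
    let username = username.map_or("World".to_string(), |n| n.to_string());
    format!("Hello, {}!", username)
}

fn main() {
    println!("{}", greet(None));          // Hello, World!
    println!("{}", greet(Some("Spock"))); // Hello, Spock!
}
```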
A `Vec<String>` is an array of Strings; as a Type, it is considered to have a single value: whatever is in it at the moment you use it.

**Trait**: A Trait defines a collection of one or more functions that can operate on a value. The truly nifty thing about Traits is that they can be implemented after the fact. By importing a Trait and an implementation of that trait specific to a type into a module containing that type, you can extend the behavior of a type in a deterministic way without having to modify or inherit the code, as you would in an object-oriented language.

Axum has a valuable trait, `FromRequest`. For any structure you can imagine passing from the client to the server, you can implement `FromRequest` for that object, and any content in the body of the message will be transformed into that structure.

We've seen a trait before: `IntoResponse`, written as `impl IntoResponse`, which is the output (the return type) of many of the functions that produce return values for our application server. In this case the return type instructs Rust to look in the current lexical scope and, for the value returned by that function, determine if an `IntoResponse` implementation has been defined for it. If it has, the value can be returned, because Axum has now been assured that there exists a function to convert that value into something streamable and usable as an HTTP response.

Fortunately for us, Axum has already implemented `FromRequest` for all the native data types, as well as some structures and arrays. Even better, it has implemented it for anything that works with the Serde serialization/deserialization library. So in this example:

```rust
#[derive(serde::Deserialize)]
pub struct FormData {
    username: String,
}

pub async fn index(payload: Option<Form<FormData>>) -> impl IntoResponse {
    // ...
```

A `Form` (something using the `application/x-www-form-urlencoded` protocol) of `FormData` will automatically be converted into a `payload` object of `{ username: "Spock" }`, in this case wrapped in `Some()`.
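The "implemented after the fact" property is easy to demonstrate with a homegrown trait bolted onto a type we don't own, the same way Axum bolts `IntoResponse` onto Rust's native types. A sketch (the `Describe` trait is invented purely for illustration):

```rust
// A trait defined locally...
trait Describe {
    fn describe(&self) -> String;
}

// ...and implemented for u16, a type from the standard library.
// No modification or inheritance of u16 is required.
impl Describe for u16 {
    fn describe(&self) -> String {
        format!("a sixteen-bit unsigned integer holding {}", self)
    }
}

fn main() {
    // The method is callable wherever the trait is in scope.
    println!("{}", 3000u16.describe());
}
```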
(Or `None`, if there was no form included.)

<aside>So far, there's not too much bloat in this product; with all the debugging symbols, it's 60MB or so, but stripped to the bone it's only 3.1MB, tolerable for modern deployments.</aside>

First, though, we must adjust our `valid_subscription` test:

```rust
let body = "name=le%20guin&email=ursula_le_guin%40gmail.com";
let response = hyper::Client::new()
    .request(
        Request::builder()
            .method("POST")
            .header("content-type", "application/x-www-form-urlencoded")
            .uri(format!("http://{}/subscriptions", addr))
            .body(Body::from(body))
            .unwrap(),
    )
    .await
    .expect("Failed to execute request.");
```

Two updates from the book: first, we're sending it via POST instead of GET. This is the correct way to do things; a GET should never (and I mean *never*) cause a change of state on the back-end. To send something new that the server will process and store, you use a POST. (To update something, or to send something to a known *and unique* URI, PUT is better.) Secondly, since we're using a generic form-data object, we need to set the content-type on the client so that the server is informed of how to unpack this payload. The `%20` and `%40` markers in the `body` are the percent-encoded space and `@`, respectively.

I completely ignored the advice in the book and went instead with [Dbmate](https://github.com/amacneil/dbmate). Dbmate is a bit cranky: your SQL must be nestled right up against the 'up' and 'down' markers in the migration file, and it seems to be quite opinionated about everything being lowercase.
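To make the encoding concrete, here is a hand-rolled decoder for those `%XX` escapes. This is illustrative only and ASCII-only; in real code the form parser (or a crate such as `percent-encoding`) does this for you, and the `percent_decode` helper below is invented for the sketch:

```rust
// Decode %XX escapes in an ASCII form body: '%' followed by two
// hex digits becomes the byte those digits name.
fn percent_decode(s: &str) -> String {
    let bytes = s.as_bytes();
    let mut out = String::new();
    let mut i = 0;
    while i < bytes.len() {
        if bytes[i] == b'%' && i + 2 < bytes.len() {
            if let Ok(byte) = u8::from_str_radix(&s[i + 1..i + 3], 16) {
                out.push(byte as char);
                i += 3;
                continue;
            }
        }
        out.push(bytes[i] as char);
        i += 1;
    }
    out
}

fn main() {
    let body = "name=le%20guin&email=ursula_le_guin%40gmail.com";
    // prints: name=le guin&email=ursula_le_guin@gmail.com
    println!("{}", percent_decode(body));
}
```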
That said, it was trivial to create a database with it:

```sh
$ dbmate new create_subscriptions_table
```

This will create the file `db/migrations/20230322174957_create_subscriptions_table.sql` (the timestamp will be different, obviously), and in this file you put the following, as specified in the book:

```sql
-- migrate:up
create table subscriptions (
    id uuid not null,
    primary key (id),
    email text not null unique,
    name text not null,
    subscribed_at timestamptz not null
);

-- migrate:down
drop table subscriptions;
```

To use Dbmate, you have to specify how to connect. I'm using Postgres, so let's start with creating a database and a user for it:

```sh
$ sudo -u postgres psql
[sudo] password for user: ...................
postgres=# create database newsletter;
CREATE DATABASE
postgres=# create user newsletter with encrypted password 'redacted';
CREATE USER
postgres=# grant all privileges on database newsletter to newsletter;
GRANT
postgres=# exit
```

In your project root, create a `.env` file to specify your connection:

```sh
DATABASE_URL="postgres://newsletter:redacted@127.0.0.1:5432/newsletter?sslmode=disable"
```

The `sslmode` flag there is necessary for localhost connections, as Dbmate assumes an encrypted connection by default, but we're isolating to a local connection that is, usually, safe.

With the new entry in your `.env` file, you can now run a migration:

```sh
$ dbmate up
Writing: ./db/schema.sql
```

Running `dbmate up` will automatically create the database for you if it hasn't already; `dbmate migrate` also performs migrations, but it will not create the database.
Now you can re-connect to Postgres as the newsletter user and see what you've got:

```sh
$ psql --user newsletter -h localhost --password
Password:
psql (14.7 (Ubuntu 14.7-0ubuntu0.22.04.1), server 11.7 (Ubuntu 11.7-0ubuntu0.19.10.1))

newsletter=> \d
              List of relations
 Schema |       Name        | Type  |   Owner
--------+-------------------+-------+------------
 public | schema_migrations | table | newsletter
 public | subscriptions     | table | newsletter
(2 rows)

newsletter=> \d subscriptions
                     Table "public.subscriptions"
    Column     |           Type           | Collation | Nullable | Default
---------------+--------------------------+-----------+----------+---------
 id            | uuid                     |           | not null |
 email         | text                     |           | not null |
 name          | text                     |           | not null |
 subscribed_at | timestamp with time zone |           | not null |
Indexes:
    "subscriptions_pkey" PRIMARY KEY, btree (id)
    "subscriptions_email_key" UNIQUE CONSTRAINT, btree (email)
```

Note that Dbmate has allocated a table to itself, `schema_migrations`, for tracking what it's done to your system and when. Try not to conflict with it, okay?

Every complex app has a configuration, and there are plenty of different ways the configuration can be specified: environment variables, internal defaults, and configuration files, the last of which comes in so many different flavors. Rust has a well-known [config](https://docs.rs/config/latest/config/index.html) crate that supports all the most common formats: YAML, JSON, TOML; you can even add your own by writing something that implements the `config::Format` trait. Add it to `Cargo.toml`:

```sh
$ cargo add config
```

For the meantime, we're just going to create a new file, called `configuration.rs`, and put our configuration details in there. Right now we have a single configuration detail: the port. I'm going to go above and beyond Lucas here and configure some internal defaults for my code. It will have expectations.
First, you have to tell Serde that there will be default values:

```rust
use config::Config;

#[derive(serde::Deserialize)]
#[serde(default)]
pub struct Settings {
    pub port: u16,
}
```

Then, you have to set those default values. Fortunately, Rust provides a "set default values" trait named, sensibly enough, `Default`:

```rust
impl Default for Settings {
    fn default() -> Self {
        Settings { port: 3001 }
    }
}
```

Again, exceeding the book's parameters, I'm going to say that if the file is missing, the default parameters should hold:

```rust
pub(crate) fn get_configuration() -> Result<Settings, config::ConfigError> {
    Config::builder()
        .add_source(config::File::with_name("./ztd.config").required(false))
        .build()?
        .try_deserialize()
}
```

And since this is the first time I'm doing this, I'm going to write a test to assert that my understanding of how this all works is correct:

```rust
#[cfg(test)]
mod tests {
    use super::*;

    #[test]
    fn test_for_defaults() {
        let maybe_config = get_configuration();
        assert!(maybe_config.is_ok());
        let config = maybe_config.unwrap();
        assert_eq!(config.port, 3001);
    }
}
```
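Stripped of the config crate, the fallback behaviour above boils down to the `Default` trait plus an `Option`. A standalone sketch (the `load` helper is hypothetical, standing in for "the file parsed" vs. "the file was missing"):

```rust
#[derive(Debug, PartialEq)]
struct Settings {
    port: u16,
}

// The same Default impl the real code uses.
impl Default for Settings {
    fn default() -> Self {
        Settings { port: 3001 }
    }
}

// Fall back to the defaults when nothing was parsed.
fn load(parsed: Option<Settings>) -> Settings {
    parsed.unwrap_or_default()
}

fn main() {
    assert_eq!(load(None), Settings { port: 3001 });
    assert_eq!(load(Some(Settings { port: 8080 })).port, 8080);
    println!("defaults hold when the file is missing");
}
```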
[[package]]
name = "typenum"
version = "1.16.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "497961ef93d974e23eb6f433eb5fe1b7930b659f06d12dec6fc44a8f554c0bba"
[[package]]
name = "ucd-trie"
version = "0.1.5"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "9e79c4d996edb816c91e4308506774452e55e95c3c9de07b6729e17e15a5ef81"
[[package]]
name = "unicode-bidi"
version = "0.3.13"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "92888ba5573ff080736b3648696b70cafad7d250551175acbaa4e0385b3e1460"
[[package]]
name = "unicode-ident"
version = "1.0.8"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "e5464a87b239f13a63a501f2701565754bae92d243d4bb7eb12f6d57d2269bf4"
[[package]]
name = "unicode-normalization"
version = "0.1.22"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "5c5713f0fc4b5db668a2ac63cdb7bb4469d8c9fed047b1d0292cc7b0ce2ba921"
dependencies = [
"tinyvec",
]
[[package]]
name = "unicode-segmentation"
version = "1.10.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "1dd624098567895118886609431a7c3b8f516e41d30e0643f03d94592a147e36"
[[package]]
name = "unicode-width"
version = "0.1.10"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "c0edd1e5b14653f783770bce4a4dabb4a5108a5370a5f5d8cfe8710c361f6c8b"
[[package]]
name = "unicode_categories"
version = "0.1.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "39ec24b3121d976906ece63c9daad25b85969647682eee313cb5779fdd69e14e"
[[package]]
name = "url"
version = "2.3.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "0d68c799ae75762b8c3fe375feb6600ef5602c883c5d21eb51c09f22b83c4643"
dependencies = [
"form_urlencoded",
"idna",
"percent-encoding",
]
[[package]]
name = "uuid"
version = "1.3.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "1674845326ee10d37ca60470760d4288a6f80f304007d92e5c53bab78c9cfd79"
dependencies = [
"getrandom",
"serde",
]
Implemented the forms reader, config, and database migrations. This chapter introduces the Actix "Extractors" for retrieving form data. I've added tests to the `./tests` folder to attempt to interact with those extractors; as of this commit, a89cbe5b, they fail because the example code isn't there. What is there is a variant of the "Hello, World!" code from the previous exercises (section 3.5), which uses the Actix extractor: ``` rust // Actix, *not* Axum. Does not work with the current framework. fn index(form: web::Form<FormData>) -> String { format!('Welcome {}!', form.username) } ``` Translated and polished into Axum, it translates to: ``` rust pub async fn index(payload: Option<Form<FormData>>) -> impl IntoResponse { let username = payload.map_or("World".to_string(), move |index| -> String { String::from(&(index.username)) }); (StatusCode::OK, format!("Hello, {}!\n", &username)) } ``` The Axum version is a little smarter, providing a default "World!" if you don't specify a name. That's what `.map_or` does, although the `or` part actually comes first in the function. So the result is: ``` sh $ curl http://localhost:3000/ Hello, World! $ curl 'http://localhost:3000/?username=Spock' Hello, Spock! ``` Which is more or less the version we want. **Section 3.7.3** then goes into some detail on the implementation of a Trait. A Trait in Rust is like an Interface in other languages; it describes a collection of functions for manipulating the values found in a defined Type. **Types**: A Type is just a description of the value: `u16` is a sixteen-bit unsigned integer; `char` is a Unicode character, and it's size is always 32bits, but a `String` is a bunch of things: it's an array of characters (which are not always `char`!), a length for that string and a capacity. If the String is manipulated to exceed the capacity, the array is re-allocated to a new capacity and the old array copied into it. 
A `Vec<String>` is an array of Strings; as a Type, it is considered to have a single value: whatever is in it at the moment you use it.

**Trait**: A Trait defines a collection of one or more functions that can operate on a value. The truly nifty thing about Traits is that they can be implemented after the fact. By importing a Trait and an implementation of that trait specific to a type into a module containing that type, you can extend the behavior of a type in a deterministic way without having to modify or inherit the code, as you would in an object-oriented language.

Axum has a valuable trait, `FromRequest`. For any structure you can imagine passing from the client to the server, you can implement `FromRequest` for that structure, and any content in the body of the message will be transformed into it.

We've seen a trait before: `IntoResponse`, written as `impl IntoResponse`, is the output (the return type) of many of the functions that produce return values for our application server. That return type instructs Rust to look in the current lexical scope and determine whether an `IntoResponse` implementation has been defined for the value being returned. If it has, the value can be returned, because Axum is now assured that there exists a function to convert that value into something streamable and usable as an HTTP response.

Fortunately for us, Axum has already implemented `FromRequest` for all the native data types, as well as some structures and arrays. Even better, it has implemented `FromRequest` support for the Serde serialization/deserialization library. So in this example:

```rust
pub struct FormData {
    username: String,
}

pub async fn index(payload: Option<Form<FormData>>) -> impl IntoResponse {
    ...
```

A `Form` (a body using the `application/x-www-form-urlencoded` encoding) of `FormData` will automatically be converted into a `payload` object of `{ username: "Spock" }`, in this case wrapped in `Some()`.
(Or `None`, if there was no form included.)

<aside>So far, there's not too much bloat in this product; with all the debugging symbols, it's 60MB or so, but stripped to the bone it's only 3.1MB, tolerable for modern deployments.</aside>

First, though, we must adjust our `valid_subscription` test:

```rust
let body = "name=le%20guin&email=ursula_le_guin%40gmail.com";
let response = hyper::Client::new()
    .request(
        Request::builder()
            .method("POST")
            .header("content-type", "application/x-www-form-urlencoded")
            .uri(format!("http://{}/subscriptions", addr))
            .body(Body::from(body))
            .unwrap(),
    )
    .await
    .expect("Failed to execute request.");
```

Two updates from the book: first, we're sending it via POST instead of GET. This is the correct way to do things; a GET should never (and I mean *never*) cause a change of state on the back-end. To send something new that the server will process and store, you use a POST. (To update something, or to send something to a known *and unique* URI, PUT is better.) Secondly, since we're using a generic form-data object, we need to set the content-type on the client so that the server knows how to unpack this payload. The `%20` and `%40` markers in the `body` are the URL-encoded space and `@`, respectively.

I completely ignored the advice in the book and went instead with [Dbmate](https://github.com/amacneil/dbmate). Dbmate is a bit cranky: your SQL must be nestled right up against the 'up' and 'down' markers in the migration file, and it seems to be quite opinionated about everything being lowercase.
That said, it was trivial to create a database with it:

```sh
$ dbmate new create_subscriptions_table
```

This will create the file `db/migrations/20230322174957_create_subscriptions_table.sql` (the timestamp will be different, obviously), and in this file you put the following, as specified in the book:

```sql
-- migrate:up
create table subscriptions (
    id uuid not null,
    primary key (id),
    email text not null unique,
    name text not null,
    subscribed_at timestamptz not null
);

-- migrate:down
drop table subscriptions;
```

To use Dbmate, you have to specify how to connect. I'm using Postgres, so let's start with creating a database and a user for it:

```sh
$ sudo -u postgres psql
[sudo] password for user: ...................
postgres=# create database newsletter;
CREATE DATABASE
postgres=# create user newsletter with encrypted password 'redacted';
CREATE USER
postgres=# grant all privileges on database newsletter to newsletter;
GRANT
postgres=# exit
```

In your project root, create a `.env` file to specify your connection:

```sh
DATABASE_URL="postgres://newsletter:redacted@127.0.0.1:5432/newsletter?sslmode=disable"
```

The `sslmode` flag there is necessary for localhost connections, as Dbmate assumes an encrypted connection by default, but we're isolating to a local connection that is, usually, safe. With the new entry in your `.env` file, you can now run a migration:

```sh
$ dbmate up
Writing: ./db/schema.sql
```

Running `dbmate up` will automatically create the database for you if it hasn't already; `dbmate migrate` also performs migrations, but it will not create the database.
Now you can re-connect to Postgres as the newsletter user and see what you've got:

```sh
$ psql --user newsletter -h localhost --password
Password:
psql (14.7 (Ubuntu 14.7-0ubuntu0.22.04.1), server 11.7 (Ubuntu 11.7-0ubuntu0.19.10.1))

newsletter=> \d
               List of relations
 Schema |       Name        | Type  |   Owner
--------+-------------------+-------+------------
 public | schema_migrations | table | newsletter
 public | subscriptions     | table | newsletter
(2 rows)

newsletter=> \d subscriptions
                     Table "public.subscriptions"
    Column     |           Type           | Collation | Nullable | Default
---------------+--------------------------+-----------+----------+---------
 id            | uuid                     |           | not null |
 email         | text                     |           | not null |
 name          | text                     |           | not null |
 subscribed_at | timestamp with time zone |           | not null |
Indexes:
    "subscriptions_pkey" PRIMARY KEY, btree (id)
    "subscriptions_email_key" UNIQUE CONSTRAINT, btree (email)
```

Note that Dbmate has allocated a table to itself, `schema_migrations`, for tracking what it's done to your system and when. Try not to conflict with it, okay?

Every complex app has a configuration, and there are plenty of ways that configuration can be specified: environment variables, internal defaults, and configuration files, the last of which comes in many different flavors. Rust has a well-known [config](https://docs.rs/config/latest/config/index.html) crate that supports the most common formats (YAML, JSON, TOML); you can even add your own by writing something that implements the `config::Format` trait. Add it to Cargo.toml:

```sh
$ cargo add config
```

For the meantime, we're just going to create a new file, called `configuration.rs`, and put our configuration details in there. Right now we have a single configuration detail: the port. I'm going to go above and beyond Lucas here and configure some internal defaults for my code. It will have expectations.
First, you have to tell Serde that there will be default values:

```rust
use config::Config;
use serde::Deserialize;

#[derive(Deserialize)]
#[serde(default)] // fall back on Default for any missing fields
pub struct Settings {
    pub port: u16,
}
```

Then, you have to set those default values. Fortunately, Rust provides a "set default values" trait named, sensibly enough, `Default`:

```rust
impl Default for Settings {
    fn default() -> Self {
        Settings { port: 3001 }
    }
}
```

Again, exceeding the book's parameters, I'm going to say that if the file is missing, the default parameters should hold:

```rust
pub(crate) fn get_configuration() -> Result<Settings, config::ConfigError> {
    Config::builder()
        .add_source(config::File::with_name("./ztd.config").required(false))
        .build()?
        .try_deserialize()
}
```

And since this is the first time I'm doing this, I'm going to write a test to assert that my understanding of how this all works is correct:

```rust
#[cfg(test)]
mod tests {
    use super::*;

    #[test]
    fn test_for_defaults() {
        let maybe_config = get_configuration();
        assert!(maybe_config.is_ok());
        let config = maybe_config.unwrap();
        assert_eq!(config.port, 3001);
    }
}
```
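The "defaults hold when the source is missing" behavior can be mimicked without the config crate at all. A minimal stdlib-only sketch, where the `settings_from` helper is hypothetical and merely stands in for the deserializer:

```rust
struct Settings {
    port: u16,
}

impl Default for Settings {
    fn default() -> Self {
        Settings { port: 3001 }
    }
}

// Hypothetical stand-in for the config crate's deserializer: use the
// parsed port when a source is present, otherwise fall back on Default.
fn settings_from(source: Option<&str>) -> Settings {
    source
        .and_then(|s| s.parse::<u16>().ok())
        .map(|port| Settings { port })
        .unwrap_or_default()
}

fn main() {
    assert_eq!(settings_from(Some("8080")).port, 8080);
    assert_eq!(settings_from(None).port, 3001); // the default holds
    println!("default port = {}", settings_from(None).port);
}
```

`unwrap_or_default()` is the same mechanism the config crate leans on: it only compiles because `Settings` implements `Default`.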
This is straightforward and familiar territory:

```rust
fn app() -> Router {
    Router::new()
        .route("/", get(anon_greet))
        .route("/:name", get(greet))
        .route("/health_check", get(health_check))
}
```

We then define a function to *run* the core server:

```rust
pub async fn run() {
    let addr = SocketAddr::from(([127, 0, 0, 1], 3000));
    tracing::info!("listening on {}", addr);
    axum::Server::bind(&addr)
        .serve(app().into_make_service())
        .await
        .unwrap()
}
```

And finally, in a file named `src/main.rs`, we instantiate the server:

```rust
use ztp::run;

#[tokio::main]
async fn main() {
    run().await
}
```

To make this "work," we need to define what `ztp` means, and make a distinction between the library and the CLI program. In the project root's `Cargo.toml` file, the first three sections define these relationships:

```toml
[package]
name = "ztp"
version = "0.1.0"
edition = "2021"

[lib]
path = "src/lib.rs"

[[bin]]
path = "src/main.rs"
name = "ztp"
```

It is the `name` field of `[package]` that defines how the `use` statement in `main.rs` will find the library. The `[[bin]]` clause defines the name of the binary when it is generated.

<aside>The double brackets around the `[[bin]]` clause tell the TOML parser that this is an entry in an array: there can be more than one binary. There can be only one library per package, but it is possible for a Rust project to contain more than one package, each of which produces one or more "crates."</aside>

This project should now be runnable. In one window, type:

```sh
$ cargo run
```

And in another, type and see the replies:

```sh
$ curl http://localhost:3000/
Hello, World!

$ curl http://localhost:3000/Jim
He's dead, Jim!

$ curl -v http://localhost:3000/health_check
> GET /health_check HTTP/1.1
> Host: localhost:3000
> User-Agent: curl/7.81.0
> Accept: */*
< HTTP/1.1 200 OK
< content-length: 0
< date: Tue, 21 Mar 2023 00:16:43 GMT
```

In the last command, the *verbose* flag shows us what we sent to the server, and what came back.
We expected a "200 OK" status and a zero-length body, and that's what we got.

In order to unit-test a web server, we must spawn a copy of it to exercise its functions. We'll use Tokio's `spawn` function to create a new server, use hyper to request data from the server, and finally Rust's own native test asserts to check that we got what we expected.

```rust
#[cfg(test)]
mod tests {
    use super::*;
    use axum::{
        body::Body,
        http::{Request, StatusCode},
    };
    use std::net::{SocketAddr, TcpListener};

    #[tokio::test]
    async fn the_real_deal() {
        let listener =
            TcpListener::bind("127.0.0.1:0".parse::<SocketAddr>().unwrap()).unwrap();
        let addr = listener.local_addr().unwrap();

        tokio::spawn(async move {
            axum::Server::from_tcp(listener)
                .unwrap()
                .serve(app().into_make_service())
                .await
                .unwrap();
        });

        let response = hyper::Client::new()
            .request(
                Request::builder()
                    .uri(format!("http://{}/", addr))
                    .body(Body::empty())
                    .unwrap(),
            )
            .await
            .unwrap();

        let body = hyper::body::to_bytes(response.into_body()).await.unwrap();
        assert_eq!(&body[..], b"Hello World!\n");
    }
}
```

One interesting trick to observe in this test is the port number specified in the `TcpListener` call: it's zero. When the port is zero, the `TcpListener` asks the kernel for the first free port. Normally you'd want to know exactly what port to call the server on, but in this case both ends of the conversation learn the port at runtime, and we want to ensure the port isn't hard-coded and inconveniently already in use by someone else.
[[package]]
name = "valuable"
version = "0.1.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "830b7e5d4d90034032940e4ace0d9a9a057e7a45cd94e6c007832e39edb82f6d"
[[package]]
name = "vcpkg"
version = "0.2.15"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "accd4ea62f7bb7a82fe23066fb0957d48ef677f6eeb8215f372f52e48bb32426"
[[package]]
name = "version_check"
version = "0.9.4"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "49874b5167b65d7193b8aba1567f5c7d93d001cafc34600cee003eda787e483f"
[[package]]
name = "want"
version = "0.3.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "1ce8a968cb1cd110d136ff8b819a556d6fb6d919363c61534f6860c7eb172ba0"
dependencies = [
"log",
"try-lock",
]
2023-03-24 19:22:28 +00:00
[[package]]
name = "wasi"
version = "0.10.0+wasi-snapshot-preview1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "1a143597ca7c7793eff794def352d41792a93c481eb1042423ff7ff72ba2c31f"
[[package]]
name = "wasi"
version = "0.11.0+wasi-snapshot-preview1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "9c8d87e72b64a3b4db28d11ce29237c246188f4f51057d65a7eab63b7987e423"
Implemented the forms reader, config, and database migrations. This chapter introduces the Actix "Extractors" for retrieving form data. I've added tests to the `./tests` folder to attempt to interact with those extractors; as of this commit, a89cbe5b, they fail because the example code isn't there. What is there is a variant of the "Hello, World!" code from the previous exercises (section 3.5), which uses the Actix extractor: ``` rust // Actix, *not* Axum. Does not work with the current framework. fn index(form: web::Form<FormData>) -> String { format!("Welcome {}!", form.username) } ``` Translated into Axum, it becomes: ``` rust pub async fn index(payload: Option<Form<FormData>>) -> impl IntoResponse { let username = payload.map_or("World".to_string(), move |index| -> String { String::from(&(index.username)) }); (StatusCode::OK, format!("Hello, {}!\n", &username)) } ``` The Axum version is a little smarter, providing a default "World" if you don't specify a name. That's what `.map_or` does, although the `or` part actually comes first in the argument list. So the result is: ``` sh $ curl http://localhost:3000/ Hello, World! $ curl 'http://localhost:3000/?username=Spock' Hello, Spock! ``` Which is more or less the version we want. **Section 3.7.3** then goes into some detail on the implementation of a Trait. A Trait in Rust is like an Interface in other languages; it describes a collection of functions for manipulating the values found in a defined Type. **Types**: A Type is just a description of the value: `u16` is a sixteen-bit unsigned integer; `char` is a Unicode scalar value whose size is always 32 bits. A `String`, though, is a bundle of things: a growable buffer of UTF-8 bytes (not an array of `char`s!), a length, and a capacity. If the String is manipulated so that it exceeds its capacity, a larger buffer is allocated and the old contents are copied into it. 
A `Vec<String>` is a growable array of Strings; as a Type, it is considered to have a single value: whatever it contains at the moment you use it. **Trait**: A Trait defines a collection of one or more functions that can operate on a value. The truly nifty thing about Traits is that they can be implemented after the fact. By importing a Trait, and an implementation of that Trait specific to a type, into a module containing that type, you can extend the behavior of the type in a deterministic way without having to modify or inherit its code, as you would in an object-oriented language. Axum has a valuable trait, `FromRequest`. For any structure you can imagine passing from the client to the server, you can implement `FromRequest` for that object, and any content in the body of the message will be transformed into that structure. We've seen a trait before: `IntoResponse`, written as `impl IntoResponse`, is the return type of many of the functions that produce responses for our application server. That return type instructs Rust to look in the current lexical scope and determine whether an `IntoResponse` implementation exists for the value the function returns. If it does, the value can be returned, because Axum is now assured that there exists a function to convert it into something streamable and usable as an HTTP response. Fortunately for us, Axum has already implemented the extraction machinery for the common cases, including a `Form` extractor that deserializes request bodies through Serde. So in this example: ``` rust pub struct FormData { username: String, } pub async fn index(payload: Option<Form<FormData>>) -> impl IntoResponse { ... ``` a `Form` (a body using the `application/x-www-form-urlencoded` encoding) of `FormData` will automatically be converted into a `payload` object of `{ username: "Spock" }`, in this case wrapped in `Some()`. 
(Or `None`, if no form was included.) <aside>So far, there's not too much bloat in this product; with all the debugging symbols it's 60MB or so, but stripped to the bone it's only 3.1MB, tolerable for modern deployments.</aside> First, though, we must adjust our `valid_subscription` test: ``` rust let body = "name=le%20guin&email=ursula_le_guin%40gmail.com"; let response = hyper::Client::new() .request( Request::builder() .method("POST") .header("content-type", "application/x-www-form-urlencoded") .uri(format!("http://{}/subscriptions", addr)) .body(Body::from(body)) .unwrap(), ) .await .expect("Failed to execute request."); ``` Two updates from the book: first, we're sending the data via POST instead of GET. This is the correct way to do things; a GET should never (and I mean *never*) cause a change of state on the back-end. To send something new that the server will process and store, you use a POST. (To update something, or to send something to a known *and unique* URI, PUT is better.) Secondly, since we're sending a generic form-data body, we need to set the content-type header on the client so the server knows how to unpack the payload. The `%20` and `%40` sequences in the `body` are the URL-encoded space and `@` characters, respectively. I completely ignored the advice in the book and went instead with [Dbmate](https://github.com/amacneil/dbmate). Dbmate is a bit cranky: your SQL must sit directly against the 'up' and 'down' markers in the migration file, and it seems to be quite opinionated about everything being lowercase. 
That said, it was trivial to create a database with it: ``` sh $ dbmate new create_subscriptions_table ``` This will create the file `db/migrations/20230322174957_create_subscriptions_table.sql` (the timestamp will differ, obviously), and in this file you put the following, as specified in the book: ``` sql -- migrate:up create table subscriptions ( id uuid not null, primary key (id), email text not null unique, name text not null, subscribed_at timestamptz not null ); -- migrate:down drop table subscriptions; ``` To use Dbmate, you have to specify how to connect. I'm using Postgres, so let's start with creating a database and a user for it: ``` sh $ sudo -u postgres psql [sudo] password for user: ................... postgres=# create database newsletter; CREATE DATABASE postgres=# create user newsletter with encrypted password 'redacted'; CREATE USER postgres=# grant all privileges on database newsletter to newsletter; GRANT postgres=# exit ``` In your project root, create a `.env` file to specify your connection: ``` sh DATABASE_URL="postgres://newsletter:redacted@127.0.0.1:5432/newsletter?sslmode=disable" ``` The `sslmode` flag is necessary for localhost connections, as Dbmate assumes an encrypted connection by default, but we're restricting ourselves to a local connection that is, usually, safe. With the new entry in your `.env` file, you can now run a migration: ``` sh $ dbmate up Writing: ./db/schema.sql ``` Running `dbmate up` will automatically create the database for you if it doesn't already exist; `dbmate migrate` also performs migrations, but it will not create the database. 
Now you can re-connect to Postgres as the newsletter user and see what you've got: ``` sh $ psql --user newsletter -h localhost --password Password: psql (14.7 (Ubuntu 14.7-0ubuntu0.22.04.1), server 11.7 (Ubuntu 11.7-0ubuntu0.19.10.1)) newsletter=> \d List of relations Schema | Name | Type | Owner --------+-------------------+-------+------------ public | schema_migrations | table | newsletter public | subscriptions | table | newsletter (2 rows) newsletter=> \d subscriptions Table "public.subscriptions" Column | Type | Collation | Nullable | Default ---------------+--------------------------+-----------+----------+--------- id | uuid | | not null | email | text | | not null | name | text | | not null | subscribed_at | timestamp with time zone | | not null | Indexes: "subscriptions_pkey" PRIMARY KEY, btree (id) "subscriptions_email_key" UNIQUE CONSTRAINT, btree (email) ``` Note that Dbmate has allocated a table to itself, `schema_migrations`, for tracking what it's done to your system and when. Try not to conflict with it, okay? Every complex app has a configuration, and there are plenty of ways the configuration can be specified: environment variables, internal defaults, and configuration files, the last of which come in many different flavors. Rust has a well-known [config](https://docs.rs/config/latest/config/index.html) crate that supports all the most common formats: YAML, JSON, TOML; you can even add your own by writing something that implements the `config::Format` trait. Add it to Cargo.toml: ``` sh $ cargo add config ``` For now, we're just going to create a new file, called `configuration.rs`, and put our configuration details in there. Right now we have a single configuration detail: the port. I'm going to go above and beyond Lucas here and configure some internal defaults for my code. It will have expectations. 
First, you have to tell Serde that the struct is deserializable and that missing fields should fall back to defaults: ``` rust use config::Config; #[derive(serde::Deserialize)] #[serde(default)] pub struct Settings { pub port: u16, } ``` Then you have to set those default values. Fortunately, Rust provides a "set default values" trait named, sensibly enough, Default: ``` rust impl Default for Settings { fn default() -> Self { Settings { port: 3001 } } } ``` Again exceeding the book's parameters, I'm going to say that if the file is missing, the default parameters should hold: ``` rust pub(crate) fn get_configuration() -> Result<Settings, config::ConfigError> { Config::builder() .add_source(config::File::with_name("./ztd.config").required(false)) .build()? .try_deserialize() } ``` And since this is the first time I'm doing this, I'm going to write a test to assert that my understanding of how this all works is correct: ``` rust #[cfg(test)] mod tests { use super::*; #[test] fn test_for_defaults() { let maybe_config = get_configuration(); assert!(maybe_config.is_ok()); let config = maybe_config.unwrap(); assert_eq!(config.port, 3001); } } ```
2023-03-24 14:51:19 +00:00
[[package]]
name = "wasm-bindgen"
version = "0.2.84"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "31f8dcbc21f30d9b8f2ea926ecb58f6b91192c17e9d33594b3df58b2007ca53b"
dependencies = [
"cfg-if",
"wasm-bindgen-macro",
]
[[package]]
name = "wasm-bindgen-backend"
version = "0.2.84"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "95ce90fd5bcc06af55a641a86428ee4229e44e07033963a2290a8e241607ccb9"
dependencies = [
"bumpalo",
"log",
"once_cell",
"proc-macro2",
"quote",
"syn 1.0.109",
"wasm-bindgen-shared",
]
[[package]]
name = "wasm-bindgen-macro"
version = "0.2.84"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "4c21f77c0bedc37fd5dc21f897894a5ca01e7bb159884559461862ae90c0b4c5"
dependencies = [
"quote",
"wasm-bindgen-macro-support",
]
[[package]]
name = "wasm-bindgen-macro-support"
version = "0.2.84"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "2aff81306fcac3c7515ad4e177f521b5c9a15f2b08f4e32d823066102f35a5f6"
dependencies = [
"proc-macro2",
"quote",
"syn 1.0.109",
"wasm-bindgen-backend",
"wasm-bindgen-shared",
]
[[package]]
name = "wasm-bindgen-shared"
version = "0.2.84"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "0046fef7e28c3804e5e38bfa31ea2a0f73905319b677e57ebe37e49358989b5d"
[[package]]
name = "web-sys"
version = "0.3.61"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "e33b99f4b23ba3eec1a53ac264e35a755f00e966e0065077d6027c0f575b0b97"
dependencies = [
"js-sys",
"wasm-bindgen",
]
[[package]]
name = "whoami"
version = "1.4.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "2c70234412ca409cc04e864e89523cb0fc37f5e1344ebed5a3ebf4192b6b9f68"
dependencies = [
"wasm-bindgen",
"web-sys",
]
[[package]]
name = "winapi"
version = "0.3.9"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "5c839a674fcd7a98952e593242ea400abe93992746761e38641405d28b00f419"
dependencies = [
"winapi-i686-pc-windows-gnu",
"winapi-x86_64-pc-windows-gnu",
]
[[package]]
name = "winapi-i686-pc-windows-gnu"
version = "0.4.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "ac3b87c63620426dd9b991e5ce0329eff545bccbbb34f3be09ff6fb6ab51b7b6"
Implemented the forms reader, config, and database migrations. This chapter introduces the Actix "Extractors" for retrieving form data. I've added tests to the `./tests` folder to attempt to interact with those extractors; as of this commit, a89cbe5b, they fail because the example code isn't there. What is there is a variant of the "Hello, World!" code from the previous exercises (section 3.5), which uses the Actix extractor: ``` rust // Actix, *not* Axum. Does not work with the current framework. fn index(form: web::Form<FormData>) -> String { format!('Welcome {}!', form.username) } ``` Translated and polished into Axum, it translates to: ``` rust pub async fn index(payload: Option<Form<FormData>>) -> impl IntoResponse { let username = payload.map_or("World".to_string(), move |index| -> String { String::from(&(index.username)) }); (StatusCode::OK, format!("Hello, {}!\n", &username)) } ``` The Axum version is a little smarter, providing a default "World!" if you don't specify a name. That's what `.map_or` does, although the `or` part actually comes first in the function. So the result is: ``` sh $ curl http://localhost:3000/ Hello, World! $ curl 'http://localhost:3000/?username=Spock' Hello, Spock! ``` Which is more or less the version we want. **Section 3.7.3** then goes into some detail on the implementation of a Trait. A Trait in Rust is like an Interface in other languages; it describes a collection of functions for manipulating the values found in a defined Type. **Types**: A Type is just a description of the value: `u16` is a sixteen-bit unsigned integer; `char` is a Unicode character, and it's size is always 32bits, but a `String` is a bunch of things: it's an array of characters (which are not always `char`!), a length for that string and a capacity. If the String is manipulated to exceed the capacity, the array is re-allocated to a new capacity and the old array copied into it. 
A `Vec<String>` is an array of Strings; as a Type, it is considered to have a single value: whatever is in it at the moment you use it. **Trait**: A Trait defines a collection of one or more functions that can operate on a value. The truly nifty thing about Traits is that they can be implemented after the fact. By importing a Trait and an implementation of that trait specific to a type into a module containing that type, you can extend the behavior of a type in a deterministic way without having to modify or inherit the code, as you would in an object-oriented language. Axum has a valuable trait, `FromRequest`. For any structure you can imagine passing from the client to the server, you can implement `FromRequest` for that object and any content in the body of the message will be transformed into that structure. We've seen a trait before: `IntoResponse`, written as `impl IntoResponse`, and is the output (the return type) of many of the functions that produce return values for our application server. In this case the return type instructs Rust to look in the current lexical scope and, for the value returned by that function, determine if an `IntoResponse` trait has been defined for it. If it has, the value will be returned because Axum has now been assured that there exists a function to convert that value into something streamable and usable as an HTTP response. Fortunately for us, Axum has already implemented `FromRequest` for all the native data types, as well as some structures and arrays. Even better, it has implemented `FromRequest` for the Serde serialization/deserialization library. So in this example: ``` rust pub struct FormData { username: String, } pub async fn index(payload: Option<Form<FormData>>) -> impl IntoResponse { ... ``` A `Form` (something using the `application/x-www-form-urlencoded` protocol) of `FormData` will automatically be converted into a `payload` object of `{ username: "Spock" )`, and in this case wrapped in a `Some()` handler. 
(Or `None`, if there was no form included.) <aside>So far, there's not too much bloat in this product; with all the debugging symbols, it's 60MB or so, but stripped to the bone it's only 3.1MB, tolerable for modern deployments.</aside> First, though, we must adjust our `valid_subscription` test: ``` rust let body = "name=le%20guin&email=ursula_le_guin%40gmail.com"; let response = hyper::Client::new() .request( Request::builder() .method("POST") .header("content-type", "application/x-www-form-urlencoded") .uri(format!("http://{}/subscriptions", addr)) .body(Body::from(body)) .unwrap(), ) .await .expect("Failed to execute request."); ``` Two updates from the book: first, we're sending it via POST instead of GET. This is the correct way to do things; a GET should never (and I mean *never*) cause a change of state on the back-end. To send something new that the server will process and store, you use a POST. (To update something, or to send something to a known *and unique* URI, PUT is better.) Secondly, since we're using a generic form-data object, we need to set the content-type on the client so that the server is informed of how to unpack this payload. The '%20' and '%40' markers in the `body` are the space and the `@` respectively. I completely ignored the advice in the book and went instead with [Dbmate](https://github.com/amacneil/dbmate); Dbmate is a bit cranky; your SQL must be very much nestled against the 'up' and 'down' markers in the migration file, and it seems to be quite opinionated about everything being lowercase. 
That said, it was trivial to create a database with it:

``` sh
$ dbmate new create_subscriptions_table
```

This will create the file `db/migrations/20230322174957_create_subscriptions_table.sql` (the timestamp will be different, obviously), and in this file you put the following, as specified in the book:

``` sql
-- migrate:up
create table subscriptions (
    id uuid not null,
    primary key (id),
    email text not null unique,
    name text not null,
    subscribed_at timestamptz not null
);

-- migrate:down
drop table subscriptions;
```

To use Dbmate, you have to specify how to connect. I'm using Postgres, so let's start with creating a database and a user for it:

``` sh
$ sudo -u postgres psql
[sudo] password for user: ...................
postgres=# create database newsletter;
CREATE DATABASE
postgres=# create user newsletter with encrypted password 'redacted';
CREATE USER
postgres=# grant all privileges on database newsletter to newsletter;
GRANT
postgres=# exit
```

In your project root, create a `.env` file to specify your connection:

``` sh
DATABASE_URL="postgres://newsletter:redacted@127.0.0.1:5432/newsletter?sslmode=disable"
```

The `sslmode` flag is necessary for localhost connections, as Dbmate assumes an encrypted connection by default, but we're isolating to a local connection that is, usually, safe. With the new entry in your `.env` file, you can now run a migration:

``` sh
$ dbmate up
Writing: ./db/schema.sql
```

Running `dbmate up` will automatically create the database for you if it hasn't been created already; `dbmate migrate` also performs migrations, but it will not create the database.
Now you can re-connect to Postgres as the newsletter user and see what you've got:

``` sh
$ psql --user newsletter -h localhost --password
Password:
psql (14.7 (Ubuntu 14.7-0ubuntu0.22.04.1), server 11.7 (Ubuntu 11.7-0ubuntu0.19.10.1))

newsletter=> \d
              List of relations
 Schema |       Name        | Type  |   Owner
--------+-------------------+-------+------------
 public | schema_migrations | table | newsletter
 public | subscriptions     | table | newsletter
(2 rows)

newsletter=> \d subscriptions
                     Table "public.subscriptions"
    Column     |           Type           | Collation | Nullable | Default
---------------+--------------------------+-----------+----------+---------
 id            | uuid                     |           | not null |
 email         | text                     |           | not null |
 name          | text                     |           | not null |
 subscribed_at | timestamp with time zone |           | not null |
Indexes:
    "subscriptions_pkey" PRIMARY KEY, btree (id)
    "subscriptions_email_key" UNIQUE CONSTRAINT, btree (email)
```

Note that Dbmate has allocated a table to itself, `schema_migrations`, for tracking what it's done to your system and when. Try not to conflict with it, okay?

Every complex app has a configuration, and there are plenty of different ways the configuration can be specified: environment variables, internal defaults, and configuration files, the last of which come in many different flavors. Rust has a well-known [config](https://docs.rs/config/latest/config/index.html) crate that supports all the most common formats: YAML, JSON, TOML; you can even add your own by writing something that implements the `config::Format` trait. Add it to Cargo.toml:

``` sh
$ cargo add config
```

For the meantime, we're just going to create a new file, called `configuration.rs`, and put our configuration details in there. Right now we have a single configuration detail: the port. I'm going to go above and beyond Lucas here and configure some internal defaults for my code. It will have expectations.
First, you have to tell Serde that there will be default values:

``` rust
use config::Config;

#[derive(serde::Deserialize)]
#[serde(default)] // fields missing from the file fall back to Default
pub struct Settings {
    pub port: u16,
}
```

Then, you have to set those default values. Fortunately, Rust provides a "set default values" trait named, sensibly enough, Default:

``` rust
impl Default for Settings {
    fn default() -> Self {
        Settings { port: 3001 }
    }
}
```

Again, exceeding the book's parameters, I'm going to say that if the file is missing, the default parameters should hold:

``` rust
pub(crate) fn get_configuration() -> Result<Settings, config::ConfigError> {
    Config::builder()
        .add_source(config::File::with_name("./ztd.config").required(false))
        .build()?
        .try_deserialize()
}
```

And since this is the first time I'm doing this, I'm going to write a test to assert that my understanding of how this all works is correct:

``` rust
#[cfg(test)]
mod tests {
    use super::*;

    #[test]
    fn test_for_defaults() {
        let maybe_config = get_configuration();
        assert!(!maybe_config.is_err());
        let config = maybe_config.unwrap();
        assert_eq!(config.port, 3001);
    }
}
```
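For completeness, here is what an override file might look like. Assuming the config crate resolves `./ztd.config` against its supported extensions, a hypothetical `ztd.config.yaml` in the project root could be:

``` yaml
# Hypothetical ztd.config.yaml: overrides the built-in default port of 3001.
port: 3002
```

Delete the file and the `Default` implementation takes over again, which is exactly what the test above asserts.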
2023-03-24 14:51:19 +00:00
[[package]]
name = "winapi-util"
version = "0.1.5"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "70ec6ce85bb158151cae5e5c87f95a8e97d2c0c4b001223f33a334e3ce5de178"
dependencies = [
"winapi",
]
This is straightforward and familiar territory:

``` rust
fn app() -> Router {
    Router::new()
        .route("/", get(anon_greet))
        .route("/:name", get(greet))
        .route("/health_check", get(health_check))
}
```

We then define a function to *run* the core server:

``` rust
pub async fn run() {
    let addr = SocketAddr::from(([127, 0, 0, 1], 3000));
    tracing::info!("listening on {}", addr);
    axum::Server::bind(&addr)
        .serve(app().into_make_service())
        .await
        .unwrap()
}
```

And finally, in a file named `src/main.rs`, we instantiate the server:

``` rust
use ztp::run;

#[tokio::main] // async fn main needs a runtime; Tokio's macro provides it
async fn main() {
    run().await
}
```

To make this "work," we need to define what `ztp` means, and make a distinction between the library and the CLI program. In the project root's `Cargo.toml` file, the first three sections define these relationships:

``` toml
[package]
name = "ztp"
version = "0.1.0"
edition = "2021"

[lib]
path = "src/lib.rs"

[[bin]]
path = "src/main.rs"
name = "ztp"
```

It is the `[package.name]` field that defines how the `use` statement in `main.rs` will find the library. The `[[bin]]` clause defines the name of the binary when it is generated.

<aside>The double brackets around the `[[bin]]` clause are there to tell the TOML parser that there can be more than one binary. There can be only one library per package, but a Rust project can contain more than one package; the compilation units that packages produce are called "crates."</aside>

This project should now be runnable. In one window, type:

``` sh
$ cargo run
```

And in another, type and see the replies:

``` sh
$ curl http://localhost:3000/
Hello, World!
$ curl http://localhost:3000/Jim
He's dead, Jim!
$ curl -v http://localhost:3000/health_check
> GET /health_check HTTP/1.1
> Host: localhost:3000
> User-Agent: curl/7.81.0
> Accept: */*
< HTTP/1.1 200 OK
< content-length: 0
< date: Tue, 21 Mar 2023 00:16:43 GMT
```

In the last command, the *verbose* flag shows us what we sent to the server and what came back.
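As a footnote to the `[[bin]]` aside above: double brackets mark a TOML array of tables, so declaring a second binary is just a second `[[bin]]` entry. A hypothetical example (the `ztp-admin` binary is invented for illustration):

``` toml
[[bin]]
name = "ztp"
path = "src/main.rs"

[[bin]]
name = "ztp-admin"      # hypothetical second binary in the same package
path = "src/admin.rs"
```

Cargo would then build both, and `cargo run --bin ztp-admin` selects between them.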
We expected a "200 OK" status and a zero-length body, and that's what we got.

In order to unit-test a web server, we must spawn a copy of it in order to exercise its functions. We'll use Tokio's `spawn` function to create a new server, use hyper to request data from the server, and finally Rust's own native test asserts to check that we got what we expected.

``` rust
mod tests {
    use super::*;
    use axum::{
        body::Body,
        http::{Request, StatusCode},
    };
    use std::net::{SocketAddr, TcpListener};

    #[tokio::test]
    async fn the_real_deal() {
        let listener =
            TcpListener::bind("127.0.0.1:0".parse::<SocketAddr>().unwrap()).unwrap();
        let addr = listener.local_addr().unwrap();

        tokio::spawn(async move {
            axum::Server::from_tcp(listener)
                .unwrap()
                .serve(app().into_make_service())
                .await
                .unwrap();
        });

        let response = hyper::Client::new()
            .request(
                Request::builder()
                    .uri(format!("http://{}/", addr))
                    .body(Body::empty())
                    .unwrap(),
            )
            .await
            .unwrap();

        let body = hyper::body::to_bytes(response.into_body()).await.unwrap();
        assert_eq!(&body[..], b"Hello World!\n");
    }
}
```

One interesting trick to observe in this test is the port number specified in the `TcpListener` call: it's zero. When the port is zero, the `TcpListener` will ask the kernel for the first free port. Normally, you'd want to know exactly what port to call the server on, but in this case both ends of the communication learn the port at runtime, and we want to ensure that the port isn't hard-coded and inconveniently already in use by someone else.
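The port-zero trick works with nothing but the standard library, which makes it easy to convince yourself of the behavior:

``` rust
use std::net::TcpListener;

fn main() {
    // Port 0 tells the kernel: pick any free port for me.
    let listener = TcpListener::bind("127.0.0.1:0").unwrap();
    let port = listener.local_addr().unwrap().port();

    // The kernel substituted a real ephemeral port.
    assert_ne!(port, 0);
    println!("kernel chose port {}", port);
}
```

This is why the test consults `local_addr()` after binding: it's the only way to learn which port was actually assigned.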
2023-03-21 00:31:39 +00:00
[[package]]
name = "winapi-x86_64-pc-windows-gnu"
version = "0.4.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "712e227841d057c1ee1cd2fb22fa7e5a5461ae8e48fa2ca79ec42cfc1931183f"
Implemented the forms reader, config, and database migrations.

This chapter introduces the Actix "Extractors" for retrieving form data. I've added tests to the `./tests` folder to attempt to interact with those extractors; as of this commit, a89cbe5b, they fail because the example code isn't there. What is there is a variant of the "Hello, World!" code from the previous exercises (section 3.5), which uses the Actix extractor:

``` rust
// Actix, *not* Axum. Does not work with the current framework.
fn index(form: web::Form<FormData>) -> String {
    format!("Welcome {}!", form.username)
}
```

Translated and polished into Axum, it becomes:

``` rust
pub async fn index(payload: Option<Form<FormData>>) -> impl IntoResponse {
    let username = payload.map_or("World".to_string(), move |index| -> String {
        String::from(&(index.username))
    });
    (StatusCode::OK, format!("Hello, {}!\n", &username))
}
```

The Axum version is a little smarter, providing a default "World" if you don't specify a name. That's what `.map_or` does, although the `or` part actually comes first in the argument list. So the result is:

``` sh
$ curl http://localhost:3000/
Hello, World!
$ curl 'http://localhost:3000/?username=Spock'
Hello, Spock!
```

Which is more or less the version we want.

**Section 3.7.3** then goes into some detail on the implementation of a Trait. A Trait in Rust is like an Interface in other languages; it describes a collection of functions for manipulating the values found in a defined Type.

**Types**: A Type is just a description of the value: `u16` is a sixteen-bit unsigned integer; `char` is a Unicode character, and its size is always 32 bits; but a `String` is a bunch of things: a buffer of UTF-8-encoded bytes (which are not `char`s!), a length for that string, and a capacity. If the String is manipulated to exceed the capacity, the buffer is re-allocated with a new capacity and the old contents copied into it.
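The `map_or` default-comes-first argument order is worth seeing in isolation; this sketch stands alone, with no Axum involved:

``` rust
fn main() {
    // map_or(default, f): the default is the first argument,
    // the mapping closure the second.
    let anonymous: Option<&str> = None;
    assert_eq!(
        anonymous.map_or("World".to_string(), |name| name.to_string()),
        "World"
    );

    let named = Some("Spock");
    assert_eq!(
        named.map_or("World".to_string(), |name| name.to_string()),
        "Spock"
    );
}
```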
[[package]]
name = "windows"
version = "0.46.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "cdacb41e6a96a052c6cb63a144f24900236121c6f63f4f8219fef5977ecb0c25"
dependencies = [
"windows-targets",
]
[[package]]
name = "windows-sys"
version = "0.42.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "5a3e1820f08b8513f676f7ab6c1f99ff312fb97b553d30ff4dd86f9f15728aa7"
dependencies = [
"windows_aarch64_gnullvm",
"windows_aarch64_msvc",
"windows_i686_gnu",
"windows_i686_msvc",
"windows_x86_64_gnu",
"windows_x86_64_gnullvm",
"windows_x86_64_msvc",
]
[[package]]
name = "windows-sys"
version = "0.45.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "75283be5efb2831d37ea142365f009c02ec203cd29a3ebecbc093d52315b66d0"
dependencies = [
"windows-targets",
]
[[package]]
name = "windows-targets"
version = "0.42.2"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "8e5180c00cd44c9b1c88adb3693291f1cd93605ded80c250a75d472756b4d071"
dependencies = [
"windows_aarch64_gnullvm",
"windows_aarch64_msvc",
"windows_i686_gnu",
"windows_i686_msvc",
"windows_x86_64_gnu",
"windows_x86_64_gnullvm",
"windows_x86_64_msvc",
]
[[package]]
name = "windows_aarch64_gnullvm"
version = "0.42.2"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "597a5118570b68bc08d8d59125332c54f1ba9d9adeedeef5b99b02ba2b0698f8"
[[package]]
name = "windows_aarch64_msvc"
version = "0.42.2"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "e08e8864a60f06ef0d0ff4ba04124db8b0fb3be5776a5cd47641e942e58c4d43"
[[package]]
name = "windows_i686_gnu"
version = "0.42.2"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "c61d927d8da41da96a81f029489353e68739737d3beca43145c8afec9a31a84f"
[[package]]
name = "windows_i686_msvc"
version = "0.42.2"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "44d840b6ec649f480a41c8d80f9c65108b92d89345dd94027bfe06ac444d1060"
[[package]]
name = "windows_x86_64_gnu"
version = "0.42.2"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "8de912b8b8feb55c064867cf047dda097f92d51efad5b491dfb98f6bbb70cb36"
[[package]]
name = "windows_x86_64_gnullvm"
version = "0.42.2"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "26d41b46a36d453748aedef1486d5c7a85db22e56aff34643984ea85514e94a3"
[[package]]
name = "windows_x86_64_msvc"
version = "0.42.2"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "9aec5da331524158c6d1a4ac0ab1541149c0b9505fde06423b02f5ef0106b9f0"
Implemented the forms reader, config, and database migrations. This chapter introduces the Actix "Extractors" for retrieving form data. I've added tests to the `./tests` folder to attempt to interact with those extractors; as of this commit, a89cbe5b, they fail because the example code isn't there. What is there is a variant of the "Hello, World!" code from the previous exercises (section 3.5), which uses the Actix extractor: ``` rust // Actix, *not* Axum. Does not work with the current framework. fn index(form: web::Form<FormData>) -> String { format!("Welcome {}!", form.username) } ``` Translated into Axum, it becomes: ``` rust pub async fn index(payload: Option<Form<FormData>>) -> impl IntoResponse { let username = payload.map_or("World".to_string(), move |index| -> String { String::from(&(index.username)) }); (StatusCode::OK, format!("Hello, {}!\n", &username)) } ``` The Axum version is a little smarter, providing a default "World" if you don't specify a name. That's what `.map_or` does, although the `or` part actually comes first in the argument list. So the result is: ``` sh $ curl http://localhost:3000/ Hello, World! $ curl 'http://localhost:3000/?username=Spock' Hello, Spock! ``` Which is more or less the version we want. **Section 3.7.3** then goes into some detail on the implementation of a Trait. A Trait in Rust is like an Interface in other languages; it describes a collection of functions for manipulating the values found in a defined Type. **Types**: A Type is just a description of the value: `u16` is a sixteen-bit unsigned integer; `char` is a Unicode character, and its size is always 32 bits; but a `String` is a bunch of things: an array of bytes (which do not map one-to-one to `char`s!), a length for that string, and a capacity. If the String is manipulated to exceed the capacity, a larger array is allocated and the old contents are copied into it. 
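That re-allocation behaviour is easy to observe directly; a small sketch (the exact growth factor is an implementation detail, so we only assert that the capacity grew):

```rust
fn main() {
    let mut s = String::with_capacity(4);
    let initial = s.capacity();
    // Push one more byte than the buffer can hold...
    for _ in 0..=initial {
        s.push('x');
    }
    // ...and the String has re-allocated to a larger buffer.
    assert!(s.capacity() > initial);
    println!("grew from {} to {}", initial, s.capacity());
}
```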
A `Vec<String>` is an array of Strings; as a Type, it is considered to have a single value: whatever is in it at the moment you use it. **Trait**: A Trait defines a collection of one or more functions that can operate on a value. The truly nifty thing about Traits is that they can be implemented after the fact. By importing a Trait and an implementation of that trait specific to a type into a module containing that type, you can extend the behavior of a type in a deterministic way without having to modify or inherit the code, as you would in an object-oriented language. Axum has a valuable trait, `FromRequest`. For any structure you can imagine passing from the client to the server, you can implement `FromRequest` for that object, and any content in the body of the message will be transformed into that structure. We've seen a trait before: `IntoResponse`, written as `impl IntoResponse`, is the output (the return type) of many of the functions that produce return values for our application server. In this case the return type instructs Rust to look in the current lexical scope and determine whether an `IntoResponse` implementation exists for the value returned by the function. If it does, the value can be returned, because Axum is now assured that there exists a function to convert that value into something streamable and usable as an HTTP response. Fortunately for us, Axum has already implemented `FromRequest` for all the native data types, as well as some structures and arrays. Even better, it has implemented `FromRequest` for the Serde serialization/deserialization library. So in this example: ``` rust #[derive(serde::Deserialize)] pub struct FormData { username: String, } pub async fn index(payload: Option<Form<FormData>>) -> impl IntoResponse { ... ``` A `Form` (a request body using the `application/x-www-form-urlencoded` encoding) of `FormData` will automatically be converted into a `payload` object of `{ username: "Spock" }`, wrapped in `Some()`. 
(Or `None`, if there was no form included.) <aside>So far, there's not too much bloat in this product; with all the debugging symbols, it's 60MB or so, but stripped to the bone it's only 3.1MB, tolerable for modern deployments.</aside> First, though, we must adjust our `valid_subscription` test: ``` rust let body = "name=le%20guin&email=ursula_le_guin%40gmail.com"; let response = hyper::Client::new() .request( Request::builder() .method("POST") .header("content-type", "application/x-www-form-urlencoded") .uri(format!("http://{}/subscriptions", addr)) .body(Body::from(body)) .unwrap(), ) .await .expect("Failed to execute request."); ``` Two updates from the book: first, we're sending the request via POST instead of GET. This is the correct way to do things; a GET should never (and I mean *never*) cause a change of state on the back-end. To send something new that the server will process and store, you use a POST. (To update something, or to send something to a known *and unique* URI, PUT is better.) Secondly, since we're using a generic form-data object, we need to set the content-type header on the client so that the server knows how to unpack this payload. The `%20` and `%40` escapes in the `body` are the URL-encoded space and `@`, respectively. I completely ignored the advice in the book and went instead with [Dbmate](https://github.com/amacneil/dbmate). Dbmate is a bit cranky: your SQL must sit flush against the 'up' and 'down' markers in the migration file, and it seems to be quite opinionated about everything being lowercase. 
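About those escapes: the substitution is mechanical enough to sketch by hand. This toy `decode` helper is hypothetical and handles only the two escapes used in this body, not general URL decoding:

```rust
// Hypothetical helper: only %20 and %40, for illustration.
fn decode(s: &str) -> String {
    s.replace("%20", " ").replace("%40", "@")
}

fn main() {
    let body = "name=le%20guin&email=ursula_le_guin%40gmail.com";
    assert_eq!(decode(body), "name=le guin&email=ursula_le_guin@gmail.com");
    println!("{}", decode(body));
}
```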
That said, it was trivial to create a database with it: ``` sh $ dbmate new create_subscriptions_table ``` This will create the file `db/migrations/20230322174957_create_subscriptions_table.sql` (the timestamp will be different, obviously), and in this file you put the following, as specified in the book: ``` sql -- migrate:up create table subscriptions ( id uuid not null, primary key (id), email text not null unique, name text not null, subscribed_at timestamptz not null ); -- migrate:down drop table subscriptions; ``` To use Dbmate, you have to specify how to connect. I'm using Postgres, so let's start with creating a database and a user for it: ``` sh $ sudo -u postgres psql [sudo] password for user: ................... postgres=# create database newsletter; CREATE DATABASE postgres=# create user newsletter with encrypted password 'redacted'; CREATE USER postgres=# grant all privileges on database newsletter to newsletter; GRANT postgres=# exit ``` In your project root, create a `.env` file to specify your connection: ``` sh DATABASE_URL="postgres://newsletter:redacted@127.0.0.1:5432/newsletter?sslmode=disable" ``` The `sslmode` flag there is necessary for localhost connections, as Dbmate assumes an encrypted connection by default, but we're restricting ourselves to a local connection that is, usually, safe. With the new entry in your `.env` file, you can now run a migration: ``` sh $ dbmate up Writing: ./db/schema.sql ``` Running `dbmate up` will automatically create the database for you if it doesn't already exist; `dbmate migrate` also performs migrations, but it will not create the database. 
Now you can re-connect to Postgres as the newsletter user and see what you've got: ``` sh $ psql --user newsletter -h localhost --password Password: psql (14.7 (Ubuntu 14.7-0ubuntu0.22.04.1), server 11.7 (Ubuntu 11.7-0ubuntu0.19.10.1)) newsletter=> \d List of relations Schema | Name | Type | Owner --------+-------------------+-------+------------ public | schema_migrations | table | newsletter public | subscriptions | table | newsletter (2 rows) newsletter=> \d subscriptions Table "public.subscriptions" Column | Type | Collation | Nullable | Default ---------------+--------------------------+-----------+----------+--------- id | uuid | | not null | email | text | | not null | name | text | | not null | subscribed_at | timestamp with time zone | | not null | Indexes: "subscriptions_pkey" PRIMARY KEY, btree (id) "subscriptions_email_key" UNIQUE CONSTRAINT, btree (email) ``` Note that Dbmate has allocated a table to itself, `schema_migrations`, for tracking what it's done to your system and when. Try not to conflict with it, okay? Every complex app has a configuration, and there are plenty of different ways the configuration can be specified: environment variables, internal defaults, and configuration files, the last of which comes in many different flavors. Rust has a well-known [config](https://docs.rs/config/latest/config/index.html) crate that supports all the most common formats: YAML, JSON, TOML; you can even add your own by writing something that implements the `config::Format` trait. Add it to Cargo.toml: ``` sh $ cargo add config ``` For the time being, we're just going to create a new file, called `configuration.rs`, and put our configuration details in there. Right now we have a single configuration detail: the port. I'm going to go above and beyond Luca here and configure some internal defaults for my code. It will have expectations. 
First, you have to tell Serde that there are default values: ``` rust use config::Config; #[derive(serde::Deserialize)] #[serde(default)] pub struct Settings { pub port: u16, } ``` Then, you have to set those default values. Fortunately, Rust provides a "set default values" trait named, sensibly enough, Default: ``` rust impl Default for Settings { fn default() -> Self { Settings { port: 3001 } } } ``` Again, exceeding the book's parameters, I'm going to say that if the file is missing, the default parameters should hold: ``` rust pub(crate) fn get_configuration() -> Result<Settings, config::ConfigError> { Config::builder() .add_source(config::File::with_name("./ztd.config").required(false)) .build()? .try_deserialize() } ``` And since this is the first time I'm doing this, I'm going to write a test to assert that my understanding of how this all works is correct: ``` rust #[cfg(test)] mod tests { use super::*; #[test] fn test_for_defaults() { let maybe_config = get_configuration(); assert!(maybe_config.is_ok()); let config = maybe_config.unwrap(); assert_eq!(config.port, 3001); } } ```
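The fallback behaviour that test pins down can be sketched without the config crate at all; `None` stands in for a missing `ztd.config` file, and the `load` helper is illustrative, not the real loader:

```rust
#[derive(Debug, PartialEq)]
struct Settings {
    port: u16,
}

impl Default for Settings {
    fn default() -> Self {
        Settings { port: 3001 }
    }
}

// Hypothetical loader: use the file's settings when present,
// otherwise fall back to the Default implementation.
fn load(from_file: Option<Settings>) -> Settings {
    from_file.unwrap_or_default()
}

fn main() {
    assert_eq!(load(None).port, 3001);
    assert_eq!(load(Some(Settings { port: 8080 })).port, 8080);
    println!("default port: {}", load(None).port);
}
```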
2023-03-24 14:51:19 +00:00
[[package]]
name = "yaml-rust"
version = "0.4.5"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "56c1936c4cc7a1c9ab21a1ebb602eb942ba868cbd44a99cb7cdc5892335e1c85"
dependencies = [
"linked-hash-map",
]
This is straightforward and familiar territory: ``` fn app() -> Router { Router::new() .route("/", get(anon_greet)) .route("/:name", get(greet)) .route("/health_check", get(health_check)) } ``` We then define a function to *run* the core server: ``` pub async fn run() { let addr = SocketAddr::from(([127, 0, 0, 1], 3000)); tracing::info!("listening on {}", addr); axum::Server::bind(&addr) .serve(app().into_make_service()) .await .unwrap() } ``` And finally, in a file named `src/main.rs`, we instantiate the server: ``` use ztp::run; #[tokio::main] async fn main() { run().await } ``` To make this "work," we need to define what `ztp` means, and make a distinction between the library and the CLI program. In the project root's `Cargo.toml` file, the first three sections are needed to define these relationships: ``` [package] name = "ztp" version = "0.1.0" edition = "2021" [lib] path = "src/lib.rs" [[bin]] path = "src/main.rs" name = "ztp" ``` It is the `name` key of the `[package]` section that defines how the `use` statement in `main.rs` will find the library. The `[[bin]]` clause defines the name of the binary when it is generated. <aside>The double brackets around the `[[bin]]` clause are there to tell the TOML parser that there can be more than one binary. There can be only one library per package, but a Rust project can contain more than one package; packages are also called "crates." </aside> This project should now be runnable. In one window, type: ``` $ cargo run ``` And in another, type and see the replies: ``` $ curl http://localhost:3000/ Hello, World! $ curl http://localhost:3000/Jim He's dead, Jim! $ curl -v http://localhost:3000/health_check > GET /health_check HTTP/1.1 > Host: localhost:3000 > User-Agent: curl/7.81.0 > Accept: */* < HTTP/1.1 200 OK < content-length: 0 < date: Tue, 21 Mar 2023 00:16:43 GMT ``` In the last command, the *verbose* flag shows us what we sent to the server, and what came back. 
[[package]]
name = "ztp"
version = "0.1.0"
dependencies = [
"axum",
2023-03-24 19:22:28 +00:00
"chrono",
"config",
Zero-to-Production Rust, up to Chapter 3.7. Since this book is about learning Rust, primarily in a microservices environment, this chapter focuses on installing Rust and describing the tools available to the developer. The easiest way to install Rust is to install the [Rustup](https://rustup.rs/) tool. It is one of those blind-trust-in-the-safety-of-the-toolchain things. For Linux and Mac users, the command is a shell script that installs to a user's local account: ``` $ curl --proto '=https' --tlsv1.2 -sSf https://sh.rustup.rs | sh ``` Once installed, you can install Rust itself: ``` $ rustup install toolchain stable ``` You should now have Rust compiler and the Rust build and packaging tool, known as Cargo: ``` $ rustc --version rustc 1.68.0 (2c8cc3432 2023-03-06) $ cargo --version cargo 1.68.0 (115f34552 2023-02-26) ``` I also installed the following tools: ``` $ rustup component add clippy rust-src rust-docs $ cargo install rustfmt rust-analyzer ``` - clippy: A powerful linter that provides useful advice above and beyond the compiler's basic error checking. - rustfmt: A formatting tool that provides a common format for most developers - rust-analyzer: For your IDE, rust-analyzer provides the LSP (Language Server Protocol) for Rust, giving you code completion, on-the-fly error definition, and other luxuries. Zero-to-Production's project is writing a web service that signs people up for an email newsletter. The first task in the book is to set up a "Hello World!" application server. The book uses the [Actix-web](https://actix.rs/) web framework, but I've chosen to implement it using [Axum](https://github.com/tokio-rs/axum) server, the default server provided by the [Tokio](https://github.com/tokio-rs/tokio) asynchronous runtime. Although the book is only two years old, it is already out-of-date with respect to some commands. `cargo add` is now provided by default. 
The following commands installed the tools I'll be using: ``` cargo add --features tokio/full --features hyper/full tokio hyper \ axum tower tracing tracing-subscriber ``` - axum: The web server framework for Tokio. - tokio: The Rust asynchronous runtime. Has single-threaded (select) and multi-threaded variants. - [hyper](https://hyper.rs/): An HTTPS request/response library, used for testing. - [tracing](https://crates.io/crates/tracing): A debugging library that works with Tokio. We start by defining the core services. In the book, they're a greeter ("Hello, World"), a greeter with a parameter ("Hello, {name}"), and a health check (returns a HTTP 200 Code, but no body). Actix-web hands a generic Request and expects a generic request, but Axum is more straightforward, providing `IntoResponse` handlers for most of the basic Rust types, as well as some for formats via Serde, Rust's standard serializing/deserializing library for converting data from one format to another. All of these go into `src/lib.rs`: ``` async fn health_check() -> impl IntoResponse { (StatusCode::OK, ()) } async fn anon_greet() -> &'static str { "Hello World!\n" } async fn greet(Path(name): Path<String>) -> impl IntoResponse { let greeting = String::from("He's dead, ") + name.as_str(); let greeting = greeting + &String::from("!\n"); (StatusCode::OK, greeting) } ``` <aside>Axum's documentation says to [avoid using `impl IntoResponse`](https://docs.rs/axum/latest/axum/response/index.html#regarding-impl-intoresponse) until you understand how it really works, as it can result in confusing issues when chaining response handlers, when a handler can return multiple types, or when a handler can return either a type or a [`Result<T, E>`](https://doc.rust-lang.org/std/result/), especially one with an error.</aside> We then define the routes that our server will recognize. 
This is straightforward and familiar territory: ``` fn app() -> Router { Router::new() .route("/", get(anon_greet)) .route("/:name", get(greet)) .route("/health_check", get(health_check)) } ``` We then define a function to *run* the core server: ``` pub async fn run() { let addr = SocketAddr::from(([127, 0, 0, 1], 3000)); tracing::info!("listening on {}", addr); axum::Server::bind(&addr) .serve(app().into_make_service()) .await .unwrap() } ``` And finally, in a file named `src/main.rs`, we instantiate the server: ``` use ztp::run; async fn main() { run().await } ``` To make this "work," we need to define what `ztp` means, and make a distinction between the library and the CLI program. In the project root's `Cargo.toml` file, the first three sections are needed to define these relationships: ``` [package] name = "ztp" version = "0.1.0" edition = "2021" [lib] path = "src/lib.rs" [[bin]] path = "src/main.rs" name = "ztp" ``` It is the `[package.name]` feature that defines how the `use` statement in `main.rs` will find the library. The `[[bin]]` clause defines the name of the binary when it is generated. <aside>The double brackets around the `[[bin]]` clauses is there to emphasize to the TOML parser that there can be more than one binary. There can be only one library per package, but it is possible for a Rust project to have more than one package, called "crates," per project. </aside> This project should now be runnable. In one window, type: ``` $ cargo run ``` And in another, type and see the replies: ``` $ curl http://localhost:3000/ Hello, World! $ curl http://localhost:3000/Jim He's dead, Jim! $ curl -v http://localhost:3000/health_check > GET /health_check HTTP/1.1 > Host: localhost:3000 > User-Agent: curl/7.81.0 > Accept: */* < HTTP/1.1 200 OK < content-length: 0 < date: Tue, 21 Mar 2023 00:16:43 GMT ``` In the last command, the *verbose* flag shows us what we sent to the server, and what came back. 
We expected a "200 OK" status and a zero-length body, and that's what we got.

To unit-test a web server, we must spawn a copy of it so we can exercise its functions. We'll use Tokio's `spawn` function to create a new server, use hyper to request data from the server, and finally Rust's own native test asserts to check that we got what we expected.

```
mod tests {
    use super::*;
    use axum::{
        body::Body,
        http::{Request, StatusCode},
    };
    use std::net::{SocketAddr, TcpListener};

    #[tokio::test]
    async fn the_real_deal() {
        let listener = TcpListener::bind("127.0.0.1:0".parse::<SocketAddr>().unwrap()).unwrap();
        let addr = listener.local_addr().unwrap();

        tokio::spawn(async move {
            axum::Server::from_tcp(listener)
                .unwrap()
                .serve(app().into_make_service())
                .await
                .unwrap();
        });

        let response = hyper::Client::new()
            .request(
                Request::builder()
                    .uri(format!("http://{}/", addr))
                    .body(Body::empty())
                    .unwrap(),
            )
            .await
            .unwrap();

        let body = hyper::body::to_bytes(response.into_body()).await.unwrap();
        assert_eq!(&body[..], b"Hello World!\n");
    }
}
```

One interesting trick to observe in this test is the port number specified in the `TcpListener` call: it's zero. When the port is zero, the `TcpListener` will request the first free port from the kernel. Normally you'd want to know exactly what port to call the server on, but in this case both ends of the communication share the address, and we want to ensure the port isn't hard-coded and inconveniently already in use by someone else.
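The port-zero trick uses nothing beyond the standard library, so it can be watched in isolation; a minimal sketch:

```rust
use std::net::TcpListener;

fn main() {
    // Port 0 asks the kernel to pick the first free port.
    let listener = TcpListener::bind("127.0.0.1:0").unwrap();
    let addr = listener.local_addr().unwrap();

    // The kernel never hands back port 0 itself; it substitutes a real one.
    assert_ne!(addr.port(), 0);

    // While the first listener is alive, a second bind to port 0
    // yields a different free port: no accidental collisions.
    let listener2 = TcpListener::bind("127.0.0.1:0").unwrap();
    assert_ne!(listener2.local_addr().unwrap().port(), addr.port());

    println!("kernel assigned ports {} and {}", addr.port(),
        listener2.local_addr().unwrap().port());
}
```

This is exactly why parallel tests can each spawn their own server without coordinating port numbers.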
2023-03-21 00:31:39 +00:00
Pre-commit checks and test refactorings.

Re-reading the text, I made a number of changes. The first is that, while it is nice that Rust allows us to have unit tests in the file whose functionality we're testing, it's also nice to have the tests somewhere separate, and to have the tests be a little more modular.

In the `./tests` folder, you can now see the same `health_check` test as the original, but in an isolated and cleaned-up form. Most importantly, the server startup code is now in its own function, with a correct return type that includes a handle to the spawned thread and the address on which that server is listening; tests can be run in parallel on many different ports and a lot of code duplication is eliminated.

``` rust
type NullHandle = JoinHandle<()>;

async fn spawn_server() -> (SocketAddr, NullHandle) {
    let listener = TcpListener::bind("127.0.0.1:0".parse::<SocketAddr>().unwrap()).unwrap();
    let addr = listener.local_addr().unwrap();

    let handle: NullHandle = tokio::spawn(async move {
        axum::Server::from_tcp(listener)
            .unwrap()
            .serve(app().into_make_service())
            .await
            .unwrap();
    });

    (addr, handle)
}
```

It is also possible now to add new tests in a straightforward manner. The Hyper API is not that much different from the Actix request API, and the Axum extractors seem to be straightforward. I suspect that what I'm looking at here with the handle is the idea that, when it goes out of scope, it calls a destructor that shuts the spawned server down.

In the introduction I said I was going to be neglecting CI/CD, since I'm a solo developer. That's true, but I do like my guardrails. I like not being able to commit garbage to the repository. So I'm going to add some checks, using [Pre-Commit](https://pre-commit.com/). Pre-Commit is a Python program, so we'll start by installing it. I'm using a local Python environment kickstarted with [Pyenv](https://github.com/pyenv/pyenv).
``` sh
$ pip install pre-commit
```

And inside your project, in the project root, you hook it up with the following commands:

``` sh
$ pre-commit install
$ pre-commit sample-config > .pre-commit-config.yaml
```

I'm going with the defaults from the Rust pre-commit collection, so my `.pre-commit-config.yaml` file looks like this:

``` yaml
repos:
  - repo: https://github.com/pre-commit/pre-commit-hooks
    rev: v3.1.0
    hooks:
      - id: check-byte-order-marker
      - id: check-case-conflict
      - id: check-merge-conflict
      - id: check-symlinks
      - id: check-yaml
      - id: end-of-file-fixer
      - id: mixed-line-ending
      - id: trailing-whitespace
  - repo: https://github.com/pre-commit/pre-commit
    rev: v2.5.1
    hooks:
      - id: validate_manifest
  - repo: https://github.com/doublify/pre-commit-rust
    rev: master
    hooks:
      - id: fmt
      - id: cargo-check
      - id: clippy
```

... and with that, every time I try to commit my code, it will not let me until these checks pass. And I *like* that level of discipline. This is low-level validation; it won't catch it if I put addition where I meant subtraction, or if I have a comparison going in the wrong direction, but at least the basics are handled and, more importantly, the formatting and styling are consistent throughout all of my code.
2023-03-22 00:52:44 +00:00
Implemented the forms reader, config, and database migrations.

This chapter introduces the Actix "Extractors" for retrieving form data. I've added tests to the `./tests` folder to attempt to interact with those extractors; as of this commit, a89cbe5b, they fail because the example code isn't there. What is there is a variant of the "Hello, World!" code from the previous exercises (section 3.5), which uses the Actix extractor:

``` rust
// Actix, *not* Axum. Does not work with the current framework.
fn index(form: web::Form<FormData>) -> String {
    format!("Welcome {}!", form.username)
}
```

Translated into Axum, it becomes:

``` rust
pub async fn index(payload: Option<Form<FormData>>) -> impl IntoResponse {
    let username = payload.map_or("World".to_string(), move |index| -> String {
        String::from(&(index.username))
    });
    (StatusCode::OK, format!("Hello, {}!\n", &username))
}
```

The Axum version is a little smarter, providing a default "World" if you don't specify a name. That's what `.map_or` does, although the `or` part actually comes first in the function. So the result is:

``` sh
$ curl http://localhost:3000/
Hello, World!
$ curl 'http://localhost:3000/?username=Spock'
Hello, Spock!
```

Which is more or less the version we want.

**Section 3.7.3** then goes into some detail on the implementation of a Trait. A Trait in Rust is like an Interface in other languages; it describes a collection of functions for manipulating the values found in a defined Type.

**Types**: A Type is just a description of the value: `u16` is a sixteen-bit unsigned integer; `char` is a Unicode character, and its size is always 32 bits; but a `String` is a bunch of things: an array of UTF-8 bytes (its elements are not `char`s!), a length for that string, and a capacity. If the String is manipulated to exceed the capacity, the array is re-allocated to a new capacity and the old array copied into it.
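That reallocation behavior can be watched directly; a std-only sketch (the exact growth policy is an implementation detail, so the assertions only check lower bounds):

```rust
fn main() {
    // Ask for a deliberately tiny buffer.
    let mut s = String::with_capacity(4);
    let initial = s.capacity();
    assert!(initial >= 4); // at least what we asked for

    // Pushing past the capacity forces a reallocation and a copy.
    s.push_str("Hello, world!");
    assert_eq!(s.len(), 13);
    assert!(s.capacity() >= 13); // the buffer grew to hold the new bytes

    // A `char` is always 4 bytes, but inside a String each character
    // is stored as 1-4 UTF-8 bytes, which is why len() counts bytes.
    assert_eq!(std::mem::size_of::<char>(), 4);
    assert_eq!("é".len(), 2); // one char, two UTF-8 bytes
}
```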
A `Vec<String>` is an array of Strings; as a Type, it is considered to have a single value: whatever is in it at the moment you use it.

**Trait**: A Trait defines a collection of one or more functions that can operate on a value. The truly nifty thing about Traits is that they can be implemented after the fact. By importing a Trait and an implementation of that trait specific to a type into a module containing that type, you can extend the behavior of a type in a deterministic way without having to modify or inherit the code, as you would in an object-oriented language.

Axum has a valuable trait, `FromRequest`. For any structure you can imagine passing from the client to the server, you can implement `FromRequest` for that object, and any content in the body of the message will be transformed into that structure.

We've seen a trait before: `IntoResponse`, written as `impl IntoResponse`, which is the output (the return type) of many of the functions that produce return values for our application server. In this case the return type instructs Rust to look in the current lexical scope and, for the value returned by that function, determine if an `IntoResponse` trait has been defined for it. If it has, the value will be returned, because Axum has now been assured that there exists a function to convert that value into something streamable and usable as an HTTP response.

Fortunately for us, Axum has already implemented `FromRequest` for all the native data types, as well as some structures and arrays. Even better, it has implemented `FromRequest` for the Serde serialization/deserialization library. So in this example:

``` rust
#[derive(serde::Deserialize)] // Serde derives the deserializer for us
pub struct FormData {
    username: String,
}

pub async fn index(payload: Option<Form<FormData>>) -> impl IntoResponse {
    ...
```

A `Form` (something using the `application/x-www-form-urlencoded` protocol) of `FormData` will automatically be converted into a `payload` object of `{ username: "Spock" }`, and in this case wrapped in `Some()`.
(Or `None`, if there was no form included.)

<aside>So far, there's not too much bloat in this product; with all the debugging symbols it's 60MB or so, but stripped to the bone it's only 3.1MB, tolerable for modern deployments.</aside>

First, though, we must adjust our `valid_subscription` test:

``` rust
let body = "name=le%20guin&email=ursula_le_guin%40gmail.com";
let response = hyper::Client::new()
    .request(
        Request::builder()
            .method("POST")
            .header("content-type", "application/x-www-form-urlencoded")
            .uri(format!("http://{}/subscriptions", addr))
            .body(Body::from(body))
            .unwrap(),
    )
    .await
    .expect("Failed to execute request.");
```

Two updates from the book: first, we're sending it via POST instead of GET. This is the correct way to do things; a GET should never (and I mean *never*) cause a change of state on the back-end. To send something new that the server will process and store, you use a POST. (To update something, or to send something to a known *and unique* URI, PUT is better.) Secondly, since we're using a generic form-data object, we need to set the content-type on the client so that the server is informed of how to unpack this payload. The `%20` and `%40` markers in the `body` are the URL-encoded space and `@`, respectively.

I completely ignored the advice in the book and went instead with [Dbmate](https://github.com/amacneil/dbmate). Dbmate is a bit cranky: your SQL must be nestled right up against the `up` and `down` markers in the migration file, and it seems to be quite opinionated about everything being lowercase.
That said, it was trivial to create a database with it:

``` sh
$ dbmate new create_subscriptions_table
```

This will create the file `db/migrations/20230322174957_create_subscriptions_table.sql` (the timestamp will be different, obviously), and in this file you put the following, as specified in the book:

``` sql
-- migrate:up
create table subscriptions (
    id uuid not null,
    primary key (id),
    email text not null unique,
    name text not null,
    subscribed_at timestamptz not null
);

-- migrate:down
drop table subscriptions;
```

To use Dbmate, you have to specify how to connect. I'm using Postgres, so let's start with creating a database and a user for it:

``` sh
$ sudo -u postgres psql
[sudo] password for user: ...................
postgres=# create database newsletter;
CREATE DATABASE
postgres=# create user newsletter with encrypted password 'redacted';
CREATE USER
postgres=# grant all privileges on database newsletter to newsletter;
GRANT
postgres=# exit
```

In your project root, create a `.env` file to specify your connection:

``` sh
DATABASE_URL="postgres://newsletter:redacted@127.0.0.1:5432/newsletter?sslmode=disable"
```

The `sslmode` flag there is necessary for localhost connections, as Dbmate assumes an encrypted connection by default, but we're isolating to a local connection that is, usually, safe. With the new entry in your `.env` file, you can now run a migration:

``` sh
$ dbmate up
Writing: ./db/schema.sql
```

Running `dbmate up` will automatically create the database for you if it hasn't already; `dbmate migrate` also performs migrations, but it will not create the database.
Now you can re-connect to Postgres as the newsletter user and see what you've got:

``` sh
$ psql --user newsletter -h localhost --password
Password:
psql (14.7 (Ubuntu 14.7-0ubuntu0.22.04.1), server 11.7 (Ubuntu 11.7-0ubuntu0.19.10.1))

newsletter=> \d
              List of relations
 Schema |       Name        | Type  |   Owner
--------+-------------------+-------+------------
 public | schema_migrations | table | newsletter
 public | subscriptions     | table | newsletter
(2 rows)

newsletter=> \d subscriptions
                     Table "public.subscriptions"
    Column     |           Type           | Collation | Nullable | Default
---------------+--------------------------+-----------+----------+---------
 id            | uuid                     |           | not null |
 email         | text                     |           | not null |
 name          | text                     |           | not null |
 subscribed_at | timestamp with time zone |           | not null |
Indexes:
    "subscriptions_pkey" PRIMARY KEY, btree (id)
    "subscriptions_email_key" UNIQUE CONSTRAINT, btree (email)
```

Note that Dbmate has allocated a table to itself, `schema_migrations`, for tracking what it's done to your system and when. Try not to conflict with it, okay?

Every complex app has a configuration, and there are plenty of different ways the configuration can be specified: environment variables, internal defaults, and configuration files, the last of which come in so many different flavors. Rust has a well-known [config](https://docs.rs/config/latest/config/index.html) crate that supports all the most common configuration formats: YAML, JSON, TOML; you can even add your own by writing something that implements the `config::Format` trait. Add it to Cargo.toml:

``` sh
$ cargo add config
```

For the meantime, we're just going to create a new file, called `configuration.rs`, and put our configuration details in there. Right now we have a single configuration detail: the port. I'm going to go above and beyond Lucas here and configure some internal defaults for my code. It will have expectations.
First, you have to tell Serde that there will be default values:

``` rust
use config::Config;

#[derive(serde::Deserialize)]
#[serde(default)] // fall back to Default::default() for missing fields
pub struct Settings {
    pub port: u16,
}
```

Then, you have to set those default values. Fortunately, Rust provides a "set default values" trait named, sensibly enough, Default:

``` rust
impl Default for Settings {
    fn default() -> Self {
        Settings { port: 3001 }
    }
}
```

Again, exceeding the book's parameters, I'm going to say that if the file is missing, the default parameters should hold:

``` rust
pub(crate) fn get_configuration() -> Result<Settings, config::ConfigError> {
    Config::builder()
        .add_source(config::File::with_name("./ztd.config").required(false))
        .build()?
        .try_deserialize()
}
```

And since this is the first time I'm doing this, I'm going to write a test to assert that my understanding of how this all works is correct:

``` rust
mod tests {
    use super::*;

    #[test]
    fn test_for_defaults() {
        let maybe_config = get_configuration();
        assert!(maybe_config.is_ok());
        let config = maybe_config.unwrap();
        assert_eq!(config.port, 3001);
    }
}
```
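The `Default` wiring above is plain Rust and can be exercised without the config crate; a standalone sketch with a hypothetical two-field `Settings`-like struct (the `verbose` field is my own illustration, not part of the project):

```rust
#[derive(Debug, PartialEq)]
struct Settings {
    port: u16,
    verbose: bool, // hypothetical second field, for illustration
}

// Default supplies the values used when nothing overrides them.
impl Default for Settings {
    fn default() -> Self {
        Settings { port: 3001, verbose: false }
    }
}

fn main() {
    // Everything defaulted.
    let settings = Settings::default();
    assert_eq!(settings.port, 3001);

    // Struct-update syntax: set one field, default the rest.
    // This is the same mechanism serde's `#[serde(default)]` leans on.
    let overridden = Settings { port: 8080, ..Default::default() };
    assert_eq!(overridden.port, 8080);
    assert!(!overridden.verbose);
}
```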
2023-03-24 14:51:19 +00:00
Added Telemetry: Logging and Analytics

In Chapter 4, Palmieri focuses on logging and telemetry. Axum is very different from Actix, and my first foray into trying to understand it led me to [Tower](https://docs.rs/tower/latest/tower/), the Rust community's de-facto standard for modular networking development and design. I completely short-circuited much of what the book recommended and, instead, just went with the most basic implementation possible. I added the tracing libraries as recommended by the Axum developers, and then implemented the first level of tracing as recommended by Tower:

``` sh
$ cargo add --features tower-http/trace,tracing tower tower-http tracing tracing-subscriber
```

And then I updated the app startup code to include it:

``` rust
pub async fn app(configuration: &Settings) -> Router {
    tracing_subscriber::fmt::init();

    let pool = PgPoolOptions::new()
        .max_connections(50)
        .connect(&configuration.database.url())
        .await
        .expect("could not connect to database_url");

    routes().layer(Extension(pool)).layer(TraceLayer::new_for_http())
}
```

That is literally all that was needed. And the output is:

``` plaintext
2023-03-25T16:49:06.385563Z DEBUG request{method=GET uri=/ version=HTTP/1.1}: tower_http::trace::on_request: started processing request
2023-03-25T16:49:06.386270Z DEBUG request{method=GET uri=/ version=HTTP/1.1}: tower_http::trace::on_response: finished processing request latency=0 ms status=200
```

That's not great logging, but it's a start. As I understand it, `tracing_subscriber::fmt::init()` builds the formatter and installs it as the process-wide global default subscriber, which is why nothing appears to be saved or stored anywhere you can see. The deeper Rust gets, the wilder it seems.

What I did manage was to create, [as recommended by Chris Allen](https://bitemyapp.com/blog/notes-on-zero2prod-rust/), a very simple Layer that shoves a new object into the collection of data being passed around by the request. That object contains a unique UUID for the session being processed.
Since Tokio is a multi-threaded system, having a UUID allows us to trace each individual request from beginning to end... provided I've hooked up my handlers just right. I learned most of this by reading the [Axum Session source code](https://docs.rs/axum-sessions/latest/src/axum_sessions/session.rs.html), which implements something much more complex.

Since we're at a deeper level of the service handling, I need a function that takes a Request and returns a Response, and in the middle inserts a SessionId into the Request passed in; by giving the type a name, any handlers can now find and use that SessionId:

``` rust
// In file `session_id.rs`
pub struct SessionId(pub Uuid);

pub async fn session_id<B>(mut req: Request<B>, next: Next<B>) -> Result<Response, StatusCode> {
    req.extensions_mut().insert(SessionId(Uuid::new_v4()));
    Ok(next.run(req).await)
}
```

With that, I now need to add it to the layers initialized with the app object:

``` rust
// In lib.rs, `pub async fn app()`:
routes()
    .layer(Extension(pool))
    .layer(TraceLayer::new_for_http())
    .layer(middleware::from_fn(session_id::session_id))
```

And with that, the SessionId is available. Since it's the outermost layer, it can now be used by anything deeper in. Let's add it to the `subscribe` function:

``` rust
// In routes/subscribe.rs, `subscribe()`:
pub(crate) async fn subscribe(
    Extension(session): Extension<SessionId>,
    Extension(pool): Extension<PgPool>,
    payload: Option<Form<NewSubscription>>,
) -> Result<(StatusCode, ()), ZTPError> {
    if let Some(payload) = payload {
        // Multi-line strings in Rust. Ugly. Would have preferred a macro.
        let sql = r#"INSERT INTO subscriptions
            (id, email, name, subscribed_at)
            VALUES ($1, $2, $3, $4);"#.to_string();
        let subscription: Subscription = (&(payload.0)).into();
        tracing::info!(
            "request_id {} - Adding '{}' as a new subscriber.",
            session.0.to_string(),
            subscription.name
        );
        // ...
```

And with that, every Request now has a strong ID associated with it:

``` plaintext
2023-03-26T22:19:23.305421Z INFO ztp::routes::subscribe: request_id d0f4a6e7-de0d-48bc-902b-713901c1d63b - Adding 'Elf M. Sternberg' as a new subscriber.
```

That's a very noisy trace; I'd like to start knocking it down to something more like a responsible log, or give me permission to format it the way I like. I'm also getting incredibly noisy messages from the `sqlx::query` call, including the text of the SQL template (the `let sql = ...` line above), which I really don't need every time someone makes a request, and which is horribly formatted for principled analytics.

Configuring it to return JSON turned out to be easy, although my first pass puzzled me. I had to turn `json` formatting on as a feature:

``` sh
$ cargo add --features=json tracing-subscriber
```

And then it was possible to configure the format:

``` rust
// in lib.rs, app():
// ...
let format = tracing_subscriber::fmt::format()
    .with_level(false)        // don't include levels in formatted output
    .with_thread_names(true)  // include the name of the current thread
    .json();
tracing_subscriber::fmt().event_format(format).init();
// ...
```

``` json
{
  "timestamp": "2023-03-26T22:53:13.091366Z",
  "fields": {
    "message": "request_id 479014e2-5f13-4e12-8401-34d8f8bf1a18 - Adding 'Elf M. Sternberg' as a new subscriber."
  },
  "target": "ztp::routes::subscribe",
  "threadName": "tokio-runtime-worker"
}
```

This pretty much concludes my week-long foray into Palmieri's book; I'm not going to worry too much about the deployment stuff, since that's part of my daytime job and I'm not interested in going over it again. Overall, this was an excellent book for teaching me many of the basics, and it provides a really good introduction to the way application servers can be written in Rust.
I disagree with the premise that "the language doesn't mean anything to the outcome," as I've heard some people say, nor do I think using Rust is some kind of badge of honor. Instead, I think it's the mark of a responsible developer, one who can produce code that works well the first time. With some hard thinking about how types work (and some heavy-duty exposure to Haskell), Rust development can be your first thought, not your "I need speed!" thought, when developing HTTP-based application servers.
2023-03-26 23:03:24 +00:00
"tower-http",
Zero-to-Production Rust, up to Chapter 3.7. Since this book is about learning Rust, primarily in a microservices environment, this chapter focuses on installing Rust and describing the tools available to the developer. The easiest way to install Rust is to install the [Rustup](https://rustup.rs/) tool. It is one of those blind-trust-in-the-safety-of-the-toolchain things. For Linux and Mac users, the command is a shell script that installs to a user's local account: ``` $ curl --proto '=https' --tlsv1.2 -sSf https://sh.rustup.rs | sh ``` Once installed, you can install Rust itself: ``` $ rustup install toolchain stable ``` You should now have Rust compiler and the Rust build and packaging tool, known as Cargo: ``` $ rustc --version rustc 1.68.0 (2c8cc3432 2023-03-06) $ cargo --version cargo 1.68.0 (115f34552 2023-02-26) ``` I also installed the following tools: ``` $ rustup component add clippy rust-src rust-docs $ cargo install rustfmt rust-analyzer ``` - clippy: A powerful linter that provides useful advice above and beyond the compiler's basic error checking. - rustfmt: A formatting tool that provides a common format for most developers - rust-analyzer: For your IDE, rust-analyzer provides the LSP (Language Server Protocol) for Rust, giving you code completion, on-the-fly error definition, and other luxuries. Zero-to-Production's project is writing a web service that signs people up for an email newsletter. The first task in the book is to set up a "Hello World!" application server. The book uses the [Actix-web](https://actix.rs/) web framework, but I've chosen to implement it using [Axum](https://github.com/tokio-rs/axum) server, the default server provided by the [Tokio](https://github.com/tokio-rs/tokio) asynchronous runtime. Although the book is only two years old, it is already out-of-date with respect to some commands. `cargo add` is now provided by default. 
The following commands installed the tools I'll be using: ``` cargo add --features tokio/full --features hyper/full tokio hyper \ axum tower tracing tracing-subscriber ``` - axum: The web server framework for Tokio. - tokio: The Rust asynchronous runtime. Has single-threaded (select) and multi-threaded variants. - [hyper](https://hyper.rs/): An HTTPS request/response library, used for testing. - [tracing](https://crates.io/crates/tracing): A debugging library that works with Tokio. We start by defining the core services. In the book, they're a greeter ("Hello, World"), a greeter with a parameter ("Hello, {name}"), and a health check (returns a HTTP 200 Code, but no body). Actix-web hands a generic Request and expects a generic request, but Axum is more straightforward, providing `IntoResponse` handlers for most of the basic Rust types, as well as some for formats via Serde, Rust's standard serializing/deserializing library for converting data from one format to another. All of these go into `src/lib.rs`: ``` async fn health_check() -> impl IntoResponse { (StatusCode::OK, ()) } async fn anon_greet() -> &'static str { "Hello World!\n" } async fn greet(Path(name): Path<String>) -> impl IntoResponse { let greeting = String::from("He's dead, ") + name.as_str(); let greeting = greeting + &String::from("!\n"); (StatusCode::OK, greeting) } ``` <aside>Axum's documentation says to [avoid using `impl IntoResponse`](https://docs.rs/axum/latest/axum/response/index.html#regarding-impl-intoresponse) until you understand how it really works, as it can result in confusing issues when chaining response handlers, when a handler can return multiple types, or when a handler can return either a type or a [`Result<T, E>`](https://doc.rust-lang.org/std/result/), especially one with an error.</aside> We then define the routes that our server will recognize. 
This is straightforward and familiar territory:

```
fn app() -> Router {
    Router::new()
        .route("/", get(anon_greet))
        .route("/:name", get(greet))
        .route("/health_check", get(health_check))
}
```

We then define a function to *run* the core server:

```
pub async fn run() {
    let addr = SocketAddr::from(([127, 0, 0, 1], 3000));
    tracing::info!("listening on {}", addr);
    axum::Server::bind(&addr)
        .serve(app().into_make_service())
        .await
        .unwrap()
}
```

And finally, in a file named `src/main.rs`, we instantiate the server:

```
use ztp::run;

#[tokio::main]
async fn main() {
    run().await
}
```

To make this work, we need to define what `ztp` means, and make a distinction between the library and the CLI program. In the project root's `Cargo.toml` file, the first three sections define these relationships:

```
[package]
name = "ztp"
version = "0.1.0"
edition = "2021"

[lib]
path = "src/lib.rs"

[[bin]]
path = "src/main.rs"
name = "ztp"
```

It is the `package.name` key that defines how the `use` statement in `main.rs` will find the library. The `[[bin]]` clause defines the name of the binary when it is generated.

<aside>The double brackets around the `[[bin]]` clause tell the TOML parser that there can be more than one binary. There can be only one library crate per package, but a package may contain multiple binary crates, and a project (a "workspace") may contain multiple packages.</aside>

This project should now be runnable. In one window, type:

```
$ cargo run
```

And in another, type and see the replies:

```
$ curl http://localhost:3000/
Hello World!
$ curl http://localhost:3000/Jim
He's dead, Jim!
$ curl -v http://localhost:3000/health_check
> GET /health_check HTTP/1.1
> Host: localhost:3000
> User-Agent: curl/7.81.0
> Accept: */*
< HTTP/1.1 200 OK
< content-length: 0
< date: Tue, 21 Mar 2023 00:16:43 GMT
```

In the last command, the *verbose* flag shows us what we sent to the server, and what came back.
We expected a "200 OK" status and a zero-length body, and that's what we got.

To unit-test a web server, we must spawn a copy of it so we can exercise its functions. We'll use Tokio's `spawn` function to start a new server, use hyper to request data from it, and finally use Rust's own native test asserts to check that we got what we expected.

```
#[cfg(test)]
mod tests {
    use super::*;
    use axum::{
        body::Body,
        http::{Request, StatusCode},
    };
    use std::net::{SocketAddr, TcpListener};

    #[tokio::test]
    async fn the_real_deal() {
        let listener =
            TcpListener::bind("127.0.0.1:0".parse::<SocketAddr>().unwrap()).unwrap();
        let addr = listener.local_addr().unwrap();
        tokio::spawn(async move {
            axum::Server::from_tcp(listener)
                .unwrap()
                .serve(app().into_make_service())
                .await
                .unwrap();
        });
        let response = hyper::Client::new()
            .request(
                Request::builder()
                    .uri(format!("http://{}/", addr))
                    .body(Body::empty())
                    .unwrap(),
            )
            .await
            .unwrap();
        assert_eq!(response.status(), StatusCode::OK);
        let body = hyper::body::to_bytes(response.into_body()).await.unwrap();
        assert_eq!(&body[..], b"Hello World!\n");
    }
}
```

One interesting trick to observe in this test is the port number specified in the `TcpListener` call: it's zero. When the port is zero, `TcpListener` asks the kernel for the first free port. Normally you'd want to know exactly which port to call the server on, but here both ends of the conversation learn the port at runtime, which guarantees the test doesn't hard-code a port that is inconveniently already in use by someone else.
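The port-zero trick is plain standard-library behavior, so it's easy to demonstrate on its own, independent of Axum. A minimal sketch:

```rust
use std::net::TcpListener;

fn main() {
    // Port 0 asks the kernel to pick the first free port.
    let listener = TcpListener::bind("127.0.0.1:0").unwrap();
    let addr = listener.local_addr().unwrap();

    // The kernel substituted a real, currently unused port.
    assert_ne!(addr.port(), 0);

    // A second bind to port 0 yields a different free port while the
    // first listener is still open, so two test servers started this
    // way can't collide.
    let listener2 = TcpListener::bind("127.0.0.1:0").unwrap();
    assert_ne!(listener2.local_addr().unwrap().port(), addr.port());

    println!("listening on {}", addr);
}
```

This is also why the test reads the address back with `local_addr()` rather than constructing it: only the kernel knows which port was actually assigned.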