From 9798e80deb7c69cf9412a7ab97a9852f4e31ef2c Mon Sep 17 00:00:00 2001
From: Paul Ebose
Date: Fri, 27 Feb 2026 03:41:27 +0100
Subject: [PATCH 1/2] improve grammar in 103_tokenization

---
 exercises/103_tokenization.zig | 3 +--
 1 file changed, 1 insertion(+), 2 deletions(-)

diff --git a/exercises/103_tokenization.zig b/exercises/103_tokenization.zig
index 972e8e8..5c88c21 100644
--- a/exercises/103_tokenization.zig
+++ b/exercises/103_tokenization.zig
@@ -24,8 +24,7 @@
 // suited to understand the basic principles.
 //
 // In the following exercises we will also read and process data from
-// large files and at the latest then it will be clear to everyone how
-// useful all this is.
+// large files, it will then be clearer to you how useful all this is.
 //
 // Let's start with the analysis of the example from the Zig homepage
 // and explain the most important things.

From 7d03b8464d29051e3269a810cc9670c190dab165 Mon Sep 17 00:00:00 2001
From: Paul Ebose
Date: Fri, 27 Feb 2026 19:27:04 +0100
Subject: [PATCH 2/2] update patch file for 103_tokenization

---
 patches/patches/103_tokenization.patch | 6 +++---
 1 file changed, 3 insertions(+), 3 deletions(-)

diff --git a/patches/patches/103_tokenization.patch b/patches/patches/103_tokenization.patch
index 01d68fc..941ca3b 100644
--- a/patches/patches/103_tokenization.patch
+++ b/patches/patches/103_tokenization.patch
@@ -1,6 +1,6 @@
---- exercises/103_tokenization.zig	2023-10-05 21:57:23.245974688 +0200
-+++ answers/103_tokenization.zig	2023-10-05 22:06:08.319119156 +0200
-@@ -136,7 +136,7 @@
+--- exercises/103_tokenization.zig	2026-02-27 19:25:11
++++ answers/103_tokenization.zig	2026-02-27 19:26:04
+@@ -134,7 +134,7 @@
  ;
 
  // now the tokenizer, but what do we need here?