From 9798e80deb7c69cf9412a7ab97a9852f4e31ef2c Mon Sep 17 00:00:00 2001
From: Paul Ebose
Date: Fri, 27 Feb 2026 03:41:27 +0100
Subject: [PATCH] improve grammar in 103_tokenization

---
 exercises/103_tokenization.zig | 3 +--
 1 file changed, 1 insertion(+), 2 deletions(-)

diff --git a/exercises/103_tokenization.zig b/exercises/103_tokenization.zig
index 972e8e8..5c88c21 100644
--- a/exercises/103_tokenization.zig
+++ b/exercises/103_tokenization.zig
@@ -24,8 +24,7 @@
 // suited to understand the basic principles.
 //
 // In the following exercises we will also read and process data from
-// large files and at the latest then it will be clear to everyone how
-// useful all this is.
+// large files; it will then be clearer to you how useful all this is.
 //
 // Let's start with the analysis of the example from the Zig homepage
 // and explain the most important things.