Seems quite cool! Though the demo gif with (what seem to be?) broken icons is a bold choice :p
w108bmg 13 hours ago [-]
lol good catch, and I totally missed it
I used vhs to record the gif which must not run the script in my native terminal! I’ll have to see about fixing it!
wonger_ 1 hours ago [-]
One solution could be to use the docker version of vhs, and edit the dockerfile to pull your desired font.
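A rough sketch of that approach (untested; the image name comes from the vhs README, while the font filename and paths are placeholders):

```
# Bake a Nerd Font into a derived vhs image so the icons render,
# then run the tape inside it instead of the native terminal.
cat > Dockerfile <<'EOF'
FROM ghcr.io/charmbracelet/vhs
COPY JetBrainsMonoNerdFont-Regular.ttf /usr/local/share/fonts/
RUN fc-cache -f
EOF
docker build -t vhs-nerdfont .
docker run --rm -v "$PWD":/vhs vhs-nerdfont demo.tape
# The .tape file likely also needs a matching: Set FontFamily "JetBrainsMono Nerd Font"
```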
yonatan8070 12 hours ago [-]
I've seen a lot of people use asciinema to record and share terminal recordings, it works quite well
nightpool 2 hours ago [-]
How would that solve the font problem? I feel like that would only make the problem of having unsupported fonts even worse.
diggan 5 hours ago [-]
You still end up having to turn it into a GIF if you want it to autoplay on GitHub's markdown viewer, or video if you want it to run on the page but require a click-to-play.
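If you do go that route, the conversion step is short; here's a sketch using agg, asciinema's cast-to-GIF converter (filenames are placeholders):

```
# Record a cast, then turn it into a GIF that GitHub's markdown viewer will autoplay.
asciinema rec demo.cast
agg demo.cast demo.gif
```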
yonatan8070 2 hours ago [-]
Huh, I thought you could embed those into READMEs on GitHub, but it turns out you can't
sdegutis 13 hours ago [-]
I didn't notice, I was too busy seeing how impressive and useful this tool is.
And with fuzzy matching built in? Just amazing. Good job OP.
b0a04gl 1 hours ago [-]
crazy good, so it'll skip showing files listed in .gitignore while listing?
drabbiticus 14 hours ago [-]
First off, the display looks great!
Second off, I didn't realize how deep the dep tree would be for this type of program -- 141 total! So much of it is the url crate, itself a dep of the git crate, but there's a bunch of others too. I'm just getting into learning Rust -- is this typical of Rust projects or perhaps typical of TUI projects in general?
(EDIT to strikeout) ~~The binary is also 53M as a result whereas /usr/sbin/tree is 80K on my machine -- not really a problem on today's storage, but very roughly 500-1000x different in size isn't nothing.~~
Maybe it's linking-related? I don't know how to check really.
(EDIT: many have pointed out that you can run `cargo build --release` with other options to get a much smaller binary. Thanks for teaching me!)
JoshTriplett 13 hours ago [-]
> The binary is also 53M
That's a debug binary, and the vast majority of that is debug symbols. A release build of this project is 4.3M, an order of magnitude smaller.
Also, compiling out the default features of the git2 crate eliminates several dependencies and reduces it further to 3.6M.
https://github.com/bgreenwell/lstr/pull/5
https://github.com/rust-lang/git2-rs/pull/1168
Stripping the binary further improves it to 2.9M, and some further optimizations get it to 2.2M without any compromise to performance. (You can get it smaller by optimizing for size, but I wouldn't recommend that unless you really do value size more than performance.)
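For reference, a rough sketch of where those knobs live (merge by hand if the tables already exist in Cargo.toml; the git2 version below is just illustrative):

```
# Performance-neutral size savings go in the release profile.
cat >> Cargo.toml <<'EOF'
[profile.release]
strip = true        # drop debug symbols from release binaries
lto = true          # whole-program link-time optimization
codegen-units = 1   # better optimization at the cost of compile time
EOF
# And in [dependencies], turning off git2's default features looks like:
#   git2 = { version = "0.20", default-features = false }
```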
esafak 13 hours ago [-]
No offense, but 4.3MB is huge for what it does. Most shells take less space than that! Where's all the bloat coming from?
koito17 12 hours ago [-]
> Most shells take less space than that!
Most shells dynamically link to a runtime your OS provides "for free". The 4.3 MiB binary in question is bundling the Rust runtime and its dependencies.
For reference, a statically-compiled C++ "Hello, World" is 2.2 MiB after stripping.
2.2MiB for "Hello, World"? I must be getting old...
The executable takes 33KB in C, 75KB in Nim.
koito17 12 hours ago [-]
By switching to e.g. musl, you can go down to a single megabyte ;)
But in all seriousness, my example is quite cherry-picked, since nobody will actually statically link glibc. And even if they did, one can use link-time optimization to remove large swaths of unused code. Note that this is the same strategy one would employ to debloat their Rust binaries. (Use LTO, don't aggressively inline code, etc.)
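A rough way to reproduce the cherry-picked number, with the LTO lever added (sizes will vary a lot by toolchain and glibc version):

```
# Statically link a C++ "Hello, World" against glibc, strip it, and measure.
cat > hello.cpp <<'EOF'
#include <iostream>
int main() { std::cout << "Hello, World\n"; }
EOF
g++ -Os -flto -static -o hello hello.cpp   # expect warnings about statically linking glibc
strip hello
du -h hello
```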
3836293648 11 hours ago [-]
We just have large standard libraries now
surajrmal 3 hours ago [-]
LTO will remove most of it.
wahern 11 hours ago [-]
> Most shells dynamically link to a runtime your OS provides "for free"
Rust binaries also dynamically link to and rely on this runtime.
mtndew4brkfst 2 hours ago [-]
That's not intrinsically or pervasively true, although it's not uncommon.
fuzztester 4 hours ago [-]
why did you embed the c++ code in the .nix file?
just to have everything in one file? or to show how to do it with nix?
because it seems simpler to have a separate C++ file, and a simple shell script or makefile to compile it.
e.g., although I could figure out roughly what the .nix file does, many more people would know plain unix shell than nix.
and where is $out defined in the .nix file?
AnthOlei 3 hours ago [-]
The nix file is beside the point - it gives you a totally hermetic build environment. Not OP, but it’s the only way I know how to get gcc to use a static glibc. All you should pay attention to is that it’s using a static glibc.
$out is a magic variable in nix that means the output of the derivation - the directory that nix moves to its final destination
o11c 13 hours ago [-]
For reference, some statically-linked shells on my system:
2288K /bin/bash-static (per manual, "too big and too slow")
1936K /bin/busybox-static (including tools not just the shell)
192K /usr/lib/klibc/bin/mksh
2456K zsh-static
For comparison, some dynamically-linked binaries (some old)
(The reason I don't have static binaries handy is because they no longer run on modern systems. As long as you aren't using shitty libraries, dynamic binaries are more portable and reliable, contrary to internet "wisdom".)
JoshTriplett 12 hours ago [-]
Among the features it has: an interactive terminal GUI, threaded parallel directory walking, and git repository support. In around a thousand lines of code, total, including tests, half of which is the GUI.
oguz-ismail 12 hours ago [-]
*TUI. Not GUI
have-a-break 13 hours ago [-]
I feel like that's just the result of having a native package manager, which makes bloat natural, and a compiler which hasn't had decades of work.
I did some benchmarks on one of our CLIs and found that `opt-level = "z"` reduced the size from 2.68M to 2.28M, and shaved 10% off the build time, worth a try.
I'll try with `panic = "abort"` for our next release, thanks for the reminder.
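For anyone following along, both knobs live in the release profile (merge into any existing [profile.release] table):

```
cat >> Cargo.toml <<'EOF'
[profile.release]
opt-level = "z"   # optimize for size rather than speed
panic = "abort"   # skip generating unwinding code
EOF
```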
fabrice_d 14 hours ago [-]
You are probably looking at a debug build. On Linux, a release build (cargo build -r) is ~4.3M, and down to ~3.5M once stripped. This could be reduced further with some tricks applied to the release build profile.
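The quick version of that check (binary name assumed from the crate name):

```
cargo build -r
strip target/release/lstr
du -h target/release/lstr
```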
getcrunk 13 hours ago [-]
Great catch! Comments mentioned getting it down to ~2MB but that’s still humongous.
If you just think about it, 2MB can be roughly (napkin math) 100k LOC; that’s nuts
arlort 13 hours ago [-]
Is it though? You won't get it on an embedded device (maybe), but you could install a thousand of these tools and barely even notice the space being taken up on most machines
getcrunk 13 hours ago [-]
I think that’s a lame argument. First because it’s kind of a fallacy. Size is absolute not relative to something. Especially for software. No one thinks of software size primarily in the context of their disk space.
Further I think everyone keeps getting larger and larger memory because software keeps getting more and more bloated.
I remember when 64gb iPhone was more than enough (I don’t take pictures so just apps and data)
Now my 128 is getting uncomfortable due to the os and app sizes. My next phone likely will be a 256
pxc 11 minutes ago [-]
Size may be absolute, but bigness and smallness are inherently and inescapably relative.
hnlmorg 9 hours ago [-]
I’m usually the first to complain about bloat, but your counterpoints to the GP’s "lame arguments" are themselves fallacies.
> First because it’s kind of a fallacy. Size is absolute not relative to something. Especially for software. No one thinks of software size primarily in the context of their disk space.
That’s exactly how most people think about file sizes.
When your disk is full, you don’t delete the smallest files first. You delete the biggest.
> Further I think everyone keeps getting larger and larger memory because software keeps getting more and more bloated.
RAM sizes have actually stagnated over the last decade.
> I remember when 64gb iPhone was more than enough (I don’t take pictures so just apps and data) Now my 128 is getting uncomfortable due to the os and app sizes. My next phone likely will be a 256
That’s because media sizes increase, not executable sizes.
And people do want higher resolution cameras, higher definition videos, improved audio quality, etc. These are genuinely desirable features.
Couple that with improved internet bandwidth allowing content providers to push higher-bitrate media, while there is still a need to cache media locally.
nicoburns 9 hours ago [-]
> That’s because media sizes increase, not executable sizes.
Part of it is app sizes on mobile. But it's apps in the 200MB - 2GB range that are the problem, not ones that are single-digit megabytes.
hnlmorg 6 hours ago [-]
200MB apps wouldn’t even make a dent on a 64GB device.
The 2GB apps are usually so large because they include high quality media assets. For example, Spotify will frequently consume multiple GBs of storage, but the vast majority of that is audio cache.
nicoburns 5 hours ago [-]
I currently have 355 apps installed on my phone, so if they were all 200mb then they wouldn't fit on a 64GB device.
I agree that the largest data use tends to be media assets.
hnlmorg 4 hours ago [-]
I’m intrigued, how many of them are actual 3rd party apps though? And how many are different layers around an existing app or part of Apple’s / Google’s base OS? The latter, in fairness, consumes several GBs of storage too.
I’m not trying to dismiss your point here. Genuinely curious how you’ve accumulated so many app installs.
nicoburns 2 hours ago [-]
It's an interesting question. Some of them are definitely from the OS (either Google or Samsung).
Looking through the categories of apps where I have multiple, I'm seeing:
- Transport provider apps (Airlines, Trains, Buses, Taxis etc)
- Parking payment apps
- Food delivery apps
- Hotel apps
- Payment apps
- Messaging / Video calling apps
- Banking apps
- Mapping apps
It's especially easy to accumulate a lot of apps if you travel through multiple countries, as for a lot of these apps you need different ones in different countries.
ghosty141 7 hours ago [-]
> No one thinks of software size primarily in the context of their disk space.
This is wrong. The reason many old tools are so small is that you had far less space. If you have a 20TB hard drive you wouldn't care whether ls took up 1KB or 2MB; on a 1GB hard drive it matters, or mattered, much more.
Optimization takes time. I'm sure if OP wanted he could shrink the binary size by quite a lot, but doing so has its costs, and nowadays it's rarely worth paying them since nobody even notices whether a program is 2KB or 2MB. It doesn't matter anymore in the age of 1TB boot drives.
dotancohen 11 hours ago [-]
So bloated software is motivating you to spend more for the larger capacity phone?
What incentive does Apple have to help iOS devs get package sizes down, then?
vlovich123 12 hours ago [-]
When you include the code for all the dependency features this uses, you probably do end up close to 100k LoC net, no?
mtndew4brkfst 2 hours ago [-]
lib.rs has a nifty (and occasionally shocking) portrayal of this on their crate pages.
https://lib.rs/crates/lstr
says for this one the deps clock in at:
~19–29MB
~487K SLoC
ethan_smith 13 hours ago [-]
Try `cargo build --release --no-default-features` to get a much smaller binary (~5-10MB) - Rust statically links dependencies but supports conditional compilation for optional features.
aystatic 13 hours ago [-]
Glancing at the Cargo.toml, the package doesn't define any features anyways. `cargo b --no-default-features` only applies to the packages you're building, not their dependencies -- that would lead to very unpredictable behavior
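If you want to see which dependencies and features actually get pulled in, cargo's built-in tree view is handy:

```
# Both flags are part of stock cargo.
cargo tree -e features   # show which features each dependency enables
cargo tree --duplicates  # show crates that appear at multiple versions
```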
w108bmg 13 hours ago [-]
Really appreciate all the comments and useful feedback (first Rust package). Especially ways to reduce the size of the binary!
So after writing this to learn Rust, what are your thoughts on Rust? What do you especially like and dislike about it, or what were you surprised about?
w108bmg 13 hours ago [-]
I appreciate the ecosystem of packages that seem really well maintained. I don’t love the syntax and find Rust harder to read and learn so far compared to something like golang (I’m used to R which is not a compiled language but has a great dev community).
I do love the compiler and support tools built into Cargo (fmt, clippy, etc.).
sdegutis 12 hours ago [-]
That's been similar to my experience. The ecosystem is extremely polished and smooth, the build tools and package manager and IDE support, all of it. Especially compared to C++, which I could barely get working here.
rewgs 9 hours ago [-]
This looks awesome! Right up my alley.
Side-note: what theme are you using in the linked gif? It's right in the middle of my two favorite themes, onedark and gruvbox.
berkes 9 hours ago [-]
I really love all the "modern" takes on classic tools by the Rust community.
I'm using eza (aka exa), aliased as ls, which has "tree" built in (aliased as lt), amongst others, as a replacement for "ls", and it's one of my biggest productivity boosts in daily command-line use. Because eza has tree built in, and the tree is also insanely fast, I won't be needing this tool - yet. Maybe one day the interactive mode will pull me over.
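For reference, the aliases amount to roughly this (eza's tree mode is `--tree`/`-T`):

```
alias ls='eza'
alias lt='eza --tree'
```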
Congrats on releasing. And kudos to how well you've released it: solid README, good description, good-looking gifs with exactly the right feature highlights.
fer 7 hours ago [-]
On interactive mode "ls tree" tools, an interesting one is broot: https://github.com/Canop/broot