import "github.com/bitfield/script"

script is a Go library for doing the kind of tasks that shell scripts are good at: reading files, executing subprocesses, counting lines, matching strings, and so on.

Why shouldn't it be as easy to write system administration programs in Go as it is in a typical shell? script aims to make it just that easy.

Shell scripts often compose a sequence of operations on a stream of data (a pipeline). This is how script works, too.
This is one absolutely superb API design. Taking inspiration from shell pipes and turning it into a Go library with syntax this clean is really impressive.
—Simon Willison
Read more: Scripting with Go
If you're already familiar with shell scripting and the Unix toolset, here is a rough guide to the equivalent script operation for each listed Unix command.

| Unix / shell | script equivalent |
| --- | --- |
| (any program name) | Exec() |
| [ -f FILE ] | IfExists() |
| > | WriteFile() |
| >> | AppendFile() |
| $* | Args() |
| basename | Basename() |
| cat | File() / Concat() |
| cut | Column() |
| dirname | Dirname() |
| echo | Echo() |
| grep | Match() / MatchRegexp() |
| grep -v | Reject() / RejectRegexp() |
| head | First() |
| find -type f | FindFiles() |
| jq | JQ() |
| ls | ListFiles() |
| sed | Replace() / ReplaceRegexp() |
| sha256sum | SHA256Sum() / SHA256Sums() |
| tail | Last() |
| uniq -c | Freq() |
| wc -l | CountLines() |
| xargs | ExecForEach() |
Let's see some simple examples. Suppose you want to read the contents of a file as a string:
contents, err := script.File("test.txt").String()
That looks straightforward enough, but suppose you now want to count the lines in that file.
numLines, err := script.File("test.txt").CountLines()
For something a bit more challenging, let's try counting the number of lines in the file that match the string "Error":
numErrors, err := script.File("test.txt").Match("Error").CountLines()
But what if, instead of reading a specific file, we want to simply pipe input into this program, and have it output only matching lines (like grep)?
script.Stdin().Match("Error").Stdout()
Just for fun, let's filter all the results through some arbitrary Go function:
script.Stdin().Match("Error").FilterLine(strings.ToUpper).Stdout()
That was almost too easy! So let's pass in a list of files on the command line, and have our program read them all in sequence and output the matching lines:
script.Args().Concat().Match("Error").Stdout()
Maybe we're only interested in the first 10 matches. No problem:
script.Args().Concat().Match("Error").First(10).Stdout()
What's that? You want to append that output to a file instead of printing it to the terminal? You've got some attitude, mister.
script.Args().Concat().Match("Error").First(10).AppendFile("/var/log/errors.txt")
If the data is JSON, we can do better than simple string-matching. We can use JQ queries:
script.File("commits.json").JQ(".[0] | {message: .commit.message, name: .commit.committer.name}").Stdout()
Suppose we want to execute some external program instead of doing the work ourselves. We can do that too:
script.Exec("ping 127.0.0.1").Stdout()
But maybe we don't know the arguments yet; we might get them from the user, for example. We'd like to be able to run the external command repeatedly, each time passing it the next line of input. No worries:
script.Args().ExecForEach("ping -c 1 {{.}}").Stdout()
If there isn't a built-in operation that does what we want, we can just write our own:
script.Echo("hello world").Filter(func(r io.Reader, w io.Writer) error {
	n, err := io.Copy(w, r)
	fmt.Fprintf(w, "\nfiltered %d bytes\n", n)
	return err
}).Stdout()
// Output:
// hello world
// filtered 11 bytes
Notice that the "hello world" appeared before the "filtered n bytes". Filters run concurrently, so the pipeline can start producing output before the input has been fully read.
If we want to scan input line by line, we could do that with a Filter function that creates a bufio.Scanner on its input, but we don't need to:
script.Echo("a\nb\nc").FilterScan(func(line string, w io.Writer) {
	fmt.Fprintf(w, "scanned line: %q\n", line)
}).Stdout()
// Output:
// scanned line: "a"
// scanned line: "b"
// scanned line: "c"
And there's more. Much more. Read the docs for full details, and more examples.
Let's use script to write a program that system administrators might actually need. One thing I often find myself doing is counting the most frequent visitors to a website over a given period of time. Given an Apache log in the Common Log Format like this:
212.205.21.11 - - [30/Jun/2019:17:06:15 +0000] "GET / HTTP/1.1" 200 2028 "https://example.com/" "Mozilla/5.0 (Linux; Android 8.0.0; FIG-LX1 Build/HUAWEIFIG-LX1) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/64.0.3282.156 Mobile Safari/537.36"
we would like to extract the visitor's IP address (the first column in the logfile), and count the number of times this IP address occurs in the file. Finally, we might like to list the top 10 visitors by frequency. In a shell script we might do something like:
cut -d' ' -f 1 access.log | sort | uniq -c | sort -rn | head
There's a lot going on there, and it's pleasing to find that the equivalent script program is quite brief:
package main

import (
	"github.com/bitfield/script"
)

func main() {
	script.Stdin().Column(1).Freq().First(10).Stdout()
}
Let's try it out with some sample data:
16 176.182.2.191
7 212.205.21.11
1 190.253.121.1
1 90.53.111.17
See pkg.go.dev for the full documentation, or read on for a summary.
These are functions that create a pipe with given contents:
| Source | Contents |
| --- | --- |
| Args | command-line arguments |
| Echo | a string |
| Exec | command output |
| File | file contents |
| FindFiles | recursive file listing |
| IfExists | do something only if some file exists |
| ListFiles | file listing (including wildcards) |
| Slice | slice elements, one per line |
| Stdin | standard input |
Filters are methods on an existing pipe that also return a pipe, allowing you to chain filters indefinitely. The filters modify each line of their input according to the following rules:
| Filter | Results |
| --- | --- |
| Basename | removes leading path components from each line, leaving only the filename |
| Column | Nth column of input |
| Concat | contents of multiple files |
| Dirname | removes filename from each line, leaving only leading path components |
| Echo | all input replaced by given string |
| Exec | filtered through external command |
| ExecForEach | execute given command template for each line of input |
| Filter | user-supplied function filtering a reader to a writer |
| FilterLine | user-supplied function filtering each line to a string |
| FilterScan | user-supplied function filtering each line to a writer |
| First | first N lines of input |
| Freq | frequency count of unique input lines, most frequent first |
| Join | replace all newlines with spaces |
| JQ | result of jq query |
| Last | last N lines of input |
| Match | lines matching given string |
| MatchRegexp | lines matching given regexp |
| Reject | lines not matching given string |
| RejectRegexp | lines not matching given regexp |
| Replace | matching text replaced with given string |
| ReplaceRegexp | matching text replaced with given string |
| SHA256Sums | SHA-256 hashes of each listed file |
Note that filters run concurrently, rather than producing nothing until each stage has fully read its input. This is convenient for executing long-running commands, for example. If you do need to wait for the pipeline to complete, call Wait.
Sinks are methods that return some data from a pipe, ending the pipeline and extracting its full contents in a specified way:
| Sink | Destination | Results |
| --- | --- | --- |
| AppendFile | appended to file, creating if it doesn't exist | bytes written, error |
| Bytes | | data as []byte, error |
| CountLines | | number of lines, error |
| Read | given []byte | bytes read, error |
| SHA256Sum | | SHA-256 hash, error |
| Slice | | data as []string, error |
| Stdout | standard output | bytes written, error |
| String | | data as string, error |
| Wait | | none |
| WriteFile | specified file, truncating if it exists | bytes written, error |
| Version | New |
| --- | --- |
| v0.20.0 | JQ |
See the contributor's guide for some helpful tips if you'd like to contribute to the script project.
Gopher image by MariaLetta