
Barak Amar

October 20, 2021

Overview

The templating package text/template implements data-driven templates for generating textual output. Although we do not benefit from executing the same parsed template more than once, we found it easy to use and helpful for producing colored text, marshaling data, and rendering tabular information.

By mapping additional functions by name, it is possible to extend the template engine with more functionality. A function can accept input from the template engine as an argument and return a value that will be rendered into the output.

The mapping must be done before we call the template's Parse function.

To illustrate how this can be done, let’s look at the following example:

rand.Seed(time.Now().UnixNano())
tmpl := template.Must(template.
	New("").
	Funcs(map[string]interface{}{
		"rand": func() int {
			return rand.Intn(100)
		},
	}).
	Parse(`Hi {{.}}, you are number {{rand}}.`))
_ = tmpl.Execute(os.Stdout, "User")

Output:

Hi User, you are number 6.

In this case we map the name rand to a function that calls rand.Intn, which returns a random number between 0 and 99 (100 is excluded). The template engine takes the return value and renders it as a string in the output.

Each mapped function must have a name that is valid inside a template (letters, digits, and underscores, not starting with a digit) and must return either a single value, or a value and an error.
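As a sketch of the two allowed shapes, the snippet below maps one function of each kind. The names upper, first, and the render helper are our own, chosen for illustration; only the text/template mechanics are from the standard library.

```go
package main

import (
	"fmt"
	"strings"
	"text/template"
)

// render maps one function of each allowed shape and executes the template.
// (Helper name is our own, for illustration.)
func render(s string) string {
	tmpl := template.Must(template.
		New("").
		Funcs(map[string]interface{}{
			// shape 1: a single return value
			"upper": strings.ToUpper,
			// shape 2: a value plus an error; a non-nil error aborts Execute
			"first": func(s string) (string, error) {
				if s == "" {
					return "", fmt.Errorf("empty string")
				}
				return s[:1], nil
			},
		}).
		Parse(`{{ . | upper }} starts with {{ . | first }}`))
	var sb strings.Builder
	_ = tmpl.Execute(&sb, s)
	return sb.String()
}

func main() {
	fmt.Println(render("hello")) // HELLO starts with h
}
```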

Let us take a look at some use-cases from our CLI now.

Colorful Output

In order to get some colorful text, we use the go-pretty package.

One of the benefits of using this package is the set of colorized text functions it provides, along with the option to enable or disable color support completely.

Let’s map some colors to our template:

data := struct {
	Passed int
	Failed int
}{
	Passed: 1,
	Failed: 5,
}
tmpl := template.Must(template.
	New("").
	Funcs(map[string]interface{}{
		"red": func(v interface{}) string {
			return text.FgHiRed.Sprint(v)
		},
		"green": func(v interface{}) string {
			return text.FgHiGreen.Sprint(v)
		},
		"yellow": func(v interface{}) string {
			return text.FgHiYellow.Sprint(v)
		},
	}).
	Parse(`{{ "Results" | yellow }}
Passed: {{ .Passed | green }}
Failed: {{ .Failed | red }}
`))
_ = tmpl.Execute(os.Stdout, data)

Using the term package, we can detect whether our output is going to a terminal. A user running the command interactively will want colored output, but when the output is redirected to a file or piped into another command for processing, the consumer will not want to deal with the color escape codes.

By checking for a terminal we can disable colors in the go-pretty package; running the example above will then produce the same output as plain text:

if !term.IsTerminal(int(os.Stdout.Fd())) {
	text.DisableColors()
}

Data as JSON

A common use-case is needing to print out a data model – configuration, server response, or other complex structure. JSON is often used for this purpose.

We can easily render our output by adding a json function to our mapping:

data := struct {
	ID int `json:"id"`
	UpdateTime time.Time `json:"update_time"`
	Path string `json:"path,omitempty"`
}{
	ID: 1,
	UpdateTime: time.Now(),
	Path: "path/to/data",
}
tmpl := template.Must(template.
	New("").
	Funcs(map[string]interface{}{
		"json": func(v interface{}) (string, error) {
			b, err := json.MarshalIndent(v, "", "  ")
			if err != nil {
				return "", err
			}
			return string(b), nil
		},
	}).
	Parse(`Record information {{ . | json }}`))
_ = tmpl.Execute(os.Stdout, data)

Output:

Record information {
  "id": 1,
  "update_time": "2021-10-18T21:18:25.973140953+03:00",
  "path": "path/to/data"
}

Note that the mapped function in this case also returns an error. The Execute method will return an error if our call to MarshalIndent fails.
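To see this behavior in isolation, here is a self-contained sketch (renderJSON is our own helper name, not part of any package) showing that the error from the mapped function surfaces as the return value of Execute:

```go
package main

import (
	"encoding/json"
	"fmt"
	"io"
	"text/template"
)

// renderJSON executes a template using the json mapped function and
// returns the error from Execute, if any. (Helper name is our own.)
func renderJSON(v interface{}) error {
	tmpl := template.Must(template.
		New("").
		Funcs(map[string]interface{}{
			"json": func(v interface{}) (string, error) {
				b, err := json.MarshalIndent(v, "", "  ")
				if err != nil {
					return "", err
				}
				return string(b), nil
			},
		}).
		Parse(`{{ . | json }}`))
	return tmpl.Execute(io.Discard, v)
}

func main() {
	// A map marshals cleanly, so Execute returns nil.
	fmt.Println(renderJSON(map[string]int{"id": 1}) == nil) // true
	// Channels cannot be marshaled to JSON, so Execute reports an error.
	fmt.Println(renderJSON(make(chan int)) == nil) // false
}
```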

Tables

Processing and displaying data in a tabular format is common. We define a shared data structure to model a table, which the function we supply to our template will render.

// go-pretty defines: type Row []interface{} - each cell holds any value

type Table struct {
	Headers table.Row
	Rows	[]table.Row
}

The Table structure consists of headers and rows to model any tabular information.

The number of headers reflects the number of cells in each row (we assume it is aligned).

Like we did with our colored output, rendering a table into the terminal will not be the same as rendering it to a file.

According to the value of isTerminal (assuming it was set using term.IsTerminal), we render either an aligned table or CSV, which is easier to process programmatically.

tmpl := template.Must(template.
	New("").
	Funcs(map[string]interface{}{
		"table": func(tab *Table) string {
			w := table.NewWriter()
			w.AppendHeader(tab.Headers)
			w.AppendRows(tab.Rows)
			if isTerminal {
				return w.Render()
			}
			return w.RenderCSV()
		},
	}).
	Parse(`{{ . | table }}`))
tbl := &Table{
	Headers: table.Row{"id", "path"},
	Rows:    []table.Row{{1, "file1"}, {2, "file2"}, {3, "file3"}},
}
_ = tmpl.Execute(os.Stdout, tbl)

Output in a terminal:

+----+-------+
| ID | PATH  |
+----+-------+
|  1 | file1 |
|  2 | file2 |
|  3 | file3 |
+----+-------+

Output for non-terminal:

id,path
1,file1
2,file2
3,file3

Additional Ideas

Looking back at code always raises ideas of how to improve the code or make it easier to use. For example:

  • Use reflect to model tabular information – use struct tags or an additional descriptor to describe the model. This would avoid the transformation into a Table when it is not needed.
  • Render a table from a continuous data source – APIs that pull information often rely on pagination. By using reflection to render the data and abstracting the way we pull it, we can have a common way to fetch and display data continuously.
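The first idea could be sketched roughly as follows. Row here stands in for go-pretty's table.Row, and rowsFromStructs is a hypothetical helper of our own, not part of any package; it uses reflection to build headers and rows from any slice of structs, using field names as headers.

```go
package main

import (
	"fmt"
	"reflect"
)

// Row and Table mirror the structures from the post (Row stands in for
// go-pretty's table.Row, which is []interface{}).
type Row []interface{}

type Table struct {
	Headers Row
	Rows    []Row
}

// Entry is a sample record type for the sketch.
type Entry struct {
	ID   int
	Path string
}

// rowsFromStructs is a hypothetical helper: it reflects over a slice of
// structs, taking field names as headers and field values as cells.
func rowsFromStructs(slice interface{}) *Table {
	v := reflect.ValueOf(slice)
	t := v.Type().Elem()
	tab := &Table{}
	for i := 0; i < t.NumField(); i++ {
		tab.Headers = append(tab.Headers, t.Field(i).Name)
	}
	for i := 0; i < v.Len(); i++ {
		var row Row
		for j := 0; j < t.NumField(); j++ {
			row = append(row, v.Index(i).Field(j).Interface())
		}
		tab.Rows = append(tab.Rows, row)
	}
	return tab
}

func main() {
	tab := rowsFromStructs([]Entry{{1, "file1"}, {2, "file2"}})
	fmt.Println(tab.Headers, len(tab.Rows)) // [ID Path] 2
}
```

A production version would also need to handle pointers, embedded fields, and struct tags, but the shape above shows how the Table transformation could be generated instead of written by hand.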

About lakeFS

The lakeFS project is an open source technology that provides a git-like version control interface for data lakes, with seamless integration to popular data tools and frameworks.

Our mission is to maximize the manageability of open source data analytics solutions that scale.
