Make Ubuntu Terminal Look Like Kali Linux [2023]


In this Ubuntu tutorial post, we will help you make your Ubuntu terminal look like the Kali Linux terminal. Ubuntu is a popular Linux-based operating system that is often preferred by beginners, while Kali Linux is popular for security work.

Make Ubuntu Terminal Look Like Kali Linux

The first step is to install the zsh shell, its plugins, and some additional tools on your Ubuntu system.

Installing ZSH on Ubuntu 22.04 LTS

The Z shell (zsh) is an interactive shell that incorporates many features found in other shells such as bash, fish, dash, and ksh. Run the following commands to install zsh on Ubuntu-based operating systems.

sudo apt update
sudo apt install zsh
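Optionally, you can confirm the installation by printing the installed version:

zsh --version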

After the installation is complete, run the zsh command to switch from the bash prompt. You will see the z shell configuration wizard (zsh-newuser-install), where you need to select option 0 and hit Enter. This creates a stub ~/.zshrc so that the wizard does not run again.

Now, it’s time to install Z shell plugins.

sudo apt install zsh-syntax-highlighting zsh-autosuggestions

Run the following command to change your default login shell to z shell.

chsh -s /bin/zsh
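The change takes effect the next time you log in. To confirm which login shell is now set for your user, you can, for example, query the passwd database:

getent passwd "$USER" | cut -d: -f7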

Modifying the zshrc file

If you want to customize the shell, you need to tweak the zshrc file, which is located in your home directory (~/.zshrc). Since .zshrc is a hidden file, you need to enable showing hidden files to see it. The file may not exist yet; if it already exists, delete it and create a fresh one as shown below.

To delete the file run the following command:

rm ~/.zshrc

To create a new file simply run this command:

touch ~/.zshrc

After you have successfully created the .zshrc file, open it with your preferred text editor, copy and paste the following content into your .zshrc file, and save.

# ~/.zshrc file for zsh interactive shells.

# see /usr/share/doc/zsh/examples/zshrc for examples

setopt autocd # change directory just by typing its name

#setopt correct # auto correct mistakes

setopt interactivecomments # allow comments in interactive mode

setopt magicequalsubst # enable filename expansion for arguments of the form ‘anything=expression’

setopt nonomatch # hide error message if there is no match for the pattern

setopt notify # report the status of background jobs immediately

setopt numericglobsort # sort filenames numerically when it makes sense

setopt promptsubst # enable command substitution in prompt


WORDCHARS=${WORDCHARS//\/} # Don't consider certain characters part of the word


# hide EOL sign ('%')

PROMPT_EOL_MARK=""

# configure key keybindings

bindkey -e # emacs key bindings

bindkey ' ' magic-space # do history expansion on space

bindkey '^U' backward-kill-line # ctrl + U

bindkey '^[[3;5~' kill-word # ctrl + Supr

bindkey '^[[3~' delete-char # delete

bindkey '^[[1;5C' forward-word # ctrl + ->

bindkey '^[[1;5D' backward-word # ctrl + <-

bindkey '^[[5~' beginning-of-buffer-or-history # page up

bindkey '^[[6~' end-of-buffer-or-history # page down

bindkey '^[[H' beginning-of-line # home

bindkey '^[[F' end-of-line # end

bindkey '^[[Z' undo # shift + tab undo last action

# enable completion features

autoload -Uz compinit

compinit -d ~/.cache/zcompdump

zstyle ':completion:*:*:*:*:*' menu select

zstyle ':completion:*' auto-description 'specify: %d'

zstyle ':completion:*' completer _expand _complete

zstyle ':completion:*' format 'Completing %d'

zstyle ':completion:*' group-name ''

zstyle ':completion:*' list-colors ''

zstyle ':completion:*' list-prompt %SAt %p: Hit TAB for more, or the character to insert%s

zstyle ':completion:*' matcher-list 'm:{a-zA-Z}={A-Za-z}'

zstyle ':completion:*' rehash true

zstyle ':completion:*' select-prompt %SScrolling active: current selection at %p%s

zstyle ':completion:*' use-compctl false

zstyle ':completion:*' verbose true

zstyle ':completion:*:kill:*' command 'ps -u $USER -o pid,%cpu,tty,cputime,cmd'

# History configurations

HISTFILE=~/.zsh_history

HISTSIZE=1000

SAVEHIST=2000

setopt hist_expire_dups_first # delete duplicates first when HISTFILE size exceeds HISTSIZE

setopt hist_ignore_dups # ignore duplicated commands history list

setopt hist_ignore_space # ignore commands that start with space

setopt hist_verify # show command with history expansion to user before running it

#setopt share_history # share command history data

# force zsh to show the complete history

alias history="history 0"


# configure `time` format

TIMEFMT=$'\nreal\t%E\nuser\t%U\nsys\t%S\ncpu\t%P'

# make less more friendly for non-text input files, see lesspipe(1)

#[ -x /usr/bin/lesspipe ] && eval "$(SHELL=/bin/sh lesspipe)"

# set variable identifying the chroot you work in (used in the prompt below)

if [ -z "${debian_chroot:-}" ] && [ -r /etc/debian_chroot ]; then

debian_chroot=$(cat /etc/debian_chroot)

fi

# set a fancy prompt (non-color, unless we know we "want" color)

case "$TERM" in

xterm-color|*-256color) color_prompt=yes;;

esac
# uncomment for a colored prompt, if the terminal has the capability; turned

# off by default to not distract the user: the focus in a terminal window

# should be on the output of commands, not on the prompt

force_color_prompt=yes

if [ -n "$force_color_prompt" ]; then

if [ -x /usr/bin/tput ] && tput setaf 1 >&/dev/null; then

# We have color support; assume it's compliant with Ecma-48

# (ISO/IEC-6429). (Lack of such support is extremely rare, and such

# a case would tend to support setf rather than setaf.)

color_prompt=yes

else

color_prompt=

fi

fi

configure_prompt() {

prompt_symbol=㉿

# Skull emoji for root terminal

#[ "$EUID" -eq 0 ] && prompt_symbol=💀

case "$PROMPT_ALTERNATIVE" in

twoline)

PROMPT=$'%F{%(#.blue.green)}┌──${debian_chroot:+($debian_chroot)─}${VIRTUAL_ENV:+($(basename $VIRTUAL_ENV))─}(%B%F{%(#.red.blue)}%n'$prompt_symbol$'%m%b%F{%(#.blue.green)})-[%B%F{reset}%(6~.%-1~/…/%4~.%5~)%b%F{%(#.blue.green)}]\n└─%B%(#.%F{red}#.%F{blue}$)%b%F{reset} '

# Right-side prompt with exit codes and background processes

#RPROMPT=$'%(?.. %? %F{red}%B⨯%b%F{reset})%(1j. %j %F{yellow}%B⚙%b%F{reset}.)'

;;

oneline)

PROMPT=$'${debian_chroot:+($debian_chroot)}${VIRTUAL_ENV:+($(basename $VIRTUAL_ENV))}%B%F{%(#.red.blue)}%n@%m%b%F{reset}:%B%F{%(#.blue.green)}%~%b%F{reset}%(#.#.$) '

RPROMPT=

;;

backtrack)

PROMPT=$'${debian_chroot:+($debian_chroot)}${VIRTUAL_ENV:+($(basename $VIRTUAL_ENV))}%B%F{red}%n@%m%b%F{reset}:%B%F{blue}%~%b%F{reset}%(#.#.$) '

RPROMPT=

;;

esac

unset prompt_symbol

}

# The following block is surrounded by two delimiters.

# These delimiters must not be modified. Thanks.

# START KALI CONFIG VARIABLES

PROMPT_ALTERNATIVE=twoline

NEWLINE_BEFORE_PROMPT=yes

# STOP KALI CONFIG VARIABLES

if [ "$color_prompt" = yes ]; then

# override default virtualenv indicator in prompt

VIRTUAL_ENV_DISABLE_PROMPT=1

configure_prompt

# enable syntax-highlighting

if [ -f /usr/share/zsh-syntax-highlighting/zsh-syntax-highlighting.zsh ]; then

. /usr/share/zsh-syntax-highlighting/zsh-syntax-highlighting.zsh

ZSH_HIGHLIGHT_HIGHLIGHTERS=(main brackets pattern)

ZSH_HIGHLIGHT_STYLES[default]=none

ZSH_HIGHLIGHT_STYLES[unknown-token]=fg=white,underline

ZSH_HIGHLIGHT_STYLES[reserved-word]=fg=cyan,bold

ZSH_HIGHLIGHT_STYLES[suffix-alias]=fg=green,underline

ZSH_HIGHLIGHT_STYLES[global-alias]=fg=green,bold

ZSH_HIGHLIGHT_STYLES[precommand]=fg=green,underline

ZSH_HIGHLIGHT_STYLES[commandseparator]=fg=blue,bold

ZSH_HIGHLIGHT_STYLES[autodirectory]=fg=green,underline

ZSH_HIGHLIGHT_STYLES[path]=bold

ZSH_HIGHLIGHT_STYLES[path_pathseparator]=

ZSH_HIGHLIGHT_STYLES[path_prefix_pathseparator]=

ZSH_HIGHLIGHT_STYLES[globbing]=fg=blue,bold

ZSH_HIGHLIGHT_STYLES[history-expansion]=fg=blue,bold

ZSH_HIGHLIGHT_STYLES[command-substitution]=none

ZSH_HIGHLIGHT_STYLES[command-substitution-delimiter]=fg=magenta,bold

ZSH_HIGHLIGHT_STYLES[process-substitution]=none

ZSH_HIGHLIGHT_STYLES[process-substitution-delimiter]=fg=magenta,bold

ZSH_HIGHLIGHT_STYLES[single-hyphen-option]=fg=green

ZSH_HIGHLIGHT_STYLES[double-hyphen-option]=fg=green

ZSH_HIGHLIGHT_STYLES[back-quoted-argument]=none

ZSH_HIGHLIGHT_STYLES[back-quoted-argument-delimiter]=fg=blue,bold

ZSH_HIGHLIGHT_STYLES[single-quoted-argument]=fg=yellow

ZSH_HIGHLIGHT_STYLES[double-quoted-argument]=fg=yellow

ZSH_HIGHLIGHT_STYLES[dollar-quoted-argument]=fg=yellow

ZSH_HIGHLIGHT_STYLES[rc-quote]=fg=magenta

ZSH_HIGHLIGHT_STYLES[dollar-double-quoted-argument]=fg=magenta,bold

ZSH_HIGHLIGHT_STYLES[back-double-quoted-argument]=fg=magenta,bold

ZSH_HIGHLIGHT_STYLES[back-dollar-quoted-argument]=fg=magenta,bold

ZSH_HIGHLIGHT_STYLES[assign]=none

ZSH_HIGHLIGHT_STYLES[redirection]=fg=blue,bold

ZSH_HIGHLIGHT_STYLES[comment]=fg=black,bold

ZSH_HIGHLIGHT_STYLES[named-fd]=none

ZSH_HIGHLIGHT_STYLES[numeric-fd]=none

ZSH_HIGHLIGHT_STYLES[arg0]=fg=cyan

ZSH_HIGHLIGHT_STYLES[bracket-error]=fg=red,bold

ZSH_HIGHLIGHT_STYLES[bracket-level-1]=fg=blue,bold

ZSH_HIGHLIGHT_STYLES[bracket-level-2]=fg=green,bold

ZSH_HIGHLIGHT_STYLES[bracket-level-3]=fg=magenta,bold

ZSH_HIGHLIGHT_STYLES[bracket-level-4]=fg=yellow,bold

ZSH_HIGHLIGHT_STYLES[bracket-level-5]=fg=cyan,bold

ZSH_HIGHLIGHT_STYLES[cursor-matchingbracket]=standout

fi

else

PROMPT='${debian_chroot:+($debian_chroot)}%n@%m:%~%(#.#.$) '

fi

unset color_prompt force_color_prompt


toggle_oneline_prompt(){

if [ "$PROMPT_ALTERNATIVE" = oneline ]; then

PROMPT_ALTERNATIVE=twoline

else

PROMPT_ALTERNATIVE=oneline

fi

configure_prompt

zle reset-prompt

}

zle -N toggle_oneline_prompt

bindkey ^P toggle_oneline_prompt


# If this is an xterm set the title to user@host:dir

case "$TERM" in

xterm*|rxvt*|Eterm|aterm|kterm|gnome*|alacritty)

TERM_TITLE=$'\e]0;${debian_chroot:+($debian_chroot)}${VIRTUAL_ENV:+($(basename $VIRTUAL_ENV))}%n@%m: %~\a'

;;

*)

;;

esac

precmd() {

# Print the previously configured title

print -Pnr -- "$TERM_TITLE"

# Print a new line before the prompt, but only if it is not the first line

if [ "$NEWLINE_BEFORE_PROMPT" = yes ]; then

if [ -z "$_NEW_LINE_BEFORE_PROMPT" ]; then

_NEW_LINE_BEFORE_PROMPT=1

else

print ""

fi

fi

}


# enable color support of ls, less and man, and also add handy aliases

if [ -x /usr/bin/dircolors ]; then

test -r ~/.dircolors && eval "$(dircolors -b ~/.dircolors)" || eval "$(dircolors -b)"

export LS_COLORS="$LS_COLORS:ow=30;44:" # fix ls color for folders with 777 permissions


alias ls='ls --color=auto'

#alias dir='dir --color=auto'

#alias vdir='vdir --color=auto'

alias grep='grep --color=auto'

alias fgrep='fgrep --color=auto'

alias egrep='egrep --color=auto'

alias diff='diff --color=auto'

alias ip='ip --color=auto'

export LESS_TERMCAP_mb=$'\E[1;31m' # begin blink

export LESS_TERMCAP_md=$'\E[1;36m' # begin bold

export LESS_TERMCAP_me=$'\E[0m' # reset bold/blink

export LESS_TERMCAP_so=$'\E[01;33m' # begin reverse video

export LESS_TERMCAP_se=$'\E[0m' # reset reverse video

export LESS_TERMCAP_us=$'\E[1;32m' # begin underline

export LESS_TERMCAP_ue=$'\E[0m' # reset underline

# Take advantage of $LS_COLORS for completion as well

zstyle ':completion:*' list-colors "${(s.:.)LS_COLORS}"

zstyle ':completion:*:*:kill:*:processes' list-colors '=(#b) #([0-9]#)*=0=01;31'

fi

# some more ls aliases

alias ll='ls -l'

alias la='ls -A'

alias l='ls -CF'

# enable auto-suggestions based on the history

if [ -f /usr/share/zsh-autosuggestions/zsh-autosuggestions.zsh ]; then

. /usr/share/zsh-autosuggestions/zsh-autosuggestions.zsh

# change suggestion color

ZSH_AUTOSUGGEST_HIGHLIGHT_STYLE='fg=#999'

fi


# enable command-not-found if installed

if [ -f /etc/zsh_command_not_found ]; then

. /etc/zsh_command_not_found

fi

compinit

To make the changes take effect, run either of the following commands (both do the same thing):

$ source ~/.zshrc
$ . ~/.zshrc

Now, download the color schemes and themes with the following commands:

git clone https://github.com/linuxopsys/ubuntu-to-kali-terminal.git
cd ubuntu-to-kali-terminal

Now extract the compressed files:

tar -xvf color-schemes.tar
tar -xvf kali-dark-theme.tar

You will notice a new directory named “usr” in your current working directory.

Now you need to remove the qtermwidget5 directory located in the /usr/share directory and replace it with one from the extracted tar file.

sudo rm -rf /usr/share/qtermwidget5
sudo mv -f usr/share/qtermwidget5 /usr/share
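Note that this step assumes QTerminal (which ships the qtermwidget5 data) is already installed. If it isn't, it is available in the Ubuntu repositories:

sudo apt install qterminal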

Now it’s time to change the QTerminal settings:

  • Open your QTerminal preferences
  • Change the color scheme to Kali-Dark and press the Apply button
  • At the bottom of the terminal’s Appearance settings, adjust the “Application transparency” from 0% to 5%
  • Click Apply for the changes to take effect (a quick way to verify the saved settings follows this list)
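If you want to double-check what was saved, QTerminal keeps its settings in a plain-text file. The path and key names below are assumptions based on a typical installation, so verify them on your system:

grep -iE 'colorscheme|transparency' ~/.config/qterminal.org/qterminal.ini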

Now, it’s time to change the Ubuntu theme. Run the following command to move the Kali-Dark directory from the extracted files to the /usr/share/themes directory.

sudo mv -f usr/share/themes/Kali-Dark /usr/share/themes

Now you can tweak your settings with GNOME Tweaks by running the following command:

gnome-tweaks

If you don’t have the GNOME Tweak Tool installed on Ubuntu, go through the following article to set it up:

Install Gnome Tweak Tool On Ubuntu
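In short, on Ubuntu the tool can be installed from the default repositories:

sudo apt install gnome-tweaks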

Using the Make Utility and Makefiles in Linux [Guide]

This is a complete beginner’s guide to using the make command in Linux.

You’ll learn:

  • The purpose of the make command
  • Installation of the make command
  • Creating and using the makefile for a sample C project

What is the make utility?

The make utility is one of the handiest utilities for a programmer. Its primary purpose is to compile a medium-to-large software project. The make utility is so helpful and versatile that even the Linux kernel uses it!

To understand the usefulness of the make utility, one must first understand why it was needed in the first place.

As your software gets more extensive, you start relying more and more on external dependencies (i.e., libraries). Your code starts splitting into multiple files with God knows what is in each file. Compiling each file and linking them together sanely to produce necessary binaries becomes complicated.

“But I can create a Bash script for that!”

Why yes, you can! More power to you! But as your project grows, you must deal with incremental rebuilds. How will you handle it generically, such that the logic stays true even when your number of files increases?

This is all handled by the make utility. So let us not reinvent the wheel and see how to install and make good use of the make utility.

Installing the make utility

The make utility is already available in the first-party repositories of almost all Linux distributions.

To install make on Debian, Ubuntu, and their derivatives, use the apt package manager like so:

sudo apt install make

To install make on Fedora and RHEL-based Linux distributions, use the dnf package manager like so:

sudo dnf install make

To install make on Arch Linux and its derivatives, use the pacman package manager like so:

sudo pacman -Sy make

Now that the make utility is installed, you can proceed to understand it with examples.

Creating a basic makefile

The make utility compiles your code based on the instructions specified in the makefile in the top level directory of your project’s code repository.

Below is the directory structure of my project:

$ tree make-tutorial

make-tutorial
└── src
    ├── calculator.c
    ├── greeter.c
    ├── main.c
    └── userheader.h

1 directory, 4 files

Below are the contents of the main.c source file:

#include <stdio.h>

#include "userheader.h"

int main()
{
    greeter_func();

    printf("nAdding 5 and 10 together gives us '%d'.n", add(5, 10));
    printf("Subtracting 10 from 32 results in '%d'.n", sub(10, 32));
    printf("If 43 is  multiplied with 2, we get '%d'.n", mul(43, 2));
    printf("The result of dividing any even number like 78 with 2 is a whole number like '%f'.n", div(78, 2));

    return 0;
}

Next are the contents of the greeter.c source file:

#include <stdio.h>

#include "userheader.h"

void greeter_func()
{
    printf("Hello, user! I hope you are ready for today's basic Mathematics class!n");
}

Below are the contents of the calculator.c source file:

#include <stdio.h>

#include "userheader.h"

int add(int a, int b)
{
    return (a + b);
}

int sub(int a, int b)
{
    if (a > b)
        return (a - b);
    else if (a < b)
        return (b - a);
    else return 0;
}

int mul(int a, int b)
{
    return (a * b);
}

double div(int a, int b)
{

    if (a > b)
        return ((double)a / (double)b);
    else if (a < b)
        return ((double)b / (double)a);
    else
        return 0;
}

Finally, below are the contents of the userheader.h header file:

#ifndef USERHEADER_DOT_H
#define USERHEADER_DOT_H

void greeter_func();

int add(int a, int b);
int sub(int a, int b);
int mul(int a, int b);
double div(int a, int b);

#endif /* USERHEADER_DOT_H */

Basics of a makefile

Before we create a bare-bones makefile, let us take a look at the syntax of a makefile. The basic building block of a Makefile consists of one or many “rules” and “variables”.

Rules in a makefile

Let us first take a look at rules in a makefile. A makefile rule has the following syntax:

target : prerequisites
    recipe
    ...
  • A target is the name of a file that will be generated by make. These are usually object files that are later used for linking everything together.
  • A prerequisite is a file that is necessary for the target to be generated. This is where you usually specify your .c, .o and .h files.
  • Finally, a recipe is one or many steps needed to generate the target (a short example follows this list).
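For example, a minimal rule for a hypothetical hello.o object file (not part of the project below) looks like this; note that in a real makefile every recipe line must start with a tab character:

hello.o : hello.c
        gcc -c hello.c -o hello.o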

Macros/Variables in makefile

In C and C++, a basic language feature is variables. They allow us to store values that we might want to use in a lot of places. This helps us use the same variable name where needed. An added benefit is we only need to make one change if we need to change the value.

Similarly, a makefile can contain variables. They are sometimes referred to as macros. The syntax to declare a variable in a Makefile is as follows:

variable = value

A variable and the value(s) it holds are separated by an equals (=) sign. Multiple values are separated by spaces between each other.

In general, variables are used to store various items necessary for compilation. Let’s say that you want to enable run-time buffer overflow detection and enable full ASLR for the executable; this can be achieved by storing all the compiler flags in one variable, like CFLAGS.

Below is a demonstration doing this:

CFLAGS = -D_FORTIFY_SOURCE=2 -fpie -Wl,-pie

We created a variable called CFLAGS (compiler flags) and added all of our compiler flags here.

To use our variable, we can enclose it in parentheses beginning with a dollar sign, like so:

gcc $(CFLAGS) -c main.c

The above line in our makefile will add all of our specified compiler flags and compile the main.c file as we require.

Automatic variables

The make utility has a few automatic variables to help ease repetition even further. These variables are commonly used in a rule’s recipe.

Some of the automatic variables are as follows:

  • $@ : The name of the target of the rule. Usually used to specify the output filename.
  • $< : The name of the first prerequisite.
  • $? : The names of all prerequisites that are newer than the target, i.e., files that have been modified after the most recent compilation.
  • $^ : The names of all prerequisites, with spaces between them.

You can find the full list of the automatic variables on GNU Make’s official documentation.
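As a quick illustration (using a hypothetical foo.o target, not part of the project below), a rule can use these automatic variables like so:

foo.o : foo.c userheader.h
        $(CC) $(CFLAGS) -c $< -o $@

Here, $< expands to foo.c (the first prerequisite) and $@ expands to foo.o (the target).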

Implicit Variables

Like the automatic variables covered above, make also has some variables that have a set use. As I previously used the CFLAGS macro/variable to store compiler flags, there are other variables that have an assumed use.

This can be thought not of as “reserved keywords” but more like the “general consensus” of naming variables.

These conventional variables are as follows:

  • VPATH : The make utility’s equivalent of Bash’s PATH variable. Paths are separated by a colon (:). Empty by default.
  • AS : The assembler. The default is as.
  • CC : The program for compiling C files. The default is cc (which usually points to gcc).
  • CXX : The program for compiling C++ files. The default is the g++ compiler.
  • CPP : The program that runs the C pre-processor. The default is $(CC) -E.
  • LEX : The program that turns lexical grammars into source code. The default is lex. (You should change this to flex.)
  • LINT : The program that lints your source code. The default is lint.
  • RM : The command to remove a file. The default is rm -f. (Please pay strong attention to this!)
  • ASFLAGS : Flags for the assembler.
  • CFLAGS : Flags for the C compiler (cc).
  • CXXFLAGS : Flags for the C++ compiler (g++).
  • CPPFLAGS : Flags for the C pre-processor.
  • .PHONY : Strictly speaking a special target rather than a variable; it lists targets that do not correspond to a file name. An example is the “make clean” target, where clean is declared as .PHONY.

Comments in a makefile

Comments in a makefile are like those in a shell script. They start with the pound/hash symbol (#), and everything on that line after the symbol is treated as a comment by the make utility and ignored.

Below is an example demonstrating this:

CFLAGS = -D_FORTIFY_SOURCE=2 -fpie -Wl,-pie
# The '-D_FORTIFY_SOURCE=2' flag enables run-time buffer overflow detection
# The flags '-fpie -Wl,-pie' are for enabling complete address space layout randomization

Initial draft of a makefile

Now that I have described the basic syntax of a makefile’s elements and the layout of my simple project, let us write a very bare-bones makefile to compile our code and link everything together.

Let us start with setting up the CFLAGS, CC and the VPATH variables that are necessary for our compilation. (This is not the complete makefile. We will be building this progressively.)

CFLAGS = -Wall -Wextra
CC = gcc
VPATH = src

With that done, let us define our rules for building. I will create three rules, one for each .c file, plus a rule that links everything into an executable binary. Mine will be called make_tutorial, but yours can be whatever you want!

CFLAGS = -Wall -Wextra
CC = gcc
VPATH = src


make_tutorial : main.o calculator.o greeter.o
        $(CC) $(CFLAGS) $? -o $@

main.o : main.c
        $(CC) $(CFLAGS) -c $? -o $@

calculator.o : calculator.c
        $(CC) $(CFLAGS) -c $? -o $@

greeter.o : greeter.c
        $(CC) $(CFLAGS) -c $? -o $@

As you can see, I am compiling all the .c files into object files (.o) and linking them together at the end.

When we run the make command, it will start with the first rule (make_tutorial). This rule creates the final executable binary of the same name. It has three prerequisites: one object file for each .c file.

Each rule after the make_tutorial rule creates an object file from the source file of the same name. I can understand how complex this feels. So let us break down each of these automatic and implicit variables and understand what they mean.

  • $(CC): Calls the GNU C Compiler (gcc).
  • $(CFLAGS): An implicit variable to pass in our compiler flags like -Wall, etc.
  • $?: Names of all prerequisite files that are newer than the target. In the rule for main.o, $? will expand to main.c IF main.c has been modified after main.o had been generated.
  • $@: This is the target name. I am using this to omit typing the rule name twice. In rule for main.o, $@ expands to main.o.

Finally, the options -c and -o are gcc‘s options for compiling/assembling source files without linking and specifying an output file name respectively. You can check this by running the man 1 gcc command in your terminal.

Now let’s try and run this makefile and hope it works on first try!

$ make
gcc -Wall -Wextra -c src/main.c -o main.o
gcc -Wall -Wextra -c src/calculator.c -o calculator.o
gcc -Wall -Wextra -c src/greeter.c -o greeter.o
gcc -Wall -Wextra main.o calculator.o greeter.o -o make_tutorial

If you look closely, each step of compilation contains all the flags we specified in the CFLAGS implicit variable. We can also see that the source files were automatically sourced from the “src” directory. This occurred automatically because we specified “src” in the VPATH implicit variable.

Let’s try and run the make_tutorial binary and verify if everything works as intended.

$ ./make_tutorial
Hello, user! I hope you are ready for today's basic Mathematics class!

Adding 5 and 10 together gives us '15'.
Subtracting 10 from 32 results in '22'.
If 43 is  multiplied with 2, we get '86'.
The result of dividing any even number like 78 with 2 is a whole number like '39.000000'.


Improving the makefile

“What is there to improve?”
Run the ls command and see for yourself 😉

$ ls --group-directories-first -1
src
calculator.o
greeter.o
main.o
Makefile
make_tutorial

Do you see the build artifacts (object files)? Yeah, they can clutter things up. Let’s use a dedicated build directory to reduce this clutter.

Below is the modified makefile:

CFLAGS = -Wall -Wextra
CC = gcc
VPATH = src:build


make_tutorial : main.o calculator.o greeter.o
        $(CC) $(CFLAGS) $? -o $@

build/main.o : main.c
        mkdir build
        $(CC) $(CFLAGS) -c $? -o $@

build/calculator.o : calculator.c
        $(CC) $(CFLAGS) -c $? -o $@

build/greeter.o : greeter.c
        $(CC) $(CFLAGS) -c $? -o $@

Here, I have made one simple change: I added the build/ prefix to each target that generates an object file. This will put each object file inside the “build” directory. I also added “build” to the VPATH variable.

If you look closely, our first compilation target is make_tutorial, but its recipe is not the first to run. Since make_tutorial depends on the object files, the first recipe that actually runs is the one for main.o (or rather build/main.o). Therefore, I added the “mkdir build” command as a recipe step in that target.

If I were to not create the “build” directory, I would get the following error:

$ make
gcc -Wall -Wextra -c src/main.c -o build/main.o
Assembler messages:
Fatal error: can't create build/main.o: No such file or directory
make: *** [Makefile:12: build/main.o] Error 1
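A common way to avoid this ordering problem, not used in this article's makefile but worth knowing, is to create the directory with mkdir -p (which does not fail if the directory already exists) and declare it as an order-only prerequisite, roughly like this:

build/main.o : main.c | build
        $(CC) $(CFLAGS) -c $? -o $@

build :
        mkdir -p build

Everything after the pipe symbol (|) must exist before the recipe runs, but changes to it never force a rebuild of the object file.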

Now that we have modified our makefile, let us remove the current build artifacts along with the compiled binary and rerun the make utility.

$ rm -v *.o make_tutorial
removed 'calculator.o'
removed 'greeter.o'
removed 'main.o'
removed 'make_tutorial'

$ make
mkdir build
gcc -Wall -Wextra -c src/main.c -o build/main.o
gcc -Wall -Wextra -c src/calculator.c -o build/calculator.o
gcc -Wall -Wextra -c src/greeter.c -o build/greeter.o
gcc -Wall -Wextra build/main.o build/calculator.o build/greeter.o -o make_tutorial

This compiled perfectly! If you look closely, we had already specified the “build” directory in the VPATH variable, making it possible for the make utility to search for our object files inside the “build” directory.

Our source and header files were automatically found from the “src” directory and the build artifacts (object files) were kept inside and linked from the “build” directory, just as we intended.

Adding .PHONY targets

We can take this improvement one step further. Let’s add the “make clean” and “make run” targets.

Below is our final makefile:

CFLAGS = -Wall -Wextra
CC = gcc
VPATH = src:build


build/bin/make_tutorial : main.o calculator.o greeter.o
        mkdir build/bin
        $(CC) $(CFLAGS) $? -o $@

build/main.o : main.c
        mkdir build
        $(CC) $(CFLAGS) -c $? -o $@

build/calculator.o : calculator.c
        $(CC) $(CFLAGS) -c $? -o $@

build/greeter.o : greeter.c
        $(CC) $(CFLAGS) -c $? -o $@


.PHONY : clean
clean :
        rm -rvf build


.PHONY : run
run : build/bin/make_tutorial
        ./build/bin/make_tutorial

Everything about the build targets is the same, except for a change where I specify that I want the make_tutorial binary executable file placed inside the build/bin/ directory.

Then, I declare clean as a .PHONY target, to specify that clean is not a file that the make utility needs to worry about. It is… phony. Under the clean target, I specify what must be removed to “clean everything”.

I do the same for the run target. If you are a Rust developer you will like this pattern. Like the cargo run command, I use the make run command to run the compiled binary.

For us to run the make_tutorial binary, it must exist first, so I added the binary (build/bin/make_tutorial) as a prerequisite of the run target.

Let’s run make clean first and then run make run directly!

$ make clean
rm -rvf build
removed 'build/greeter.o'
removed 'build/main.o'
removed 'build/calculator.o'
removed 'build/bin/make_tutorial'
removed directory 'build/bin'
removed directory 'build'

$ make run
mkdir build
gcc -Wall -Wextra -c src/main.c -o build/main.o
gcc -Wall -Wextra -c src/calculator.c -o build/calculator.o
gcc -Wall -Wextra -c src/greeter.c -o build/greeter.o
mkdir build/bin
gcc -Wall -Wextra build/main.o build/calculator.o build/greeter.o -o build/bin/make_tutorial
./build/bin/make_tutorial
Hello, user! I hope you are ready for today's basic Mathematics class!

Adding 5 and 10 together gives us '15'.
Subtracting 10 from 32 results in '22'.
If 43 is  multiplied with 2, we get '86'.
The result of dividing any even number like 78 with 2 is a whole number like '39.000000'.

As you can see here, we did not run the make command to compile our project first. Upon running make run, compilation was taken care of automatically. Let’s understand how that happened.

Upon running the make run command, the make utility first looks at the run target. A prerequisite for the run target is our binary file that we compile. So our make_tutorial binary file gets compiled first.

The make_tutorial binary has its own prerequisites, the object files placed inside the build/ directory. Once those object files are compiled, the make_tutorial binary is linked; finally, the make utility returns to the run target and the binary ./build/bin/make_tutorial is executed.


Conclusion

This article covers the basics of a makefile, the file that the make utility depends on to simplify compilation of your software repository. We did this by starting from a basic makefile and building it up as our needs grew.

Comparing Different Business Software Solutions: How to Make the Right Choice

Making the decision to invest in business software can be a difficult one. With so many different options on the market, it can be hard to know which one is right for your company. In this article, we will compare some of the most popular software solutions and help you decide which one is best for you. We’ll discuss the benefits of each option and help you figure out which one is right for your needs. So, whether you’re looking for accounting software, CRM software, or something else entirely, read on to find out more!

ERP Software

ERP software is a type of business software that helps organizations manage their business processes. It includes modules for accounting, CRM, inventory management, and more. ERP software is often used by large businesses because it can be customized to fit the specific needs of the organization. However, this customization can also be one of the drawbacks of ERP software, as it can be expensive to implement and maintain. If you look at this full comparison report from TEC, you’ll notice that there are several features you need to look for in ERP software. This includes things like ease of use, flexibility, and scalability. You’ll also want to make sure that the software you choose integrates well with your existing systems.

CRM Software

CRM software is used to manage customer relationships. It helps businesses keep track of customer contact information, communication history, and sales data. CRM software can be used to automate marketing and sales tasks, including email marketing, lead generation, and follow-up activities. Picking the right CRM software for your business can be a daunting task. There are a lot of different options on the market, and it can be hard to know which one is right for you. First, you need to decide what features are most important to your business. Then, you can start narrowing down your options by comparing different software solutions.

Accounting Software

Accounting software is a vital tool for businesses of all sizes. It helps businesses keep track of their finances, make tax payments, and more. When choosing accounting software, there are a few things to consider. First, decide what features you need. Accounting software can do everything from tracking income and expenses to creating invoices and paying bills. Make sure the software you choose has the features you need to manage your finances effectively. Second, consider your budget. Accounting software can range in price from a few hundred dollars to several thousand dollars. Choose the software that fits your budget without compromising on features. Finally, make sure the software is compatible with your devices. Many accounting programs are available as desktop applications or online applications. Make sure the program you choose will work with the devices you have available.

Database Software

Database software is used to store, organize, and retrieve data. When choosing a database software solution for your business, it is important to consider the features that are important to you and your business needs. Some features to consider include the type of data you will be storing (e.g., financial data, customer data), the size of your database (i.e., how much data you need to store), and whether you need a relational or non-relational database. It is also important to consider the scalability of the solution and whether it can grow with your business. Another thing to keep in mind when choosing a database software solution is the level of support you need. Some solutions offer 24/7 support, while others only offer support during business hours. There are also some solutions that offer community-based support, which can be helpful if you are comfortable getting help from other users.

How To Choose The Right Business Software Solution

When it comes to choosing the right business software solution, there are a few things you need to keep in mind. First, decide what features are most important to your business. Make a list of the must-have features and the nice-to-have features. Then, start looking at different software solutions and compare them side-by-side. Pay attention to things like price, ease of use, and compatibility with your existing systems. Once you’ve found a few software solutions that meet your needs, take some time to try them out. Many software companies offer free trials of their products. This is a great way to test out the features and see how the software works before you commit to buying it. When you’re ready to make a purchase, be sure to read the reviews. Look for reviews from other businesses that are similar to yours. This will give you a good idea of how the software will work for you and whether it’s worth the price.

When it comes to choosing the right business software solution, there are a few things you need to keep in mind. By taking the time to consider your needs and compare different solutions, you can find the software that’s right for you. With the right software in place, you can streamline your business operations and improve your bottom line.

How to Make Your Apps Start in Specified Workspace in Ubuntu 22.04

In Ubuntu, Fedora, or any other Linux distribution with GNOME, it’s possible to specify in which desktop workspace an application window should start!

Most operating systems today offer multiple desktops to organize unrelated ongoing projects. In Ubuntu Linux, these are usually called “workspaces”. For your most commonly used applications, you can even make them start automatically in a specified workspace to improve workflow efficiency.

For Ubuntu 22.04:

1. Firstly, search for and install “Extension Manager” from Ubuntu Software.

Install Extension Manager in Ubuntu 22.04

2. Once installed, press the Super (Windows logo) key to open the ‘Activities’ overview. Then search for and launch the tool:

3. When it opens, navigate to the “Browse” tab, type ‘Auto Move Windows‘ in the search box and hit Enter. Finally, click Install to install the extension.

4. Once installed, navigate back to the ‘Installed’ tab in Extension Manager and click the gear button for the extension you just installed.

In the pop-up dialog, use the ‘+’ icon to add apps and set the workspace number for each app.

NOTE 1: The search function in the app selection dialog seems broken; you have to browse through all apps manually.

NOTE 2: The GNOME desktop has only 2 workspaces by default and adds more automatically when the last one is in use.

To set the workspace number to ‘3’, ‘4’, or higher, open “Settings (GNOME Control Center) -> Multitasking”, enable “Fixed number of workspaces” and set the number you want.
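If you prefer the command line, the same settings can be changed with gsettings. These are the standard GNOME schema and key names, but verify them on your system if the commands complain:

gsettings set org.gnome.mutter dynamic-workspaces false
gsettings set org.gnome.desktop.wm.preferences num-workspaces 4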

For older Ubuntu, Fedora, Arch Linux, etc.

Other GNOME-based Linux distributions can also install the “Auto Move Windows” extension directly by using the on/off switch on its page in a web browser.

(Ubuntu only) First, press Ctrl+Alt+T on the keyboard to open a terminal. When it opens, run the following command to install the browser connector agent:

sudo apt install chrome-gnome-shell

Next, open the extension’s page via the link below and turn on the toggle to install it:

If you don’t see the on/off switch on that page, install the browser extension first, then refresh the page.

Finally, install the “Gnome Extensions” app from Ubuntu Software / GNOME Software, and use it to open the extension’s configuration dialog.

Install & Use “Gnome Extensions” app to configure it

Tip: Ubuntu 20.04 users may use “Gnome Tweaks” (available in Ubuntu Software) to configure the number of workspaces.

Salt Security Platform Enhancements Make it Easier to Operationalise API Security

Salt Security, the leading API security company, has announced new enhancements to its next-generation Salt Security API Protection Platform, extending abilities in threat detection and pre-production API testing. The latest features include deeper and earlier insights into attacker behaviours and attack patterns, visual depictions of API call sequences, and support for attack simulation ahead of releasing APIs into production. With the new capabilities, Salt enhances its market-leading capabilities in runtime protection, providing organisations a more comprehensive view of API usage and the API attack surface so they can improve their business understanding and accelerate incident response time.

Building upon its existing threat detection and monitoring algorithms, the Salt platform provides organizations with quick, automatic, and continuous visibility into any risks or vulnerabilities within their API ecosystem. Customers can more easily spot and block API attacks before bad actors can reach their objective, and they can also more quickly identify unusual API usage patterns and remediate API vulnerabilities.

New features in the Salt Security API Protection Platform include:

  • Threat hunting capabilities within more detailed attacker timelines – Salt continues to be the only API security company that creates a consolidated attacker timeline. New platform capabilities support threat hunting and better illumination of the sequence of attacker steps, enabling organisations to conduct faster incident analysis and expedite remediation efforts. 
  • Visualization of API Call Sequences – Salt becomes the first API security vendor to offer a visual depiction of the various paths that API calls are following. This visualisation makes clear how users are interacting with APIs, revealing actions that should and should not be allowed, how users or services are entering digital systems, usage that shouldn’t be allowed, API design flaws, and other usage details.  
  • Contextual API security testing – Salt is making robust attack simulation capabilities available across runtime, pre-production, and development cycles. These simulations can help organisations identify business logic flaws early in the lifecycle, and integration with CI/CD systems means developers can address security gaps before releasing APIs.  

In the Salt Security State of API Security Report, Q1 2022, 86% of respondents admitted to lacking confidence in knowing which APIs expose sensitive data. Identifying and monitoring API vulnerabilities in real time is crucial for protecting companies’ vital assets so they can focus on business operations instead of risk.

“Bad actors work tirelessly to refine their tactics and techniques to make threats more difficult to detect. Successfully defending against modern, sophisticated API attacks requires solutions that can swiftly detect illegitimate activity and behavioural abnormalities in real-time,” said Elad Koren, Chief Product Officer, Salt Security. “Our latest platform capabilities deliver critical insights sooner and across the full API lifecycle. With increased context over time, combined with automated threat alerts, organizations can better defend themselves against attacks and fix API vulnerabilities before they can be exploited.”


Ways To Make A Small Business More Competitive

Are you looking to make your small business more competitive this year? It is not easy as a small business when you have to compete with the bigger, more established brands (as well as other smaller brands), and this can make it hard to survive. It will always be a struggle for a smaller business, but you should know that there are steps that you can take that will help you to improve and start competing with the bigger brands. Consumers often prefer smaller businesses when they find one that they can trust, so there is the potential for high levels of success as a small business.

Find A Gap In The Market

Finding a gap in the market is perhaps the most powerful way to become more competitive, as you will be able to lure customers away from the big brands to your business. It is not easy finding a gap in the market, but a lot has changed in recent times, so you might find that now is a good time to try. This will involve conducting fresh market research that will hopefully help you to unearth a new opportunity that will make your business stand out from the crowd and appeal to your target market.

Prioritize Customer Service

One of the reasons that consumers often prefer to use a smaller business is that customer service standards are often higher, and consumers feel more valued. Therefore, you need to lean into this and make your business one that is famous for its customer service. It should be quick and easy for people to get in touch with the business through various channels, and you should then have customer service staff who will resolve issues swiftly, make the customer feel heard, and always be willing to go the extra mile. This should help you to retain customers, attract new ones through word-of-mouth marketing and develop a positive reputation.

Start A Referral Program

Leading on from this, word-of-mouth marketing is a powerful tool as people will always trust recommendations from friends, family and colleagues. Therefore, it is helpful to start a referral program that will allow you to reach a much larger group and find people that you might have otherwise never been able to reach. Not only this, but this builds instant credibility that will benefit any small business. A referral program should help you to attract new customers to your business and achieve higher levels of success.

Embrace Social Media

Social media is perfect for smaller businesses because it is a level playing field, and it can be a great way to engage your target market and improve your reputation. You should be active across social media, sharing updates from the business, sharing original content (more on this below), and participating in discussions. Try to find ways to humanize your brand, which will help to create a stronger connection with your target market.

Create High-Quality Content

Every business is creating content as a way to increase its presence online, but you will find that few businesses are creating high-quality content that will appeal to their target market. Content should be viewed as a way to impress and engage your target market as opposed to simply a way to become more visible, so you need to think about the kind of content that would appeal to your target market and take your time to create quality content. This can improve your reputation, add value to the lives of your customers and help you to promote your business.

Earn A Masters In Business Analytics

To compete with the bigger brands, you need to be able to make the right decisions and develop strategies that will take your business forward. A masters in business analytics will show you how to make the most out of data so that you can spot patterns and trends, create forecasts and identify the best ways to fine-tune and improve your business for higher levels of success. A masters in business analytics will help you to thrive in a modern-day marketplace where data can help your business to improve in all areas.

Form A Strategic Alliance

You will often see small businesses teaming up as a way to take on bigger brands, and this can be an effective way to become more competitive. The key to a strategic alliance is identifying a business that has the same target market as you but is not a direct competitor. If you are a web design company, for example, then you could form a strategic alliance with a digital marketing company. This would involve you promoting one another’s businesses and sharing resources so that you can both achieve higher levels of success without getting in each other’s way.

Make Improvements Based On Feedback

The best businesses are the ones that seek feedback and then make improvements based on what they get back from their customers. This is smart because it helps you to improve and keep your customers happy, plus it shows that you value the input of your customers, and this should help you to develop a positive reputation. It is not always easy to identify how you can improve, so asking your customers for their opinions is helpful and should allow you to make positive changes to the business.

Look After Your Employees

It is hard to improve as a business and compete at a higher level if you constantly have to go through the recruitment process. You need stability in the workforce, which means that you need to look after your employees and keep them happy. Flexible working, feedback, a high-quality office space and career development are a few of the best ways to keep your team happy and engaged. You should also be using training as a way to improve your employees and take the business forward.

This post should give you hope and a few ideas for ways that you can improve your small business and become more competitive.

Make Directory Only if it Doesn’t Exist Already in Linux


To create a directory in Linux, you use the mkdir command. It stands for ‘make directory’ after all. It’s quite a simple command.

But, when you try to create a directory that already exists, you get an error like:

$ mkdir dir0
mkdir: cannot create directory ‘dir0’: File exists

This indicates that if a directory already exists, the mkdir command will not try to create or modify the already existing directory in any way.

But it also shows an error, which is not always what you want, especially if you are writing a bash script.

To prevent this error, use the -p flag along with mkdir command.

mkdir -p dirname

This way, even though the directory is not created (because it already exists), no error is shown and your scripts will run as usual.

💡
You’ll get the same error if a file or link exists with the same name as the directory you are trying to create. Why? Because everything is a file in Linux. A directory is a special file that acts like an index of all the files that are inside it.
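For example, creating a file and then trying to create a directory with the same name (report is just a made-up name here) produces the same error:

$ touch report
$ mkdir report
mkdir: cannot create directory ‘report’: File exists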

Let’s see things in a bit more detail here and discuss various ways of preventing this error.

Method 1: Use ‘-p’ option

If you look at the manpage of the mkdir command, it has the ‘-p’ flag.

The manpage describes the ‘-p’ flag as: “no error if existing, make parent directories as needed”.

When you use the ‘-p’ flag, the mkdir utility first checks whether a directory with the same name already exists.

If it does, mkdir will not modify the existing directory in any way, nor will it show an error message. (Note that if a regular file with that name exists instead of a directory, mkdir -p will still report an error.)

mkdir -p dirname

There is no output because there is no error, even though the directory already exists.

This is very handy when you are creating custom bash scripts and don’t want its execution to stop because of this error.

The -p flag can also be used to create a nested directory structure. This is handy if you want to create a hierarchy like dir1/dir2/dir3 where some or none of the directories in the path exist yet, as shown below.
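For example, a single command creates the whole hierarchy, whether or not any of the intermediate directories already exist:

mkdir -p dir1/dir2/dir3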

Method 2: Check if directory already exists in bash

If you are writing bash scripts, you can use an if condition to check whether the directory already exists. If it doesn’t, you create it.

Here’s a sample code:

if [ ! -d my_dir ]
then
    mkdir my_dir
fi
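The same check can also be written as a one-liner, which is handy inside scripts:

[ -d my_dir ] || mkdir my_dir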

Method 3: Send it to the void

A majority of UNIX tools have two output streams, stdout and stderr. Normally, both streams, stdout and stderr get printed to the terminal. But you can redirect either the normal output stream or the error stream to another file.

So, when the mkdir command throws an error to your terminal, you can redirect it to the void.

To redirect stdout, use the numerical stream descriptor ‘1’; for stderr, use the numerical stream descriptor ‘2’. The descriptor ‘0’ refers to stdin.

To actually redirect the output, use the appropriate stream descriptor along with the redirection operator ‘>’

$ mkdir dir0 2> /dev/null

This will send the output of stderr to the ‘/dev/null’ device which discards anything that gets written to it.

This is totally safe to do. As I mentioned earlier, if the directory already exists, then it will not be modified. Only an error message will be shown. All you are doing here is suppressing that error message.

Personally, I would go with the first method, i.e., using the mkdir -p command, to ensure that the directory is created only when nothing with the same name already exists at that location.