Complete Command Line Reference Guide for Developers
Mastering advanced shell commands can streamline your workflow when managing projects, inspecting files, and handling builds. This guide groups related commands by functionality, providing usage examples and explanations.
Table of Contents
- File Listing and Filtering
- Viewing File Contents
- File Creation and Basic Manipulation
- File Copying, Moving, and Deletion
- Text Replacement and Editing
- Searching Files and Content
- Advanced Text Processing
- Understanding Pipes and Command Chaining
- File Permissions and Ownership
- File Comparison and Differences
- File Compression and Archives
- File Linking
- NPM Package Management
- Git Version Control
- Process Management
- File Cleanup and Disk Usage
- Code Analysis and Documentation
- Monitoring and Debugging
- JSON and Data Processing
- Directory Visualization
- Build Commands
- Troubleshooting Common Issues
- Network and DNS Troubleshooting (Windows)
- System Management and Shutdown Commands
1. File Listing and Filtering
1.1 List all files with filtering
ls -la | grep pattern → List all files (including hidden) in long format and filter for lines containing the pattern.
- Useful for quickly locating configuration or content-related files in a directory.
- Note: grep will match the pattern anywhere in the line - in filenames, permissions, dates, etc.
Basic example:
ls -la src/ | grep config
- Explanation: Lists all files and directories in the src/ folder with detailed information (permissions, owner, size, date), then filters the output to show only lines that contain "config".
- Sample output:
-rwxrwxrwx 1 user user 2043 Oct 9 21:37 content.config.ts
Example showing both files and folders:
ls -la src/ | grep content
- Sample output:
drwxrwxrwx 1 user user 512 Oct 10 11:36 content
-rwxrwxrwx 1 user user 4089 Oct 10 11:39 content.config.ts
1.2 Understanding ls -la output format
To determine whether a matching line is a folder or a file, examine the very first character of each output line from ls -la:
- d at the start → Directory (folder)
  - Example: drwxrwxrwx 1 user user 512 Oct 10 11:36 content
  - The d indicates this is a directory named "content"
- - at the start → Regular file
  - Example: -rwxrwxrwx 1 user user 4089 Oct 10 11:39 content.config.ts
  - The - indicates this is a regular file named "content.config.ts"
- Other first characters:
  - l → Symbolic link
  - b → Block device
  - c → Character device
  - p → Named pipe
  - s → Socket
Output format breakdown, reading -rwxrwxrwx 1 user user 4089 Oct 10 11:39 content.config.ts from left to right:
- File type (first character)
- Permissions (the next nine characters: rwx for owner, group, and others)
- Number of links
- Owner
- Group owner
- File size (bytes)
- Modification date/time
- Filename
1.3 Filter for only directories (folders)
ls -la | grep '^d' → Show only directories.
- The ^d pattern means "lines that begin with 'd'"; ^ is a regex anchor meaning "start of line".
Example: Filter for directories containing "content"
ls -la src/ | grep '^d' | grep 'content'
Explanation:
- ls -la src/ - list all items in long format
- grep '^d' - filter for lines that begin with the character d (directories only)
- The output of the first grep is then piped (|) to the second grep
- grep 'content' - filters for lines that also contain the word "content"
Sample output:
drwxrwxrwx 1 user user 512 Oct 10 11:36 content
drwxrwxrwx 1 user user 256 Oct 12 09:15 content-backup
Alternative (more concise):
ls -la src/ | grep '^d.*content'
- This uses a single grep with a regex: ^d.*content means "start with 'd', followed by any characters, then 'content'".
1.4 Filter for only regular files
Method 1: Using grep '^-' → Show only regular files.
ls -la src/ | grep '^-' | grep 'content'
Explanation:
- ls -la src/ - list all items in long format
- grep '^-' - filter for lines beginning with '-' (regular files only)
- grep 'content' - further filter for lines containing "content"
Sample output:
-rwxrwxrwx 1 user user 4089 Oct 10 11:39 content.config.ts
-rwxrwxrwx 1 user user 2156 Oct 11 14:22 content.types.ts
Method 2: Using grep -v '^d' → Show everything EXCEPT directories.
ls -la src/ | grep -v '^d' | grep 'content'
Explanation:
- grep -v '^d' - the -v flag inverts the match, showing all lines that do NOT start with 'd'
- This shows regular files, symlinks, and other file types (but not directories)
Sample output:
-rwxrwxrwx 1 user user 4089 Oct 10 11:39 content.config.ts
-rwxrwxrwx 1 user user 2156 Oct 11 14:22 content.types.ts
lrwxrwxrwx 1 user user 15 Oct 12 10:00 content-link -> ../content
Comparison of methods:
- grep '^-' → only shows regular files (excludes symlinks, devices, etc.)
- grep -v '^d' → shows everything except directories (includes symlinks, devices, etc.)
1.5 Advanced filtering combinations
Find only executable files (the pattern checks the owner execute bit; a plain grep 'x' would also match any "x" in the filename):
ls -la | grep '^-..x'
Find directories modified today (date '+%b %e' matches ls's space-padded day of month, e.g. "Oct  9"; '+%b %d' would zero-pad and miss single-digit days):
ls -la | grep '^d' | grep "$(date '+%b %e')"
Find files larger than 1KB (with size column):
ls -lah | grep '^-' | awk '$5 ~ /K|M|G/ {print $0}'
Count number of directories:
ls -la | grep '^d' | wc -l
Count number of files:
ls -la | grep '^-' | wc -l
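The counting filters above compose into a tiny report. A minimal sketch, using a scratch directory created just for the demo (demo_dir is not from the original guide), and grep -c as shorthand for grep | wc -l:

```shell
# Create a scratch directory with one subdirectory and two files.
mkdir -p demo_dir/sub
touch demo_dir/a.txt demo_dir/b.txt

# grep -c counts matching lines, equivalent to grep ... | wc -l.
dirs=$(ls -la demo_dir | grep -c '^d')    # counts ., .., and sub
files=$(ls -la demo_dir | grep -c '^-')   # counts a.txt and b.txt
echo "directories: $dirs, files: $files"
```

Note that ls -la includes the . and .. entries, so the directory count here is 3, not 1.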
2. Viewing File Contents
Inspect file contents without opening them in an editor.
2.1 Display entire file
cat file → Display entire file contents.
- Example: cat package.json to view the entire file.
  - Explanation: Prints all contents of package.json to the terminal.
  - Sample output: (Entire file contents displayed)
2.2 View first lines of a file
head -n <number> file → Display the first n lines of a file.
- Default is 10 lines if number not specified.
- Ideal for previewing imports, schemas, or headers in code files.
- Example: head -5 content.config.ts to see the initial lines of a TypeScript config.
  - Explanation: Shows the first 5 lines of content.config.ts, helping to quickly check the file's starting content such as imports or definitions.
  - Sample output:
    import {defineCollection, z} from 'astro:content';
    const blog = defineCollection({
    type: 'content',
    schema: z.object({
Using pipes with head:
cat file | head -n <number> → Pipe file contents to head command.
- Example: cat src/content.config.ts | head -10 to view the first 10 lines.
  - Explanation: Reads the entire file with cat and pipes (|) the output to head, which then displays only the first 10 lines.
  - Sample output:
import {defineCollection, z} from 'astro:content';
const blog = defineCollection({
type: 'content',
schema: z.object({
title: z.string(),
description: z.string().optional().nullable(),
date: z.date(),
tags: z.array(z.string()).or(z.string()).optional().nullable(),
category: z.array(z.string()).or(z.string()).default('uncategorized').nullable(),
- Note: head file is more efficient than cat file | head because it avoids an extra process and stops reading once it has the lines it needs. However, cat file | head is useful when chaining multiple commands together.
2.3 View last lines of a file
tail -n <number> file → Display the last n lines of a file.
- Useful for checking recent log entries or end of files.
- Example: tail -20 error.log to see the most recent errors.
  - Explanation: Shows the last 20 lines of the log file.
  - Sample output: (Last 20 lines of error.log)
Using pipes with tail:
- Example: cat package.json | tail -5 to view the last 5 lines.
  - Explanation: Pipes the file contents to tail, which shows only the last 5 lines.
2.4 Follow file in real-time
tail -f file → Follow a file in real-time (watch for new additions).
- Essential for monitoring log files as they're being written.
- Press Ctrl+C to stop watching.
- Example: tail -f /var/log/nginx/access.log to monitor web server access.
  - Explanation: Continuously displays new lines as they're added to the log file.
  - Sample output: (New log entries appear in real-time)
Using pipes with grep for filtered monitoring:
- Example: tail -f server.log | grep ERROR to watch only error messages.
  - Explanation: Combines real-time file following with pattern matching to show only lines containing "ERROR".
2.5 View file with navigation
less file → View file with navigation (scrollable, searchable).
- Use arrow keys or Page Up/Down to navigate, / to search, q to quit.
- Better than cat for large files.
- Example: less large-log.txt to browse through a big file.
  - Explanation: Opens the file in a pager that allows scrolling and searching.
  - Sample output: (Interactive viewer opens)
2.6 View file page by page
more file → View file page by page (simpler than less).
- Press Space for the next page, q to quit.
- Example: more documentation.txt to read documentation.
  - Explanation: Displays the file one screen at a time.
  - Sample output: (File displayed page by page)
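head and tail also compose to extract an arbitrary line range, a quick alternative to the sed approach shown later. A short sketch using a scratch file (demo.txt is created just for the example):

```shell
# Build a six-line scratch file, then print lines 3-5 of it:
printf 'l1\nl2\nl3\nl4\nl5\nl6\n' > demo.txt
head -5 demo.txt | tail -3    # take the first 5 lines, then the last 3 of those
```

The head stage cuts the file down to lines 1-5, and tail keeps the final three of that range, i.e. lines 3-5.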
3. File Creation and Basic Manipulation
3.1 Create or overwrite file from stdin
cat > file → Create or overwrite a file with input from stdin (end with Ctrl+D).
- Creates a new file or completely replaces existing file content.
- Example: cat > newfile.txt, then type content and press Ctrl+D.
  - Explanation: Redirects standard input to newfile.txt, allowing you to type multi-line text until Ctrl+D is pressed.
  - Sample workflow:
    $ cat > newfile.txt
    This is my first line.
    This is my second line.
    [Press Ctrl+D to save]
  - Sample output: (Creates newfile.txt with the entered content)
3.2 Append to file from stdin
cat >> file → Append content to an existing file from stdin.
- Adds new content to the end without overwriting existing content.
- Example: cat >> existing.txt, then type content and press Ctrl+D.
  - Explanation: Appends standard input to existing.txt without replacing existing content.
  - Sample workflow:
    $ cat >> existing.txt
    This is my first line of sample text.
    This is the second line.
    [Press Ctrl+D to save]
  - Sample output: (Appends the new lines to existing.txt)
  - Note: If existing.txt does not exist, it will be created first.
3.3 Create or overwrite file with single line
echo "text" > file → Create or overwrite a file with a single line of text.
- Uses the output redirection operator (>) to write text to a file.
- Example: echo "Hello World" > hello.txt
  - Explanation: Writes "Hello World" to hello.txt, creating or overwriting the file.
  - Sample output: (Creates hello.txt containing "Hello World")
- Example: echo "This is the text to write." > filename.txt
  - Explanation: Uses echo to output the string and redirects that output to filename.txt, creating or overwriting the file with the specified text.
3.4 Append single line to file
echo "text" >> file → Append a line of text to a file.
- Uses the append redirection operator (>>).
- Example: echo "New entry" >> log.txt
  - Explanation: Appends "New entry" as a new line to log.txt.
  - Sample output: (Adds "New entry" to log.txt)
- Example: echo "This text will be appended." >> filename.txt
  - Explanation: Appends the specified text to the end of filename.txt without overwriting existing content.
3.5 Append formatted text
printf "text\n" >> file → Append formatted text to a file (more control than echo).
- Example: printf "Name: %s\nAge: %d\n" "John" 25 >> user.txt
  - Explanation: Fills the placeholders (%s for a string, %d for an integer) and appends the result to user.txt.
  - Sample output: (Appends the lines "Name: John" and "Age: 25" to user.txt)
3.6 Create empty file
touch file → Create an empty file or update its timestamp.
- Example: touch empty.log
  - Explanation: Creates empty.log if it doesn't exist, or updates its modification time.
  - Sample output: (Creates empty.log; no output)
3.7 Empty existing file
truncate -s 0 file → Empty an existing file (set size to 0).
- Example: truncate -s 0 debug.log
  - Explanation: Sets the size of debug.log to 0, effectively clearing its contents.
  - Sample output: (Clears debug.log; no output)
4. File Copying, Moving, and Deletion
4.1 Copy files
cp source destination → Copy a file or directory.
- Use the -r flag for recursive copying of directories.
- Use -p to preserve file attributes (permissions, timestamps).
- Example: cp config.json config.backup.json to create a backup.
  - Explanation: Creates a copy of config.json named config.backup.json.
  - Sample output: (File is copied; no output unless an error)
- Example: cp -r src/ backup/ to copy an entire directory.
  - Explanation: Recursively copies the src/ directory and all its contents to backup/.
  - Sample output: (Directory copied; no output unless an error)
4.2 Copy without overwriting
cp -n source destination → Copy only if destination doesn't exist (no-clobber).
- Example: cp -n template.html index.html
  - Explanation: Copies template.html to index.html only if index.html doesn't already exist.
  - Sample output: (File copied if the destination doesn't exist; no output)
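A quick sketch of the no-clobber behavior; both filenames are scratch files created for the demo:

```shell
printf 'original\n' > index.html
printf 'template\n' > template.html
# The copy is skipped because index.html already exists. Recent GNU cp
# versions report the skip with a nonzero exit status, hence the || true.
cp -n template.html index.html || true
cat index.html   # still contains "original"
</```

```shell
```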
4.3 Move or rename files
mv source destination → Move files to a directory or rename a file.
- Example: mv old-component.js new-component.js to rename.
  - Explanation: Renames old-component.js to new-component.js in the same directory.
  - Sample output: (File renamed; no output unless an error)
- Example: mv alipay.svg paypal.svg wechat.svg images/ to move multiple files.
  - Explanation: Moves the listed image files into the images/ directory.
  - Sample output: (Files moved; no output unless an error)
4.4 Delete files
rm file → Remove/delete a file.
- Use -f to force deletion without prompts.
- Use -r or -R for recursive directory deletion.
- Example: rm old-file.txt to delete a single file.
  - Explanation: Permanently deletes old-file.txt from the filesystem.
  - Sample output: (File deleted; no output unless an error)
- Example: rm -rf temp/ to force delete a directory and all its contents.
  - Explanation: Recursively deletes the temp/ directory and everything inside it without confirmation prompts.
  - ⚠️ Warning: Use with extreme caution - this cannot be undone!
  - Sample output: (Directory deleted; no output)
4.5 Delete with confirmation
rm -i file → Interactive deletion with confirmation prompt.
- Example: rm -i important.txt
  - Explanation: Prompts "rm: remove regular file 'important.txt'?" before deletion.
  - Sample output:
rm: remove regular file 'important.txt'? y
5. Text Replacement and Editing
Perform in-place string substitutions.
5.1 Basic find and replace
sed -i 's/old/new/g' file → Replace all occurrences of 'old' with 'new' in the file.
- Note: On macOS, use sed -i '' 's/old/new/g' file (with an empty string after -i).
- The g flag means "global" (all occurrences on each line); without it, only the first occurrence per line is replaced.
- Use different delimiters (like | or #) when the pattern contains slashes.
Understanding sed syntax:
s/old/new/g breakdown:
- s = substitute command
- /old/ = pattern to find (NO SPACE needed before 'old')
- /new/ = replacement text (NO SPACE needed before 'new')
- /g = global flag (replace all occurrences)
- Spaces: spaces are NOT needed between /old and /new; the / acts as the delimiter.
- Example with spaces: sed -i 's/old text/new text/g' file - here the spaces ARE part of the search/replace strings themselves.
Basic examples:
- Example 1: sed -i 's|/spinner\.gif|/images/spinner.gif|g' blog/display-pictures.md to update image paths.
  - Explanation: Replaces every instance of /spinner.gif with /images/spinner.gif. Note the | delimiter used instead of / to avoid escaping the slashes in the path.
  - Sample output: (File is modified in-place; no output unless an error)
- Example 2: sed -i 's/const /let /g' src/app.js to replace const with let.
  - Explanation: Changes all const declarations to let throughout the file. Note the space after const and let is part of the pattern.
  - Sample output: (File modified in-place)
- Example 3: sed -i 's/http:/https:/g' config.json to update the protocol.
  - Explanation: Replaces all http: with https: in the configuration file.
  - Note: No backslash is needed before : because it is not a special character in sed.
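The const→let substitution can be verified end to end. A minimal sketch, assuming GNU sed for -i (app.js here is a scratch file, not a real project file):

```shell
printf 'const a = 1;\nconst b = 2;\n' > app.js
sed -i 's/const /let /g' app.js   # on macOS: sed -i '' 's/const /let /g' app.js
cat app.js                        # both declarations now start with "let "
```

Because the trailing space is part of the pattern, identifiers such as constant_value would be left untouched.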
5.2 Delete lines matching pattern
sed -i '/pattern/d' file → Delete lines matching a pattern.
- Example: sed -i '/console\.log/d' src/app.js to remove all console.log statements.
  - Explanation: Deletes every line containing console.log from the file. The . is escaped as \. because . is a special regex character.
  - Sample output: (Lines removed; no output)
5.3 Print specific line range
sed -n 'start,end p' file → Print specific line range from a file.
- Example: sed -n '10,20p' large-file.txt to view lines 10-20.
  - Explanation: Displays only lines 10 through 20 from the file (useful for large files).
  - Sample output: (Lines 10-20 displayed)
Using with pipes:
- Example: cat large-file.txt | sed -n '10,20p' - same result using a pipe.
5.4 Case-insensitive replacement
sed -i 's/old/new/gI' file → Case-insensitive replacement.
- The I flag (a GNU sed extension) makes the pattern matching case-insensitive.
- Example: sed -i 's/todo/FIXME/gI' notes.txt to replace TODO, todo, ToDo, etc.
  - Explanation: Replaces "todo" in any case combination with "FIXME".
  - Sample output: (File modified; no output)
5.5 Perl-based replacement
perl -pi -e 's/old/new/g' file → Alternative to sed with better regex support.
- Example: perl -pi -e 's/\bcolor\b/colour/g' *.txt to replace "color" with "colour" (word boundaries).
  - Explanation: Uses word boundaries (\b) to match only the whole word "color", not "colorful".
  - Sample output: (Files modified; no output)
5.6 Viewing files with special characters (cat -A)
cat -A file → Display file contents with special characters visible (tabs, spaces, line endings).
- This flag shows hidden characters, which is useful for debugging file formatting issues.
- Flags breakdown:
  - -A (equivalent to -vET) → shows all non-printing characters:
    - $ = end of line (newline character)
    - ^I = tab character
    - ^M = carriage return (often from Windows line endings)
    - Spaces are shown as spaces (no special symbol)
  - -n → print line numbers before each line
  - -b → number only non-blank lines
  - -s → suppress repeated blank lines (shows only one blank line)
  - -E → display $ at the end of each line only
  - -T → display tab characters as ^I
  - -v → display other non-printing characters using ^ and M- notation
Examples:
- Example 1: cat -A src/app/components/todo.component.ts | sed -n '141,144p' to view lines 141-144 with special characters.
  - Explanation: Displays lines 141-144 from the TypeScript file, showing tabs as ^I and line endings as $. Useful for detecting formatting inconsistencies.
  - Sample output:
    ^Iconstructor(private todoService: TodoService) {$
    ^I^Ithis.todoService.getTodos().subscribe(todos => {$
    ^I^I^Ithis.todos = todos;$
    ^I^I});$
  - Each ^I represents a tab character used for indentation, and each $ shows the line ending. This reveals that the file uses tabs for indentation.
- Example 2: cat -n src/app/app.module.ts to display the file with line numbers.
  - Explanation: Shows the entire file with line numbers on the left, useful for referencing specific lines during debugging or code review.
- Sample output:
1 import { NgModule } from '@angular/core';
2 import { BrowserModule } from '@angular/platform-browser';
3
4 @NgModule({
5 declarations: [AppComponent],
6 imports: [BrowserModule],
7 })
8 export class AppModule { }
- Example 3: cat -A config.json to check for Windows line endings in a JSON file.
  - Explanation: If you see ^M$ instead of just $, the file has Windows line endings (\r\n). This can cause issues on Unix-like systems.
  - Sample output with Windows line endings:
{^M$
^I"name": "my-app",^M$
^I"version": "1.0.0"^M$
}^M$ - Sample output with Unix line endings (correct):
{$
^I"name": "my-app",$
^I"version": "1.0.0"$
}$
- Example 4: cat -A src/styles.css | grep '\^I' to find tab characters in a CSS file.
  - Explanation: cat -A renders each tab as the two-character sequence ^I, so grepping for the literal text ^I (with the caret escaped) finds lines indented with tabs. Useful for enforcing consistent indentation (spaces vs tabs).
  - Sample output: (Lines containing tab characters, shown as ^I)
- Example 5: cat -ns large-file.txt to display the file with line numbers and squeezed blank lines.
  - Explanation: Numbers every line (-n) and suppresses runs of consecutive blank lines (-s); use -bs instead to number only non-blank lines. Useful for large files with many blank lines.
  - Sample output: (More compact display of large files)
Fixing line ending issues:
- Convert Windows to Unix line endings: sed -i 's/\r$//' file.txt (removes the carriage return)
- Verify the fix: cat -A file.txt | tail -5 (check the last 5 lines for proper line endings)
6. Searching Files and Content
Locate files by name and search for specific strings within them.
6.1 Find files by name
find . -name "pattern" → Find files by name pattern.
- . represents the current directory; it tells find to start searching from the current working directory and include all subdirectories.
  - You can replace . with any path, like /home/user/project or ~/Documents, to search from a different location.
  - Using .. would search from the parent directory.
  - Using / would search from the root directory (the entire filesystem).
- Use -type f for files only, -type d for directories only.
- Use -iname for a case-insensitive search.
- Example: find . -name "*.json" to find all JSON files.
  - Explanation: Recursively searches for files ending in .json from the current directory (. = current directory).
  - Sample output:
    ./package.json
    ./config/database.json
    ./src/settings.json
- Example: find /var/log -name "*.log" -mtime -7 to find logs modified in the last 7 days.
  - Explanation: Finds log files modified within the past week.
  - Sample output: (List of recently modified log files)
find . -type f | sort → Find all files in current directory and subdirectories, then sort them alphabetically.
- Explanation: Searches for all files (not directories) recursively from the current location and pipes the output to sort for alphabetical ordering.
- Sample output:
./README.md
./package.json
./src/App.js
./src/components/Header.js
./src/index.js
./src/utils/helpers.js
find . -type f -name "*.md" -o -name "*.js" -o -name "*.jsx" | sort → Find all Markdown, JavaScript, and JSX files, then sort them.
- Explanation: Searches for files with the extensions .md, .js, or .jsx (using -o for an OR condition) and sorts the results alphabetically. Useful for listing specific file types in a project.
- Sample output:
./README.md
./docs/guide.md
./src/App.js
./src/components/Button.jsx
./src/components/Header.js
./src/index.js
6.2 Find files containing pattern
find . -name "*.ext" | xargs grep -l "pattern" → Find files by extension and filter those containing a pattern.
- xargs takes the list of files from find and passes them as arguments to grep. This is efficient for handling large numbers of files.
- Example: find . \( -name "*.astro" -o -name "*.js" -o -name "*.ts" \) | xargs grep -l "import React" to find files importing React.
  - Sample output:
    src/pages/index.astro
    src/components/Header.js
6.3 Recursive text search
grep -r "pattern" directory → Recursively search for a pattern in files within a directory.
- Use -i for case-insensitive matching, -n for line numbers, -v for an inverse match (lines NOT containing the pattern).
- Use -A n to show n lines after a match, -B n for before, -C n for context (before and after).
- Example: grep -r "load-mathjax.js" src/ to find script references.
  - Sample output:
    src/pages/blog.astro: <script src="/load-mathjax.js"></script>
- Example: grep -rn "TODO" src/ --exclude-dir=node_modules to find TODO comments with line numbers.
  - Sample output:
    src/app.js:42:// TODO: Refactor this function
    src/utils.js:18:// TODO: Add error handling
6.3a Search for multiple patterns with OR operator
grep -rn "pattern1\|pattern2\|pattern3" directory --include="*.ext" → Search recursively for multiple patterns using OR operator.
- The \| (escaped pipe) acts as an OR operator in grep patterns.
- Use --include to filter by file extension.
- Example: grep -rn "ERROR_CODE_404\|ERROR_CODE_500\|ERROR_CODE_503" /var/log/application/handlers/ --include="*.log" to find specific error codes in log files.
  - Explanation:
    - grep -rn - recursive search with line numbers
    - "ERROR_CODE_404\|ERROR_CODE_500\|ERROR_CODE_503" - matches any of these three error codes
    - /var/log/application/handlers/ - the directory to search
    - --include="*.log" - only searches files ending in .log
  - Sample output:
    /var/log/application/handlers/error_handler.log:142:ERROR_CODE_404: Resource not found at /api/users/123
    /var/log/application/handlers/error_handler.log:287:ERROR_CODE_500: Internal server error in database connection
    /var/log/application/handlers/api_handler.log:56:ERROR_CODE_503: Service temporarily unavailable
    /var/log/application/handlers/request_handler.log:991:ERROR_CODE_404: Endpoint /api/products/xyz not found
- Example: grep -rn "useState\|useEffect\|useContext" src/components/ --include="*.jsx" to find React hooks usage.
  - Explanation: Searches for any of the three common React hooks in JSX component files.
  - Sample output:
    src/components/Header.jsx:5:import { useState, useEffect } from 'react';
    src/components/Auth.jsx:12: const [user, setUser] = useState(null);
    src/components/Theme.jsx:8: const theme = useContext(ThemeContext);
- Example: grep -rn "api_key\|secret\|password" config/ --include="*.py" to find potential security issues in configuration files.
  - Explanation: Searches for sensitive credential keywords that should not be hardcoded.
  - Use case: A security audit to ensure no credentials are committed to the repository.
Alternative syntax (Extended regex):
- grep -Ern "pattern1|pattern2|pattern3" directory --include="*.ext" → same as above, but the -E flag enables extended regex (no need to escape |).
- Example: grep -Ern "ERROR_CODE_404|ERROR_CODE_500|ERROR_CODE_503" /var/log/application/ --include="*.log"
  - Explanation: -E enables extended regular expressions, so you can use | directly without a backslash.
  - Same results as the basic syntax, but a cleaner pattern definition.
6.4 List files containing pattern
grep -l "pattern" files → List filenames that contain the pattern (not the matching lines).
- Example: grep -l "import React" src/**/*.js to find files importing React.
  - Explanation: Lists only the filenames that contain "import React". (The ** glob requires globstar support, e.g. shopt -s globstar in bash.)
  - Sample output:
    src/App.js
    src/components/Header.js
6.5 Count matching lines
grep -c "pattern" file → Count how many lines match the pattern.
- Example: grep -c "error" server.log to count error occurrences.
  - Explanation: Returns the number of lines containing "error".
  - Sample output:
    23
6.6 Count files containing pattern
grep -l "pattern" files | wc -l → Count how many files contain the pattern.
- Combines grep -l (list matching files) with wc -l (count lines) to get a total count.
- Example: grep -l "created_date" src/models/model_builder_*.py | wc -l to count model files containing the "created_date" field.
  - Explanation:
    - grep -l "created_date" - lists files containing "created_date"
    - src/models/model_builder_*.py - matches all Python files starting with "model_builder_"
    - | wc -l - counts how many filenames were returned
  - Sample output:
    12
  - Meaning: 12 of the model_builder files contain the "created_date" field
- Example: grep -l "useState" src/components/*.jsx | wc -l to count React components using hooks.
  - Explanation: Counts how many JSX component files use the useState hook.
  - Sample output: 8 (8 components use useState)
- Example: grep -l "TODO" src/**/*.ts | wc -l to count TypeScript files with TODO comments.
  - Explanation: Useful for tracking technical debt across your TypeScript codebase.
  - Sample output: 15 (15 TypeScript files contain TODO comments)
7. Advanced Text Processing
7.1 Extract columns with awk
awk '{print $column}' file → Extract and print specific columns from text.
- Powerful for processing structured text data.
- Example: ls -l | awk '{print $9}' to list only filenames.
  - Explanation: Takes the output of ls -l and prints only the 9th column (the filename).
  - Sample output:
    config.json
    package.json
    README.md
- Example: awk -F',' '{print $1,$3}' data.csv to extract columns 1 and 3 from a CSV.
  - Explanation: Uses a comma as the field separator and prints the first and third columns.
  - Sample output:
    John 25
    Jane 30
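The CSV example, run against a scratch data.csv whose contents are invented for the demo:

```shell
# Three comma-separated fields per row; print the 1st and 3rd.
printf 'John,NY,25\nJane,LA,30\n' > data.csv
awk -F',' '{print $1,$3}' data.csv
# John 25
# Jane 30
```

The comma in the print statement's argument list inserts awk's output field separator (a space by default) between the two columns.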
7.2 Extract fields with cut
cut -d'delimiter' -f field file → Extract specific fields from delimited text.
- Example: cut -d':' -f1 /etc/passwd to extract usernames.
  - Explanation: Splits each line on the : delimiter and extracts the first field.
  - Sample output:
    root
    daemon
    user
- Example: echo "name,age,city" | cut -d',' -f2 to get the second field.
  - Explanation: Extracts "age" from the comma-separated string.
  - Sample output:
    age
7.3 Translate or delete characters
tr 'set1' 'set2' → Translate or delete characters from stdin.
- Example: echo "hello" | tr 'a-z' 'A-Z' to convert to uppercase.
  - Explanation: Translates all lowercase letters to uppercase.
  - Sample output:
    HELLO
- Example: cat file.txt | tr -d '\r' to remove carriage returns (Windows line endings).
  - Explanation: Deletes all \r characters, useful when converting Windows files to Unix format.
- Example: echo "hello   world" | tr -s ' ' to squeeze repeated spaces.
  - Explanation: Replaces multiple consecutive spaces with a single space.
  - Sample output:
    hello world
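The three tr modes side by side - translate, delete, squeeze (all input strings are invented for the demo):

```shell
echo 'MiXeD Case' | tr 'A-Z' 'a-z'        # translate → mixed case
printf 'a\r\n' | tr -d '\r' | cat -A      # delete the \r → a$
echo 'too   many   spaces' | tr -s ' '    # squeeze → too many spaces
```

tr always reads stdin; it takes no filename argument, which is why it is almost always fed through a pipe or a < redirect.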
7.4 Sort lines
sort file → Sort lines of text alphabetically.
- Use -n for a numerical sort, -r for reverse order, -u for unique lines only.
- Example: sort names.txt to sort a list alphabetically.
  - Sample output:
    Alice
    Bob
    Charlie
- Example: ls -l | sort -k5 -n to sort files by size.
  - Explanation: Sorts ls -l output numerically by the 5th column (file size).
7.5 Remove duplicate lines
uniq file → Remove duplicate adjacent lines.
- Usually used after sort to get truly unique lines.
- Use -c to count occurrences, -d to show only duplicates.
- Example: sort data.txt | uniq to get unique sorted lines.
  - Sample output:
    apple
    banana
    orange
- Example: sort access.log | uniq -c | sort -rn to count and rank lines by frequency.
  - Explanation: Counts occurrences of each unique line and sorts by count (highest first).
  - Sample output:
    15 GET /api/users
    8 GET /api/posts
    3 POST /api/login
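The frequency-ranking pipeline, run on a scratch access.log invented for the demo:

```shell
printf 'GET /a\nPOST /b\nGET /a\nGET /a\n' > access.log
# sort groups duplicates together, uniq -c counts each group,
# sort -rn ranks the counts highest-first:
sort access.log | uniq -c | sort -rn
#   3 GET /a
#   1 POST /b
```

The sort stage is essential: uniq only collapses adjacent duplicate lines, so unsorted input would undercount.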
7.6 Count lines, words, characters
wc file → Count lines, words, and characters in a file.
- Use -l for lines only, -w for words, -c for bytes, -m for characters.
- Example: wc -l app.js to count lines of code.
  - Sample output:
    245 app.js
- Example: wc -l src/app/components/viewer/viewer.component.ts to count lines in a specific component file.
  - Explanation: Displays the line count for the TypeScript component file at the specified path. Useful for tracking component size or complexity.
  - Sample output:
    387 src/app/components/viewer/viewer.component.ts
  - Use case: Check file size before refactoring, track code growth, or verify component complexity.
- Example: find . -name "*.js" | xargs wc -l to count total lines in all JS files.
  - Sample output:
    245 ./app.js
    180 ./utils.js
    425 total
8. Understanding Pipes and Command Chaining
8.1 What is a Pipe?
The Pipe Operator | → Sends the output of one command as input to another command.
- Allows you to chain multiple commands together
- Each command processes the output from the previous command
- Fundamental to Unix/Linux command-line philosophy: "do one thing well"
Basic syntax: command1 | command2 | command3
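To make chaining concrete, here is a self-contained pipeline on inline sample data (the fruit names are placeholders). Each stage reads the previous stage's stdout:

```shell
# Print sample lines, sort them, collapse duplicates with a count,
# then rank by that count (highest first).
printf 'banana\napple\napple\ncherry\n' \
  | sort \
  | uniq -c \
  | sort -rn
```

The top line of the output is `2 apple` (with leading spaces from `uniq -c`), because "apple" appears twice in the input.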
8.2 Common Pipe Usage Patterns
Pattern 1: View first/last lines of output
`cat file | head -n` → Display the first n lines of a file.
- Example: `cat src/content.config.ts | head -10`
  - Explanation:
    - `cat src/content.config.ts` reads the entire file
    - `|` pipes the output to the next command
    - `head -10` shows only the first 10 lines of that output
  - Sample output:
    import {defineCollection, z} from 'astro:content';
    const blog = defineCollection({
    type: 'content',
    schema: z.object({
    title: z.string(),
    description: z.string().optional().nullable(),
    date: z.date(),
    tags: z.array(z.string()).or(z.string()).optional().nullable(),
- Direct vs Piped:
  - `head -10 file` - direct (more efficient, reads only the needed lines)
  - `cat file | head -10` - piped (reads the entire file, useful for chaining)
`command | tail -n` → Display the last n lines of output.
- Example: `cat package.json | tail -5`
  - Shows the last 5 lines of package.json
Pattern 2: Filter output
`command | grep pattern` → Filter output to show only matching lines.
- Example: `ls -la | grep config`
  - Explanation: Lists all files, then filters to show only lines containing "config"
  - Sample output:
    -rwxrwxrwx 1 user user 2043 Oct 9 21:37 content.config.ts
    -rwxrwxrwx 1 user user 1523 Oct 8 14:22 vite.config.js
- Example: `ps aux | grep node`
  - Shows only Node.js processes from all running processes
- Example: `cat error.log | grep ERROR | head -20`
  - Chained pipeline: read file → filter for errors → show the first 20 matches
- Example: `npm run start 2>&1 | grep -E "(Application bundle|compiled successfully|error)" | head -20`
  - Explanation: Monitors development server startup and filters output for key information.
    - `npm run start` - starts the development server
    - `2>&1` - redirects stderr to stdout (captures both normal output and errors)
    - `grep -E "(pattern1|pattern2|pattern3)"` - filters for bundle info, success messages, or errors using extended regex
    - `head -20` - shows only the first 20 matching lines to avoid output overflow
  - Use case: Quick verification that the server started successfully, or catching build errors early without scrolling through verbose logs.
  - Sample output:
    ✓ Application bundle generation complete.
    ✓ Browser application bundle generation complete.
    ✓ compiled successfully.
    Local: http://localhost:4200/
    ** Angular Live Development Server is listening on localhost:4200
  - Variations:
    - React/Vite: `npm start 2>&1 | grep -E "(compiled|Local:|Network:)" | head -15`
    - Next.js: `npm run dev 2>&1 | grep -E "(ready|started|compiled)" | head -10`
    - Node.js/Express: `npm start 2>&1 | grep -E "(listening|started|error)" | head -10`
Pattern 3: Count and sort
`command | wc -l` → Count lines of output.
- Example: `ls -1 | wc -l` to count files in a directory
  - Explanation:
    - `ls -1` lists files (one per line)
    - `wc -l` counts the number of lines
  - Sample output:
    15 (meaning 15 files)
- Example: `cat app.js | grep "function" | wc -l` to count functions
  - Counts how many lines contain the word "function"
`ls -1 src/content/blog/*.md | head -10` → List Markdown files in the blog directory and show the first 10.
- Explanation:
  - `ls -1` - lists files one per line
  - `src/content/blog/*.md` - matches all `.md` files in the blog directory
  - `head -10` - displays only the first 10 results
- Use case: Quickly preview the first 10 blog post files in your content directory
- Sample output:
  src/content/blog/2024-01-15-getting-started.md
  src/content/blog/2024-02-03-react-hooks.md
  src/content/blog/2024-02-18-typescript-tips.md
  src/content/blog/2024-03-05-css-grid.md
  src/content/blog/2024-03-22-node-apis.md
  src/content/blog/2024-04-10-docker-basics.md
  src/content/blog/2024-04-28-git-workflow.md
  src/content/blog/2024-05-12-testing-jest.md
  src/content/blog/2024-06-01-web-performance.md
  src/content/blog/2024-06-20-security-best-practices.md
`command | sort | uniq -c` → Sort and count unique occurrences.
- Example: `cat access.log | cut -d' ' -f1 | sort | uniq -c | sort -rn`
  - Explanation:
    - `cat access.log` - read the log file
    - `cut -d' ' -f1` - extract the first field (IP addresses)
    - `sort` - sort IPs alphabetically
    - `uniq -c` - count unique IPs
    - `sort -rn` - sort by count (highest first)
  - Use case: Find which IP addresses access your server most frequently
Pattern 4: Search and transform
`command | sed 's/old/new/g'` → Transform output inline.
- Example: `cat urls.txt | sed 's/http:/https:/g'`
  - Converts all HTTP URLs to HTTPS in the output (doesn't modify the file)
`command | awk '{print $column}'` → Extract specific columns.
- Example: `ls -l | awk '{print $9}'` to list only filenames
  - Explanation:
    - `ls -l` - long format listing
    - `awk '{print $9}'` - print the 9th column (filename)
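Column extraction is easy to try on inline data (the names and numbers below are placeholders):

```shell
# awk splits each line on whitespace; $1 is the first field, $2 the second.
printf 'alice 30\nbob 25\n' | awk '{print $1}'
# → alice
#   bob
```

Swap `$1` for `$2` to get the numbers instead, or `{print $2, $1}` to reorder the columns.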
Pattern 5: Monitor and filter in real-time
`tail -f file | grep pattern` → Monitor a log file for specific patterns.
- Example: `tail -f server.log | grep ERROR`
  - Explanation:
    - `tail -f server.log` - continuously read new lines from the log
    - `grep ERROR` - show only lines containing "ERROR"
  - Use case: Real-time error monitoring
- Example: `tail -f access.log | grep "404" | awk '{print $1, $7}'`
  - Monitors 404 errors and shows the IP address and requested URL
Pattern 6: Process JSON data
`command | jq '.field'` → Parse and extract JSON fields.
- Example: `cat package.json | jq '.dependencies'`
  - Extracts only the dependencies object from package.json
- Example: `npm list --json | jq '.dependencies | keys'`
  - Lists all dependency names as an array
8.3 Complex Pipeline Examples
Example 1: Find top 10 largest files
find . -type f -exec ls -lh {} \; | awk '{print $5, $9}' | sort -hr | head -10
Explanation:
- `find . -type f` - find all files
- `-exec ls -lh {} \;` - get detailed info for each file
- `awk '{print $5, $9}'` - extract size and filename
- `sort -hr` - sort by size (human-readable, reverse)
- `head -10` - show the top 10
Example 2: Count lines of code by file type
find . -name "*.js" | xargs wc -l | sort -n | tail -20
Explanation:
- `find . -name "*.js"` - find all JavaScript files
- `xargs wc -l` - count lines in each file
- `sort -n` - sort numerically
- `tail -20` - show the 20 largest files
Example 3: Extract and analyze TODO comments
grep -rn "TODO" src/ --exclude-dir=node_modules | cut -d':' -f1 | sort | uniq -c | sort -rn
Explanation:
- `grep -rn "TODO" src/` - find all TODO comments with line numbers
- `cut -d':' -f1` - extract just the filename
- `sort` - sort filenames
- `uniq -c` - count TODOs per file
- `sort -rn` - sort by count (most TODOs first)
Example 4: Batch file processing with pattern checking
for file in /home/user/project/backend/models/model_builder_*.py; do
basename "$file" | sed 's/model_builder_//;s/.py//';
done | while read model_name; do
if grep -q "created_date" "/home/user/project/backend/models/model_builder_${model_name}.py" 2>/dev/null; then
echo "$model_name: HAS created_date";
else
echo "$model_name: NO created_date";
fi;
done | sort
Explanation: This complex pipeline processes multiple model files, extracts their base names, checks for a specific field, and reports the results sorted alphabetically.
Step-by-step breakdown:
- `for file in /home/user/project/backend/models/model_builder_*.py; do`
  - Loops through all Python files matching the pattern `model_builder_*.py`
  - Example matches: `model_builder_user.py`, `model_builder_product.py`, `model_builder_order.py`
- `basename "$file" | sed 's/model_builder_//;s/.py//'`
  - `basename "$file"` - extracts just the filename (removes the directory path)
    - Example: `/home/user/project/backend/models/model_builder_user.py` → `model_builder_user.py`
  - `sed 's/model_builder_//;s/.py//'` - removes the prefix and extension
    - First `s/model_builder_//` removes the "model_builder_" prefix
    - Second `s/.py//` removes the ".py" extension
    - Result: `model_builder_user.py` → `user`
- `done | while read model_name; do`
  - Pipes the cleaned names to a while loop
  - Each iteration, `model_name` contains the extracted name (e.g., "user", "product", "order")
- `if grep -q "created_date" "/home/user/project/backend/models/model_builder_${model_name}.py" 2>/dev/null; then`
  - `grep -q` - quiet mode (no output, just an exit code: 0 if found, 1 if not)
  - `"created_date"` - the pattern to search for
  - `"${model_name}.py"` - reconstructs the full filename using the variable
  - `2>/dev/null` - redirects errors to null (suppresses "file not found" errors)
- `echo "$model_name: HAS created_date"` or `echo "$model_name: NO created_date"`
  - Prints whether the field was found in that model file
- `done | sort`
  - Sorts all results alphabetically by model name
Sample output:
order: HAS created_date
product: HAS created_date
user: HAS created_date
vendor: NO created_date
warehouse: NO created_date
Use cases:
- Auditing which data models include specific fields (e.g., timestamps, audit fields)
- Checking API endpoint handlers for authentication middleware
- Verifying which components implement required interfaces
- Finding which configuration files include specific settings
Variations:
- Check multiple patterns: Replace the single `grep -q` with multiple conditions:
  if grep -q "created_date" "$file" && grep -q "updated_date" "$file"; then
    echo "$model_name: HAS both timestamps"
  fi
- Different file types: Adapt for other languages:
for file in src/components/component_*.tsx; do
basename "$file" | sed 's/component_//;s/.tsx//'
done | while read comp_name; do
if grep -q "useState" "src/components/component_${comp_name}.tsx" 2>/dev/null; then
echo "$comp_name: Uses React hooks"
else
echo "$comp_name: Class component"
fi
done | sort
- Count occurrences: Add counting logic:
grep -o "created_date" "$file" | wc -l
# Shows how many times the pattern appears in each file
Example 5: Clean and analyze CSV data
cat data.csv | tail -n +2 | cut -d',' -f2,3 | sort | uniq | wc -l
Explanation:
- `cat data.csv` - read the CSV file
- `tail -n +2` - skip the header row (start from line 2)
- `cut -d',' -f2,3` - extract columns 2 and 3
- `sort | uniq` - get unique combinations
- `wc -l` - count the unique entries
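The same pipeline can be tried end to end on inline sample data (the file path and column names are placeholders):

```shell
# Three data rows, but only two distinct (city, dept) combinations.
printf 'name,city,dept\nana,NY,eng\nbob,NY,eng\ncarla,LA,ops\n' > /tmp/demo.csv
tail -n +2 /tmp/demo.csv | cut -d',' -f2,3 | sort | uniq | wc -l   # → 2
```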
Example 6: Inspect file content with character-level detail
head -30 docs/public/instructions.md | od -c | head -40
Explanation: This pipeline extracts the first portion of a file and displays its raw character representation, useful for debugging encoding issues, hidden characters, or file format problems.
Step-by-step breakdown:
- `head -30 docs/public/instructions.md`
  - Extracts the first 30 lines from the markdown file
  - Limits the amount of data to inspect (prevents overwhelming output)
- `| od -c`
  - `od` = octal dump (displays file contents in various formats)
  - `-c` = character format (shows printable characters and escape sequences)
  - Displays special characters such as:
    - `\n` = newline (line ending)
    - `\t` = tab character
    - `\r` = carriage return (Windows line endings)
    - `\0` = null byte
    - Spaces shown as regular spaces
    - Non-printable characters shown as octal codes
- `| head -40`
  - Limits the octal dump output to the first 40 lines
  - Prevents terminal overflow when analyzing large files
Sample output:
0000000 # M a r k d o w n I n s t r
0000020 u c t i o n s \n \n # # W e l c
0000040 o m e \n \n T h i s g u i d e
0000060 s h o w s y o u h o w t
0000100 o u s e m a r k d o w n . \n
0000120 \n # # # H e a d i n g s \n \n Y
0000140 o u c a n c r e a t e h e
0000160 a d i n g s u s i n g ` # `
0000200 s y m b o l s . \n \n - # H
Understanding the output:
- Left column (e.g., `0000000`, `0000020`): byte offset in octal
- Middle columns: characters in the file
  - Regular characters shown as-is: `M`, `a`, `r`, `k`
  - Spaces shown as spaces (harder to see, but present)
  - Newlines shown as `\n`
  - Tabs shown as `\t`
  - Carriage returns shown as `\r` (indicates Windows line endings)
Use cases:
-
Detect Windows vs Unix line endings:
head -10 file.txt | od -c | grep -E "\\r\\n|\\n"
# If you see \r\n → Windows line endings (CRLF)
# If you see only \n → Unix line endings (LF) -
Find hidden characters causing parsing errors:
head -50 config.json | od -c | head -60
# Look for unexpected \r, \t, or non-printable characters -
Debug CSV file encoding issues:
head -5 data.csv | od -c | head -20
# Check for BOM (Byte Order Mark), unexpected delimiters, or encoding problems -
Verify file is plain text (not binary):
head -10 suspicious-file.txt | od -c | head -30
# If you see lots of octal codes (\000, \377), it's likely binary
Alternative od flags:
- `od -x` → Display in hexadecimal (useful for binary files)
- `od -a` → Display as named characters (more readable than -c)
- `od -An` → Remove the address column (cleaner output)
- `od -c -An` → Character format without addresses
Example comparing different line endings:
# Unix file (LF only)
$ echo -e "line1\nline2" | od -c
0000000 l i n e 1 \n l i n e 2 \n
# Windows file (CRLF)
$ echo -e "line1\r\nline2\r\n" | od -c
0000000 l i n e 1 \r \n l i n e 2 \r \n
Related commands:
- `hexdump -C file` → More modern alternative to `od`, shows hex and ASCII side by side
- `cat -A file` → Shows line endings and tabs (simpler, but less detailed than `od`)
- `file file.txt` → Identifies the file type and encoding
8.4 Pipe Best Practices
Efficiency considerations:
- ✅ `head -10 file` is more efficient than `cat file | head -10`
- ✅ `grep pattern file` is more efficient than `cat file | grep pattern`
- ✅ Use pipes when chaining multiple operations or when a command doesn't accept file arguments
When to use pipes:
- Combining multiple commands
- Filtering or transforming command output (not file content directly)
- Real-time monitoring with transformations
- Processing data that comes from commands (not files)
Common pitfalls:
- ❌ `cat file | cat | cat` - unnecessary pipes
- ❌ Piping when a direct command exists
- ✅ `command1 | command2 | command3` - legitimate chaining
8.5 Other Redirection Operators
Output redirection:
- `>` - Redirect output to a file (overwrite)
  - Example: `echo "test" > file.txt`
- `>>` - Redirect output to a file (append)
  - Example: `echo "more" >> file.txt`
- `2>` - Redirect error output
  - Example: `command 2> errors.log`
- `&>` - Redirect both output and errors
  - Example: `command &> all-output.log`
Input redirection:
- `<` - Read input from a file
  - Example: `wc -l < file.txt`
Command substitution:
- `` `command` `` or `$(command)` - Use command output as an argument
  - Example: `echo "Today is $(date)"`
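A minimal round trip through these operators, using throwaway paths under /tmp (the file names are placeholders):

```shell
echo "test" > /tmp/redir-demo.txt          # overwrite: file now has 1 line
echo "more" >> /tmp/redir-demo.txt         # append: file now has 2 lines
wc -l < /tmp/redir-demo.txt                # input redirection → 2
ls missing-file 2> /tmp/errors.log || true # error text goes to the log; || true keeps the script going
echo "Today is $(date)"                    # command substitution
```

Note that `wc -l < file` prints only the count, with no filename, because `wc` never sees a file argument.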
8.6 Conditional Execution with && and ||
Bash provides operators to chain commands based on the success or failure of previous commands.
Operators:
&&- AND operator: Run next command only if previous succeeded (exit code 0)||- OR operator: Run next command only if previous failed (exit code non-zero)
Basic syntax:
command1 && command2 # command2 runs only if command1 succeeds
command1 || command2 # command2 runs only if command1 fails
File Existence Check with Conditional Execution
`[ -f file ] && echo "exists" || echo "not found"` → Check whether a file exists and print the result.
- `[ -f file ]` - Test if the file exists and is a regular file
- `&&` - If the test succeeds (file exists), run `echo "exists"`
- `||` - If the previous command fails (file doesn't exist), run `echo "not found"`
Example: Check if the .nojekyll file exists in the public directory
[ -f public/.nojekyll ] && echo "exists" || echo "not found"
Explanation:
- `[ -f public/.nojekyll ]` - Tests if `public/.nojekyll` exists and is a regular file
- If the test returns true (exit code 0): `&&` triggers `echo "exists"`
- If the test returns false (exit code 1): `||` triggers `echo "not found"`
Sample outputs:
- If the file exists: exists
- If the file doesn't exist: not found
Common file test operators:
- `[ -f file ]` - True if file exists and is a regular file
- `[ -d dir ]` - True if directory exists
- `[ -e path ]` - True if path exists (file or directory)
- `[ -r file ]` - True if file is readable
- `[ -w file ]` - True if file is writable
- `[ -x file ]` - True if file is executable
- `[ -s file ]` - True if file exists and is not empty
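These tests can be exercised against throwaway paths (the /tmp names are placeholders):

```shell
: > /tmp/empty-file                # create (or truncate) an empty file
mkdir -p /tmp/some-dir
[ -f /tmp/empty-file ] && echo "regular file"
[ -d /tmp/some-dir ]   && echo "directory"
[ -s /tmp/empty-file ] || echo "file is empty"
echo data > /tmp/empty-file
[ -s /tmp/empty-file ] && echo "file now has content"
```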
More examples:
# Check if directory exists
[ -d node_modules ] && echo "node_modules found" || echo "run npm install"
# Check if script is executable
[ -x script.sh ] && echo "executable" || echo "not executable"
# Check if config file exists, create if not (grouping keeps touch on the failure branch)
[ -f .env ] && echo "Config exists" || { echo "Creating .env"; touch .env; }
# Chain multiple conditions
[ -f package.json ] && [ -d src ] && echo "Valid project structure" || echo "Invalid structure"
Use in CI/CD and build scripts:
# Create .nojekyll for GitHub Pages if it doesn't exist
[ -f public/.nojekyll ] || touch public/.nojekyll
# Verify build output before deployment
[ -d dist ] && [ -f dist/index.html ] && echo "Build OK, deploying..." || exit 1
# Check multiple required files
[ -f package.json ] && [ -f tsconfig.json ] && npm run build || echo "Missing config files"
Why .nojekyll is important:
- GitHub Pages uses Jekyll by default, which ignores files/folders starting with `_`
- Creating a `.nojekyll` file tells GitHub Pages to skip Jekyll processing
- Essential for frameworks like Angular, React, or Next.js that use `_next`, `_app`, etc.
- Common in build scripts: `echo "" > public/.nojekyll`
Alternative test syntax:
# Modern [[ ]] syntax (preferred in bash)
[[ -f file ]] && echo "exists" || echo "not found"
# Using test command (equivalent to [ ])
test -f file && echo "exists" || echo "not found"
# Explicit if-else (more verbose but clearer for complex logic)
if [ -f file ]; then
echo "exists"
else
echo "not found"
fi
Exit codes and conditions:
- Exit code 0 = success/true → `&&` executes the next command
- Exit code non-zero = failure/false → `||` executes the next command
- Commands like `grep`, `[ test ]`, etc. return exit codes
- Check the last exit code with `echo $?` (0 = success, 1-255 = various failures)
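A short, runnable tour of these rules (failing commands are kept on the left of `||` so the snippet also works under `set -e`):

```shell
true  && echo "yes: true returned exit code 0"
false || echo "fallback: false returned non-zero"
[ -f /nonexistent ] || echo "test failed, so the || branch ran"
rc=0; grep -q "xyz" /dev/null || rc=$?
echo "grep exit code: $rc"    # → grep exit code: 1 (no match found)
```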
9. File Permissions and Ownership
9.1 Change file permissions
chmod permissions file → Change file permissions.
- Use numeric (755) or symbolic (u+x) notation.
- Example: `chmod +x script.sh` to make a script executable.
  - Explanation: Adds execute permission for all users to `script.sh`.
  - Sample output: (Permissions changed; no output)
- Example: `chmod 644 config.json` to set read/write for the owner, read-only for others.
  - Explanation: Sets permissions to `-rw-r--r--` (owner: rw, group: r, others: r).
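Numeric and symbolic modes side by side on a throwaway file; the permission string is read back with `ls -l` (this format is stable on typical Linux systems):

```shell
touch /tmp/perm-demo.sh
chmod 644 /tmp/perm-demo.sh
ls -l /tmp/perm-demo.sh | cut -c1-10   # → -rw-r--r--
chmod u+x /tmp/perm-demo.sh            # add execute for the owner only
ls -l /tmp/perm-demo.sh | cut -c1-10   # → -rwxr--r--
```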
9.2 Change file ownership
chown user:group file → Change file owner and group.
- Requires sudo/root for files you don't own.
- Example: `sudo chown www-data:www-data /var/www/html/index.html`
  - Explanation: Changes ownership of the file to the `www-data` user and group.
  - Sample output: (Ownership changed; no output)
10. File Comparison and Differences
10.1 Compare two files
diff file1 file2 → Compare two files line by line.
- Use `-u` for unified format (more readable), `-y` for side-by-side comparison.
- Example: `diff config.old.json config.new.json` to see what changed.
  - Sample output:
    3c3
    < "port": 3000
    ---
    > "port": 8080
- Example: `diff -u old.js new.js > changes.patch` to create a patch file.
  - Explanation: Creates a patch file that can be applied with the `patch` command.
10.2 Compare sorted files
comm file1 file2 → Compare two sorted files column by column.
- Shows lines unique to file1, unique to file2, and common to both.
- Example: `comm list1.txt list2.txt` to compare sorted lists.
  - Explanation: Displays three columns: unique to file1, unique to file2, common to both.
- Sample output:
apple
banana
orange
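A concrete run on two tiny sorted lists (file names are placeholders); `-12` suppresses columns 1 and 2, leaving only the lines common to both files:

```shell
printf 'apple\nbanana\n'  > /tmp/list1.txt
printf 'banana\ncherry\n' > /tmp/list2.txt
comm /tmp/list1.txt /tmp/list2.txt      # three tab-indented columns
comm -12 /tmp/list1.txt /tmp/list2.txt  # → banana (common lines only)
```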
11. File Compression and Archives
11.1 Compress with gzip
gzip file → Compress a file (creates file.gz).
-
Use
-dto decompress,-kto keep original file. -
Example:
gzip large-log.txtto compress a log file.- Explanation: Compresses
large-log.txttolarge-log.txt.gzand removes original.
- Explanation: Compresses
-
Example:
gzip -k backup.sqlto compress while keeping original.- Explanation: Creates
backup.sql.gzwhile preservingbackup.sql.
- Explanation: Creates
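A full round trip with `-k` and `-c` on a throwaway file (`-f` is added so a stale `.gz` from an earlier run is overwritten; `-k` requires gzip 1.6 or newer):

```shell
printf 'log line\n' > /tmp/demo.log
gzip -kf /tmp/demo.log             # creates /tmp/demo.log.gz, keeps the original
ls /tmp/demo.log /tmp/demo.log.gz  # both files exist
gunzip -c /tmp/demo.log.gz         # decompress to stdout → log line
```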
11.2 Decompress gzip files
gunzip file.gz → Decompress a gzip file.
- Example: `gunzip large-log.txt.gz` to decompress.
  - Explanation: Decompresses `large-log.txt.gz` back to `large-log.txt`.
11.3 Create tar archive
tar -czf archive.tar.gz directory/ → Create a compressed tar archive.
- `-c` for create, `-z` for gzip, `-f` for file, `-v` for verbose (optional).
- Example: `tar -czf project-backup.tar.gz src/` to archive the source directory.
  - Explanation: Compresses `src/` into a gzipped tar archive.
11.4 Extract tar archive
tar -xzf archive.tar.gz → Extract a compressed tar archive.
- `-x` for extract, `-z` for gzip, `-f` for file.
- Example: `tar -xzf project-backup.tar.gz` to extract the backup.
  - Explanation: Extracts all files from the gzipped tar archive.
- Example: `tar -xzf archive.tar.gz -C /destination/path/` to extract to a specific location.
  - Explanation: Extracts the archive contents into the specified directory.
11.5 Create zip archive
zip -r archive.zip directory/ → Create a zip archive.
- Example: `zip -r project.zip src/` to archive the source directory.
  - Sample output:
    adding: src/app.js
    adding: src/utils.js
11.6 Extract zip archive
unzip archive.zip → Extract a zip archive.
- Use `-l` to list contents without extracting, `-d` to specify a destination.
- Example: `unzip project.zip` to extract all files.
  - Sample output:
    extracting: src/app.js
    extracting: src/utils.js
11.7 Sync files with rsync
rsync -av source/ destination/ → Sync files between directories or machines.
- `-a` preserves permissions, `-v` for verbose, `-z` for compression over the network.
- Example: `rsync -avz /local/path/ user@remote:/backup/` to back up to a remote server.
  - Explanation: Efficiently copies only changed files to the destination.
  - Sample output:
    sending incremental file list
    app.js
    sent 1.23K bytes received 45 bytes
- Example: `rsync -av --delete build/ /var/www/html/` to deploy a website (mirror).
  - Explanation: Syncs files and removes any in the destination that are not in the source.
12. File Linking
12.1 Create symbolic link
ln -s target linkname → Create a symbolic link (symlink).
- Example: `ln -s /usr/local/node-v20 /usr/local/node` to create a version symlink.
  - Explanation: Creates a symbolic link `node` that points to `node-v20`.
- Example: `ln -s ../../shared/config.js ./config.js` to link a shared config.
  - Explanation: Creates a relative symlink to a shared configuration file.
12.2 Create hard link
`ln target linkname` → Create a hard link.
- Hard links point to the same inode (data) as the original file.
- Example: `ln original.txt backup.txt` to create a hard link.
  - Explanation: Both names reference the same data; changes through one are visible through the other.
13. NPM Package Management
13.1 View package versions
npm view package versions --json → Retrieve all available versions of a package in JSON format.
- Example: `npm view astro-expressive-code versions --json` to get the version list.
  - Explanation: Fetches all published versions of the package from NPM.
  - Sample output:
    ["0.1.0","0.2.0","1.0.0"]
13.2 View package details
npm view package --json → Get detailed information about a package.
- Example: `npm view astro-expressive-code --json` to get package details.
  - Sample output:
{
"name": "astro-expressive-code",
"version": "1.0.0",
"description": "Expressive code for Astro",
"author": "user@example.com"
}
- Sample output:
13.3 Check for outdated packages
npm outdated → Check for outdated packages in your project.
- Sample output:
Package Current Wanted Latest Location
lodash 4.17.20 4.17.21 4.17.21 node_modules/lodash
13.4 List installed packages
npm list --depth=0 → Show installed packages (top-level only).
- Example: `npm list --depth=0` to see direct dependencies.
  - Sample output:
project@1.0.0
├── express@4.18.2
└── lodash@4.17.21
- Sample output:
13.5 Security audit
npm audit → Check for security vulnerabilities.
- Sample output:
found 3 vulnerabilities (1 low, 2 moderate)
run `npm audit fix` to fix them
13.6 Clean install with npm ci
npm ci → Clean install dependencies (faster, stricter, for CI/CD pipelines). npm ci is the shorthand of npm clean-install.
What npm ci does:
- Deletes `node_modules/` completely (if it exists)
- Reads `package-lock.json` (or `npm-shrinkwrap.json`)
- Installs the exact versions specified in the lock file
- Never modifies `package.json` or `package-lock.json`
- Fails if dependencies don't match the lock file
Key differences: npm ci vs npm install:
| Feature | npm ci | npm install |
|---|---|---|
| Speed | Faster (up to 2x) | Slower |
| Lock file | Must exist, strictly followed | Optional, can be updated |
| node_modules | Always deleted first | Preserved, updated incrementally |
| package.json changes | Fails if mismatch with lock | Updates lock file to match |
| Version ranges | Installs exact versions only | Resolves ranges (^, ~, etc.) |
| Use case | CI/CD, production builds | Local development |
| Reproducibility | 100% reproducible | May vary based on timing |
When to use npm ci:
- ✅ Continuous Integration (CI/CD) pipelines
- ✅ Production deployments
- ✅ Docker builds
- ✅ Testing environments
- ✅ When you need guaranteed reproducible builds
- ✅ Fresh clone of repository
When to use npm install:
- ✅ Local development
- ✅ Adding new dependencies
- ✅ Updating existing dependencies
- ✅ Resolving version conflicts
How npm ci validates:
- Compares `package.json` with `package-lock.json`:
  - Checks that all dependencies in `package.json` are present in the lock file
  - Verifies that the versions in the lock file satisfy the ranges in `package.json`
- Fails if a mismatch is detected:
  npm ERR! `npm ci` can only install packages when your package.json and package-lock.json
  npm ERR! or npm-shrinkwrap.json are in sync. Please update your lock file with `npm install`
  npm ERR! before continuing.
- Example scenario causing failure:
  - `package.json` has: `"lodash": "^4.17.21"`
  - `package-lock.json` has: `"lodash": "4.17.20"`
  - Result: `npm ci` FAILS because 4.17.20 doesn't satisfy the ^4.17.21 range
  - Solution: Run `npm install` to update the lock file, then commit it
Performance comparison:
# Fresh install with npm install
$ rm -rf node_modules
$ time npm install
real 0m45.234s
# Fresh install with npm ci
$ rm -rf node_modules
$ time npm ci
real 0m22.891s
Example GitHub Actions workflow:
steps:
- uses: actions/checkout@v3
- uses: actions/setup-node@v3
with:
node-version: '18'
- name: Install dependencies
run: npm ci # Always use npm ci in CI/CD
- name: Run tests
run: npm test
- name: Build
run: npm run build
Example Dockerfile:
FROM node:18-alpine
WORKDIR /app
# Copy package files
COPY package*.json ./
# Use npm ci for reproducible builds
RUN npm ci --only=production
# Copy application code
COPY . .
# Build and start
RUN npm run build
CMD ["npm", "start"]
Troubleshooting common npm ci errors:
- Error: "package-lock.json not found"
  - Solution: Run `npm install` locally to generate the lock file, then commit it
- Error: "package.json and package-lock.json out of sync"
  - Solution: Run `npm install` to update the lock file, review the changes, commit
- Error: "Invalid or corrupted package-lock.json"
  - Solution: Delete the lock file, run `npm install`, commit the new lock file
Best practices:
- ✅ Always commit `package-lock.json` to version control
- ✅ Use `npm ci` in all automated environments (CI/CD, Docker)
- ✅ Use `npm install` only for local development
- ✅ Run `npm ci` before deploying to production
- ✅ Never manually edit `package-lock.json`
13.7 Execute package without installing
npx package-name → Execute package without installing globally.
- Example: `npx create-react-app my-app` to use without a global install.
  - Explanation: Downloads and runs the package temporarily.
14. Git Version Control
14.1 Restore file to last commit
git checkout HEAD -- file → Restore a file to its last committed state.
- Example: `git checkout HEAD -- public/toggle-language.js` to reset a JavaScript file.
  - Explanation: Discards any uncommitted changes and restores the file to the version from the last commit (HEAD).
14.2 View commit history
git log --oneline -10 → View the last 10 commits in a compact format.
- Sample output:
a1b2c3d Fix typo in README
e4f5g6h Add new feature
14.3 View short status
git status -s → Show short status of working directory.
- Sample output:
M src/app.js
?? newfile.txt
14.4 View changes
git diff → Show unstaged changes.
- Use `git diff --staged` for staged changes.
- Example: `git diff src/app.js` to see what changed.
  - Sample output:
-const port = 3000;
+const port = 8080;
- Sample output:
14.5 Visualize branch history
git log --graph --oneline --all → Visualize branch history.
- Sample output:
* a1b2c3d (HEAD -> main) Merge branch 'feature'
|\
| * e4f5g6h Add feature
|/
* h8i9j0k Initial commit
14.6 Temporarily save changes
git stash → Temporarily save uncommitted changes.
- Use `git stash pop` to restore changes, `git stash list` to view stashes.
- Example: `git stash` before switching branches.
  - Sample output:
Saved working directory and index state WIP on main
- Sample output:
14.7 List all branches
git branch -a → List all branches (local and remote).
- Sample output:
* main
feature-x
remotes/origin/main
remotes/origin/develop
14.8 Remove untracked files
git clean -fd → Remove untracked files and directories.
- Use the `-n` flag first to preview what will be deleted.
- Example: `git clean -fdn` to preview, then `git clean -fd` to delete.
  - Sample output:
Removing temp/
Removing debug.log
- Sample output:
15. Process Management
15.1 Keyboard shortcuts for process control
Ctrl+C → Terminate the currently running process (sends SIGINT).
- Immediately stops the foreground process.
- Example: Press `Ctrl+C` while a server is running to stop it.
  - Explanation: Sends an interrupt signal (SIGINT) to the process, requesting graceful termination.
- Use case: Stop a running web server, script, or command that's taking too long.
- Sample output:
$ npm start
Server running on port 3000...
^C
$
Ctrl+Z → Suspend (pause) the currently running process.
- Pauses the process and puts it in the background (stopped state).
- The process is NOT terminated, just suspended.
- Example: Press `Ctrl+Z` while editing a file to temporarily return to the shell.
  - Explanation: Sends a SIGTSTP signal, suspending the process and returning you to the command prompt.
- Use case: Temporarily pause a process to run other commands, then resume it later.
- Sample output:
$ vim myfile.txt
[Editing file...]
[Press Ctrl+Z]
[1]+ Stopped vim myfile.txt
$
- Note: Use `fg` to resume in the foreground or `bg` to resume in the background.
15.2 List and filter processes
ps aux | grep node → List processes and filter for Node.js processes.
- Sample output:
user 1234 0.0 1.2 123456 7890 ? S 10:00 0:00 node server.js
15.3 Terminate process by PID
kill PID → Terminate a process by its process ID.
- Use `kill -9 PID` to force-kill if the process doesn't respond.
- Example: `kill 1234` to stop the process with ID 1234.
  - Explanation: Sends a SIGTERM signal to gracefully terminate the process.
15.4 Kill all processes by name
killall process_name → Kill all processes matching the name.
- Example: `killall node` to stop all Node.js processes.
  - Explanation: Terminates all processes whose name is exactly "node" (for substring or pattern matching, use `pkill -f` instead).
15.5 Kill processes by pattern
pkill -f pattern → Kill processes matching a pattern.
- Example: `pkill -f "python.*server"` to kill Python servers.
  - Explanation: Terminates processes whose full command line matches the regex pattern.
15.6 List background jobs
jobs → List background jobs in current shell.
- Shows all jobs started in the current terminal session.
- Each job has a job number (shown in brackets) and a status.
- Sample output:
  [1]+ Running npm start &
  [2]- Stopped vim file.txt
- Explanation:
  - `[1]`, `[2]` are job numbers
  - `+` indicates the current job (most recently started or stopped)
  - `-` indicates the previous job
  - Running: the job is executing in the background
  - Stopped: the job is suspended (paused with Ctrl+Z)
15.7 Resume job in background
bg → Resume the most recent stopped job in background.
bg %N → Resume job N in background.
- Resumes a suspended job and runs it in the background.
- Example: `bg %1` to continue job 1 in the background.
  - Explanation: Takes job 1 (which was stopped with Ctrl+Z) and resumes it in the background, allowing you to continue using the terminal.
- Workflow example:
$ npm start
[Press Ctrl+Z to suspend]
[1]+ Stopped npm start
$ bg %1
[1]+ npm start &
$ # Now the job runs in background, terminal is free
- `%1` notation: Refers to job number 1 (from the `jobs` command output)
- Alternative: `bg` without arguments resumes the most recently stopped job
15.8 Bring job to foreground
fg → Bring the most recent background job to foreground.
fg %N → Bring job N to foreground.
- Example: `fg %1` to bring job 1 to the foreground.
- Explanation: Brings a background or stopped job back to the foreground, where you can interact with it.
- Workflow example:
$ jobs
[1]+ Running npm start &
[2]- Stopped vim file.txt
$ fg %2
# vim returns to foreground, you can continue editing
- `%` notation explanation: `%1`, `%2`, etc. are job specifiers
  - `%1` = job number 1
  - `%%` or `%+` = current job (marked with +)
  - `%-` = previous job (marked with -)
  - `%?string` = job whose command contains "string"
15.9 Run command immune to hangups
nohup command & → Run command immune to hangups, with output to nohup.out.
- Keeps process running even after you log out.
- Example: `nohup npm start &` to run a server that persists after logout.
- Sample output:
nohup: ignoring input and appending output to 'nohup.out'
16. File Cleanup and Disk Usage
16.1 Delete files by pattern
find . -type f -name "*.log" -delete → Find and delete all log files recursively.
- Alternative: `find . -type f -name "*.log" -exec rm {} \;`
- Note: The `-delete` option is simpler and more efficient than `-exec rm`
16.2 Delete backup files
find . \( -name "*.bak" -o -name "*~" \) -delete → Delete backup and temporary files.
- Explanation: Finds and deletes common backup file patterns.
- Note: The parentheses are required; without them, `-delete` applies only to the `*~` pattern because `-o` binds more loosely than the implicit AND.
- Explanation: Finds and deletes common backup file patterns.
16.3 Find large files
find . -type f -size +100M → Find files larger than 100MB.
- Use `-size -100M` for files smaller than 100MB.
- Example: `find /var/log -type f -size +100M` to find large log files.
- Sample output:
/var/log/syslog.1
/var/log/apache2/access.log
16.4 Delete old files
find . -type f -mtime +30 -delete → Delete files older than 30 days.
- Use `-mtime -7` for files modified in the last 7 days.
- Example: `find /tmp -type f -mtime +7 -delete` to clean old temp files.
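Because `-delete` is irreversible, it is worth previewing the matches first. A minimal sketch on a throwaway directory (the file names are made up; backdating the sample file with `touch -d` assumes GNU coreutils):

```shell
# Make a scratch directory with one backdated file and one fresh file.
mkdir -p /tmp/cleanup-demo
touch -d "40 days ago" /tmp/cleanup-demo/old.log   # GNU touch: set mtime 40 days back
touch /tmp/cleanup-demo/new.log

# Dry run: -print lists what WOULD be deleted, without deleting anything.
find /tmp/cleanup-demo -type f -mtime +30 -print

# Once the list looks right, swap -print for -delete.
find /tmp/cleanup-demo -type f -mtime +30 -delete
ls /tmp/cleanup-demo   # only new.log remains
```

The same `-print`-before-`-delete` habit applies to every destructive `find` invocation in this section.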
16.5 Check disk usage
du -sh * → Show disk usage of files and directories.
- `-s` for summary, `-h` for human-readable sizes.
- Example: `du -sh * | sort -h` to see what's taking up space.
- Sample output:
4.0K README.md
12M node_modules
250M dist
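To try this safely, you can build a throwaway directory tree (the names below are made up); `sort -h` orders the human-readable sizes so the largest entry comes last:

```shell
# Build a scratch tree with one large and one small directory.
mkdir -p du-demo/big du-demo/small
head -c 1048576 /dev/zero > du-demo/big/file.bin   # ~1 MB of zeros
head -c 1024 /dev/zero > du-demo/small/file.bin    # ~1 KB of zeros

# Human-readable sizes, sorted smallest to largest.
(cd du-demo && du -sh * | sort -h)

# Pipe through tail to keep only the biggest entries.
(cd du-demo && du -sh * | sort -h | tail -n 1)
```

The subshell `(cd … && …)` keeps your working directory unchanged after the command finishes.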
16.6 Check free disk space
df -h → Display free disk space on all mounted filesystems.
- Sample output:
Filesystem Size Used Avail Use% Mounted on
/dev/sda1 100G 45G 50G 48% /
17. Code Analysis and Documentation
17.1 Find TODO comments
grep -rn "TODO" . --exclude-dir=node_modules → Search for TODO comments in code, excluding dependencies.
- Sample output:
src/utils/helpers.js:15: // TODO: Optimize this function
17.2 Find all code annotations
grep -rn "FIXME\|TODO\|HACK\|XXX" src/ → Find all code annotations.
- Sample output:
src/app.js:23: // FIXME: Memory leak here
src/utils.js:45: // TODO: Add validation
17.3 Count lines of code
find . -name "*.js" -exec wc -l {} + | sort -n → Count lines in JS files, sorted.
- Sample output:
50 ./utils.js
120 ./app.js
340 ./main.js
510 total
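Dependency folders can dominate this count, so it is often combined with `-not -path` to exclude them. The sketch below builds a tiny sample tree so the numbers are predictable (all names are made up for the demo):

```shell
# Build a sample project: two source files plus a node_modules impostor.
mkdir -p loc-demo/src loc-demo/node_modules
printf 'a\nb\nc\n' > loc-demo/src/app.js               # 3 lines
printf 'x\n' > loc-demo/src/utils.js                   # 1 line
printf 'bloat\nbloat\n' > loc-demo/node_modules/dep.js # should not be counted

# Count only project code: -not -path excludes anything under node_modules.
find loc-demo -name "*.js" -not -path "*/node_modules/*" -exec wc -l {} + | sort -n
```

With more than one matching file, `wc` appends a `total` line, which `sort -n` conveniently keeps at the bottom.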
18. Monitoring and Debugging
18.1 Run command repeatedly
watch -n 5 'npm run build' → Run a command repeatedly at intervals.
- Note: Requires the `watch` command (pre-installed on most Linux distributions; `brew install watch` on macOS)
- Explanation: Executes `npm run build` every 5 seconds.
18.2 Monitor logs for errors
tail -f file | grep "ERROR" → Monitor log file for errors in real-time.
- Example: `tail -f server.log | grep "ERROR"` to watch for errors.
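The filtering half of that pipeline can be tested offline on a static file, and `grep -E` lets you match several severities at once. A minimal sketch with a made-up log file:

```shell
# Create a small sample log (contents are invented for the demo).
printf '%s\n' \
  '2023-10-16 10:00:01 INFO  server started' \
  '2023-10-16 10:00:05 ERROR connection refused' \
  '2023-10-16 10:00:09 WARN  slow query' \
  '2023-10-16 10:00:12 ERROR timeout' > server.log

grep "ERROR" server.log           # only ERROR lines
grep -E "ERROR|WARN" server.log   # ERROR and WARN lines
grep -c "ERROR" server.log        # count of matching lines
```

Swap the static file for `tail -f server.log | grep -E "ERROR|WARN"` to get the same filtering in real time.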
18.3 Search command history
history | grep "command" → Search command history.
- Example: `history | grep "docker"` to find Docker commands you've used.
- Sample output:
342 docker ps
389 docker build -t myapp .
18.4 Measure command execution time
time command → Measure how long a command takes to execute.
- Example: `time npm run build` to measure build time.
- Sample output:
real 0m12.345s
user 0m10.123s
sys 0m1.234s
18.5 Monitor system resources
top or htop → Monitor system resources in real-time.
- Press `q` to quit.
- Example: `htop` for better interactive monitoring.
18.6 Find process using a port
lsof -i tcp:PORT or lsof -i :PORT → List processes using a specific port.
- Essential for troubleshooting `EADDRINUSE: address already in use` errors.
- Example: `lsof -i tcp:1668` to find what's using port 1668.
- Explanation: Shows which process is listening on the specified port, displaying PID, user, and command.
- Sample output:
COMMAND PID USER FD TYPE DEVICE SIZE/OFF NODE NAME
node 44475 chen5 31u IPv4 0x8b1721168764e4bf 0t0 TCP *:strexec-s (LISTEN)
- Note the PID: In this example, the PID is `44475`.
- Example: `lsof -i :8080` to find what's using port 8080.
- Sample output:
COMMAND PID USER FD TYPE DEVICE SIZE/OFF NODE NAME
node 1234 user 23u IPv4 0t0 TCP *:8080 (LISTEN)
18.6a Kill process using a port
Kill a process by PID after finding it with lsof
- Step 1: Find the PID using `lsof -i tcp:PORT`
- Step 2: Kill the process using `kill -9 PID`
Example workflow for EADDRINUSE error:
# Step 1: Find the process
$ lsof -i tcp:1668
COMMAND PID USER FD TYPE DEVICE SIZE/OFF NODE NAME
node 44475 chen5 31u IPv4 0x8b1721168764e4bf 0t0 TCP *:strexec-s (LISTEN)
# Step 2: Kill the process
$ kill -9 44475
Explanation:
- `lsof -i tcp:1668` identifies the process (PID 44475) using port 1668
- `kill -9 44475` forcefully terminates that process, freeing the port
- Now you can restart your server without the address-in-use error
18.7 Show network connections
netstat -tuln → Show network connections and listening ports.
- Example: `netstat -tuln | grep LISTEN` to see listening ports.
- Sample output:
tcp 0 0 0.0.0.0:80 0.0.0.0:* LISTEN
tcp 0 0 0.0.0.0:443 0.0.0.0:* LISTEN
18.8 Test CORS configuration
curl -H "Origin: http://origin-url" --head http://target-url → Test CORS headers.
- Useful for verifying Cross-Origin Resource Sharing (CORS) is properly configured.
- Example: `curl -H "Origin: http://localhost:3000" --head http://localhost:5201/api/v1/your-name`
- Explanation: Sends a HEAD request with an Origin header to test whether the server allows cross-origin requests from `http://localhost:3000`.
- Sample output:
HTTP/1.1 200 OK
X-Powered-By: Express
Access-Control-Allow-Origin: http://localhost:3000
Vary: Origin
server: Ts-Server
Author: Anand Raja
Content-Type: application/json; charset=utf-8
Content-Length: 48
ETag: W/"30-wCFITczWLjOV7yt7leOshObdFG4"
Date: Wed, 09 Jun 2021 15:05:10 GMT
Connection: keep-alive
- What to look for: The `Access-Control-Allow-Origin: http://localhost:3000` header confirms that your server allows requests from the specified origin.
Understanding Access-Control-Allow-Origin values:
- `Access-Control-Allow-Origin: http://localhost:3000` → Allows only requests from `http://localhost:3000`
- `Access-Control-Allow-Origin: *` → Allows requests from ANY origin (all domains)
  - ⚠️ Security Warning: Using `*` is convenient for development but should be avoided in production for sensitive APIs
  - Use specific origins in production for better security
  - Note: When using credentials (cookies, auth headers), you CANNOT use `*`
Example with wildcard (allow all origins):
$ curl -H "Origin: http://example.com" --head http://localhost:5201/api
HTTP/1.1 200 OK
Access-Control-Allow-Origin: *
- Note: You can also use online tools like https://cors-test.codehappy.dev/ to test CORS configuration.
19. JSON and Data Processing
19.1 Extract JSON field
jq '.field' file.json → Parse and display JSON data (requires jq installed).
- Example: `jq '.dependencies' package.json`
- Sample output:
{
  "lodash": "^4.17.20",
  "express": "^4.17.1"
}
19.2 Extract raw JSON value
jq -r '.version' package.json → Extract raw value without quotes.
- Sample output:
1.0.0
19.3 Filter JSON arrays
jq '.[] | select(.age > 25)' users.json → Filter JSON arrays.
- Example: Filter users over 25 years old.
- Sample output:
{"name": "John", "age": 30}
{"name": "Jane", "age": 28}
20. Directory Visualization
20.1 Display directory tree
tree -I node_modules → Display directory tree structure, excluding node_modules.
- Note: Requires the `tree` command (`apt install tree` or `brew install tree`)
- Sample output:
.
├── src
│ ├── app.js
│ └── utils
│ └── helpers.js
├── package.json
└── README.md
20.2 Limit tree depth
tree -L 2 → Show directory tree up to 2 levels deep.
- Example: `tree -L 2 src/` to avoid deep nesting.
20.3 Recursive listing
ls -R → Recursively list all files and subdirectories.
- Example: `ls -R src/` to see all files in the directory tree.
21. Build Commands
21.1 Run build with reduced output
npm run build --silent → Run the build script with minimal output.
- Note: Use `--silent` (or `-s`) to suppress npm logs, or `--quiet` for less verbose output
- Alternative: `npm run build 2>&1 | grep -v "^>"` to filter npm's own messages
- Ideal for CI/CD pipelines or when you want to suppress verbose logs.
- Explanation: Executes the `build` script from `package.json` with reduced console output.
21.2 Run multiple commands concurrently
Using concurrently package to run frontend and backend simultaneously
- Installation: `npm install --save-dev concurrently`
- Setup: Add to `package.json`:
"scripts": {
"start": "ng serve",
"server": "ts-node-dev server/server.ts",
"dev": "concurrently \"npm start\" \"npm run server\""
}
Example for Angular + Node.js backend:
"scripts": {
"ng": "ng",
"start": "ng serve",
"build": "ng build",
"server": "concurrently \"ng serve\" \"ts-server/node_modules/.bin/ts-node-dev server/server.ts\""
}
Explanation:
- `concurrently` runs multiple npm scripts simultaneously
- Whenever you run `npm run server`, both the backend and frontend spin up
- Both services support live reloading when you make changes
- Output from both processes appears in the same terminal window
Benefits:
- No need for multiple terminal windows
- Automatic restart on file changes
- Simplified development workflow
- Single command to start entire stack
22. Troubleshooting Common Issues
22.1 Fixing "Address Already in Use" Error
Problem: EADDRINUSE: address already in use :::PORT
Solution:
Step 1: Find the process using the port
lsof -i tcp:1668
Output:
COMMAND PID USER FD TYPE DEVICE SIZE/OFF NODE NAME
node 44475 chen5 31u IPv4 0x8b1721168764e4bf 0t0 TCP *:strexec-s (LISTEN)
Step 2: Kill the process
kill -9 44475
Alternative one-liner (use with caution):
lsof -ti tcp:1668 | xargs kill -9
- Explanation: `lsof -ti` returns only the PID(s), which are piped to `kill -9`
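To avoid retyping the pipeline, you can wrap it in a small shell function; `killport` is a made-up name, and the sketch assumes `lsof` is installed. Add it to your `.bashrc` or `.zshrc` if you find it useful:

```shell
# killport: free a TCP port by force-killing whatever is listening on it.
# (Hypothetical helper; usage: killport 1668)
killport() {
  if [ -z "$1" ]; then
    echo "usage: killport PORT" >&2
    return 1
  fi
  pids=$(lsof -ti tcp:"$1")
  if [ -n "$pids" ]; then
    kill -9 $pids   # unquoted on purpose: one argument per PID
  else
    echo "nothing listening on port $1"
  fi
}
```

The explicit emptiness check avoids calling `kill` with no arguments when nothing is listening on the port.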
22.2 Testing CORS Configuration
Problem: Need to verify CORS is working between frontend and backend
Solution:
Test with cURL:
curl -H "Origin: http://localhost:3000" --head http://localhost:5201/api/v1/your-name
Expected Response:
HTTP/1.1 200 OK
X-Powered-By: Express
Access-Control-Allow-Origin: http://localhost:3000
Vary: Origin
Content-Type: application/json; charset=utf-8
Date: Wed, 09 Jun 2021 15:05:10 GMT
Connection: keep-alive
What to check:
- ✅ `Access-Control-Allow-Origin` header should match your origin
- ✅ Response status should be 200 OK
- ✅ No CORS errors in browser console
Understanding Access-Control-Allow-Origin values:
- Specific origin: `Access-Control-Allow-Origin: http://localhost:3000`
  - Only allows requests from `http://localhost:3000`
  - Most secure option for production
- Wildcard: `Access-Control-Allow-Origin: *`
  - Allows requests from ANY origin (all domains)
  - ⚠️ Security Warning: Convenient for development/public APIs but risky for sensitive data
  - Important: Cannot be used with credentials (cookies, Authorization headers)
  - Use specific origins in production when authentication is required
Online testing tool: https://cors-test.codehappy.dev/
Common CORS headers to verify:
- `Access-Control-Allow-Origin`: Specifies allowed origins (`*` or a specific domain)
- `Access-Control-Allow-Methods`: Allowed HTTP methods (GET, POST, PUT, DELETE, etc.)
- `Access-Control-Allow-Headers`: Allowed request headers (Content-Type, Authorization, etc.)
- `Access-Control-Allow-Credentials`: Whether credentials are allowed (true/false)
23. Network and DNS Troubleshooting (Windows)
23.1 Release IP address
ipconfig /release → Release the current IP address (Windows).
- Releases the current DHCP-assigned IP address for all network adapters.
- Use case: When experiencing network connectivity issues or before renewing IP.
- Example: `ipconfig /release`
- Explanation: Drops the current IP address, essentially disconnecting from the network temporarily.
- Sample output:
Windows IP Configuration
Ethernet adapter Ethernet:
Connection-specific DNS Suffix . :
- Note: Run as Administrator for full functionality.
- Linux equivalent: `sudo dhclient -r`
23.2 Flush DNS cache
ipconfig /flushdns → Clear the DNS resolver cache (Windows).
- Removes all cached DNS records from memory.
- Use case: When websites aren't loading properly or DNS changes aren't taking effect.
- Example: `ipconfig /flushdns`
- Explanation: Clears the local DNS cache, forcing Windows to request fresh DNS records from the DNS server.
- Sample output:
Windows IP Configuration
Successfully flushed the DNS Resolver Cache.
- When to use:
- Website recently changed servers (new IP address)
- Can't access a website that others can access
- Experiencing DNS-related errors
- After removing malware that may have poisoned DNS cache
- macOS equivalent: `sudo dscacheutil -flushcache; sudo killall -HUP mDNSResponder`
- Linux equivalent: `resolvectl flush-caches` (systemd; older releases use `sudo systemd-resolve --flush-caches`) or `sudo /etc/init.d/nscd restart` (older systems)
23.3 Reset TCP/IP stack
netsh int ip reset all → Reset TCP/IP stack to default settings (Windows).
- Resets the entire TCP/IP stack to its original installation state.
- Use case: Fixing severe network issues, corrupted network settings, or persistent connectivity problems.
- Example: `netsh int ip reset all`
- Explanation: Rewrites registry keys related to TCP/IP, resetting all network configurations.
- Sample output:
Resetting Global, OK!
Resetting Interface, OK!
Restart the computer to complete this action.
- ⚠️ Important Notes:
- Requires Administrator privileges
- Requires computer restart to take effect
- This will reset all network adapters to default settings
- You may need to reconfigure static IPs, DNS servers, and other custom network settings
- You can write a log of the reset to a file, e.g. `netsh int ip reset resetlog.txt`
- Alternative commands:
  - `netsh winsock reset` → Reset Winsock catalog
  - `netsh int ipv4 reset` → Reset IPv4 settings only
  - `netsh int ipv6 reset` → Reset IPv6 settings only
23.4 Renew IP address
ipconfig /renew → Request a new IP address from DHCP server (Windows).
- Requests a new IP address from the DHCP server for all network adapters.
- Use case: After releasing an IP, or when the connection is restored after network issues.
- Example: `ipconfig /renew`
- Explanation: Contacts the DHCP server and requests a new IP address assignment.
- Sample output:
Windows IP Configuration
Ethernet adapter Ethernet:
Connection-specific DNS Suffix . : home
IPv4 Address. . . . . . . . . . . : 192.168.1.100
Subnet Mask . . . . . . . . . . . : 255.255.255.0
Default Gateway . . . . . . . . . : 192.168.1.1
- Common workflow:
ipconfig /release
ipconfig /flushdns
ipconfig /renew
This sequence releases the old IP, clears the DNS cache, and gets a fresh IP address.
- Linux equivalent: `sudo dhclient` or `sudo dhcpcd`
23.5 Complete network reset sequence (Windows)
Comprehensive network troubleshooting workflow:
# Step 1: Release current IP
ipconfig /release
# Step 2: Flush DNS cache
ipconfig /flushdns
# Step 3: Reset TCP/IP stack
netsh int ip reset all
# Step 4: Reset Winsock catalog
netsh winsock reset
# Step 5: Renew IP address
ipconfig /renew
# Step 6: Restart computer (required after reset commands)
shutdown /r /t 0
When to use this sequence:
- Internet connection problems after malware removal
- Cannot connect to any websites
- Network adapter showing "Limited connectivity"
- DNS errors persist after normal troubleshooting
- After changing ISP or router
- Unusual network behavior or frequent disconnections
⚠️ Warning: This completely resets network configuration. Document any custom settings (static IPs, custom DNS servers) before proceeding.
24. System Management and Shutdown Commands
While this guide focuses primarily on Unix/Linux commands, this section provides a comprehensive reference for system management operations across both Windows and Unix/Linux platforms, especially useful for developers working in mixed environments.
24.1 System Restart (Reboot)
Windows
shutdown /r /t 0 → Restart the computer immediately without any delay.
- Explanation: The `shutdown` command initiates a system shutdown or restart. `/r` specifies a restart (reboot) instead of shutdown, and `/t 0` sets the timeout to 0 seconds, meaning immediate action.
- What is /r?: The `/r` flag tells Windows to restart the system after shutdown, rather than just powering off.
- Sample output: (No output in the command prompt; the system restarts immediately)
Other restart options:
- `shutdown /r /t 60` → Restart after 60 seconds with a warning.
  - Explanation: `/t` followed by seconds delays the action, giving users time to save their work.
  - Example: `shutdown /r /t 300` → Restart after 5 minutes (300 seconds)
- `shutdown /r /f /t 0` → Force restart immediately, closing all applications.
  - Explanation: `/f` forces running applications to close without saving prompts.
  - ⚠️ Warning: Unsaved work will be lost.
Linux/macOS
sudo reboot → Restart the system immediately.
- Explanation: Simple and straightforward reboot command.
- Sample output: (System restarts immediately)
sudo shutdown -r now → Restart immediately.
- Explanation: Alternative syntax using shutdown command with restart flag.
- Sample output: (System restarts immediately)
Other restart options:
- `sudo shutdown -r +5` → Restart after 5 minutes.
  - Broadcasts a warning message to all logged-in users.
- `sudo shutdown -r 23:00` → Restart at 11:00 PM.
  - Schedule a restart for a specific time.
- `sudo systemctl reboot` → Restart using systemd (modern Linux distributions).
24.2 System Shutdown (Power Off)
Windows
shutdown /s /t 0 → Shutdown immediately (power off).
- Explanation: `/s` shuts down the system completely, `/t 0` means immediate action.
- Sample output: (No output; the system shuts down immediately)
Other shutdown options:
- `shutdown /s /t 60` → Shutdown after 60 seconds.
  - Gives users time to save work and close applications.
- `shutdown /s /f /t 0` → Force shutdown immediately.
  - Closes all applications without prompts.
Linux/macOS
sudo shutdown -h now → Shutdown immediately.
- Explanation: `-h` means halt (shut down and power off).
- Sample output: (The system shuts down immediately)
sudo poweroff → Shutdown the system.
- Explanation: Direct power off command.
Other shutdown options:
- `sudo shutdown -h +10` → Shutdown after 10 minutes.
- `sudo shutdown -h 22:30` → Shutdown at 10:30 PM.
- `sudo systemctl poweroff` → Shutdown using systemd.
- `sudo halt` → Stop all CPU functions (halt the system).
24.3 Hibernate
Windows
shutdown /h → Hibernate the system.
- Explanation: `/h` puts the computer into hibernation mode, saving the current state to disk and powering off.
- Sample output: (The system hibernates)
- Difference from sleep: Hibernation saves to disk and uses no power; sleep keeps state in RAM and uses minimal power.
Linux
sudo systemctl hibernate → Hibernate the system (requires swap partition).
- Note: Hibernation must be properly configured on Linux systems.
sudo pm-hibernate → Hibernate using pm-utils (older systems).
24.4 Log Off / Log Out
Windows
shutdown /l → Log off the current user.
- Explanation: `/l` logs out the current user session without shutting down the computer.
- Sample output: (The user is logged out and the login screen appears)
Linux/macOS
gnome-session-quit --logout --no-prompt → Log out (GNOME desktop).
pkill -KILL -u username → Force log out a specific user.
- ⚠️ Warning: This forcefully terminates all processes for the user.
exit or logout → Log out of current terminal session.
24.5 Abort Scheduled Shutdown
Windows
shutdown /a → Abort a scheduled shutdown.
- Explanation: Cancels a scheduled shutdown or restart, stopping it before it completes.
- Use case: If you initiated `shutdown /r /t 300` but changed your mind.
- Sample output:
The logoff is cancelled.
The scheduled shutdown has been cancelled.
Linux/macOS
sudo shutdown -c → Cancel a scheduled shutdown.
- Sample output:
Shutdown cancelled.
24.6 Force Close Applications
Windows
shutdown /r /f /t 0 → Restart and force close all applications.
- Explanation: `/f` forces applications to close without save prompts.
- ⚠️ Warning: Any unsaved work will be lost.
taskkill /F /IM process.exe → Force kill a process by name.
- Explanation: Equivalent to `kill -9` in Linux. `/F` means force, `/IM` specifies the image name (process name).
- Example: `taskkill /F /IM chrome.exe` to force close Chrome.
- Sample output:
SUCCESS: The process "chrome.exe" with PID 1234 has been terminated.
taskkill /F /PID 1234 → Force kill a process by PID.
- Example: `taskkill /F /PID 1234`
Linux/macOS
kill -9 PID → Force kill a process by PID.
- Example: `kill -9 1234`
killall -9 process_name → Force kill all processes by name.
- Example: `killall -9 chrome` to kill all Chrome processes.
pkill -9 process_name → Force kill processes matching pattern.
- Example: `pkill -9 node` to kill all Node.js processes.
24.7 Windows Services Management
Windows
net stop service_name → Stop a Windows service.
- Example: `net stop "Windows Update"` to stop the Windows Update service.
- Sample output:
The Windows Update service is stopping.
The Windows Update service was stopped successfully.
net start service_name → Start a Windows service.
- Example: `net start "Windows Update"`
sc query service_name → Query service status.
- Example: `sc query wuauserv` to check the Windows Update service status.
sc stop service_name → Stop a service (alternative method).
sc start service_name → Start a service (alternative method).
Linux
sudo systemctl stop service_name → Stop a service.
- Example: `sudo systemctl stop nginx` to stop the Nginx web server.
- Sample output:
Stopping nginx service...
sudo systemctl start service_name → Start a service.
- Example: `sudo systemctl start nginx`
sudo systemctl restart service_name → Restart a service.
- Example: `sudo systemctl restart nginx`
sudo systemctl status service_name → Check service status.
- Example: `sudo systemctl status nginx`
- Sample output:
● nginx.service - A high performance web server
Loaded: loaded (/lib/systemd/system/nginx.service; enabled)
Active: active (running) since Mon 2023-10-16 10:00:00 UTC
sudo systemctl enable service_name → Enable service to start on boot.
sudo systemctl disable service_name → Disable service from starting on boot.
Older Linux systems (using service command):
- `sudo service nginx stop` → Stop service.
- `sudo service nginx start` → Start service.
- `sudo service nginx restart` → Restart service.
- `sudo service nginx status` → Check status.
24.8 System File and Disk Management
Windows
sfc /scannow → Scan and repair system files.
- Explanation: System File Checker scans all protected system files and replaces corrupted files with cached copies.
- Note: Must be run as Administrator.
- Example: `sfc /scannow`
- Sample output:
Beginning system scan. This process will take some time.
Beginning verification phase of system scan.
Verification 100% complete.
Windows Resource Protection found corrupt files and successfully repaired them.
- Use case: Fix corrupted Windows system files after malware removal or system crashes.
chkdsk /f → Check disk for errors and fix them.
- Explanation: Checks the file system and file system metadata of a volume for logical and physical errors. `/f` fixes errors on the disk.
- Example: `chkdsk C: /f` to check and fix the C: drive.
- Sample output:
The type of the file system is NTFS.
Chkdsk cannot run because the volume is in use by another process.
Would you like to schedule this volume to be checked the next time the system restarts? (Y/N) - Note: Usually requires restart if checking the system drive.
chkdsk /r → Check disk and recover readable information from bad sectors.
- Explanation: `/r` locates bad sectors and recovers readable information. Implies `/f`.
- Example: `chkdsk D: /r`
DISM /Online /Cleanup-Image /RestoreHealth → Repair Windows image.
- Explanation: Deployment Image Servicing and Management tool repairs Windows component store corruption.
- Use case: Run this before `sfc /scannow` if system files are severely corrupted.
Linux
sudo fsck /dev/sda1 → Check and repair filesystem.
- Explanation: File System Consistency Check. Equivalent to `chkdsk` in Windows.
- ⚠️ Warning: The filesystem must be unmounted, or the system must be in single-user mode.
- Example: `sudo fsck /dev/sda1` to check partition sda1.
sudo fsck -y /dev/sda1 → Automatically repair without prompts.
- Explanation: `-y` answers "yes" to all prompts automatically.
sudo badblocks -v /dev/sda → Scan for bad blocks on disk.
- Explanation: Tests disk for bad sectors.
- Example: `sudo badblocks -v /dev/sda`
sudo e2fsck -f /dev/sda1 → Force check ext2/ext3/ext4 filesystem.
- Explanation: Specifically for ext filesystems. `-f` forces a check even if the filesystem appears clean.
sudo xfs_repair /dev/sda1 → Repair XFS filesystem.
- Explanation: For XFS filesystems specifically.
24.9 System Information
Windows
systeminfo → Display detailed system configuration information.
- Sample output:
Host Name: DESKTOP-ABC123
OS Name: Microsoft Windows 11 Pro
OS Version: 10.0.22000 Build 22000
System Manufacturer: Dell Inc.
System Model: XPS 15 9500
Processor(s): 1 Processor(s) Installed.
[01]: Intel64 Family 6 Model 165
Total Physical Memory: 16,384 MB
wmic cpu get name → Get CPU information.
wmic memorychip get capacity → Get RAM information.
Linux/macOS
uname -a → Display all system information.
- Sample output:
Linux hostname 5.15.0-58-generic #64-Ubuntu SMP x86_64 GNU/Linux
lsb_release -a → Display distribution information (Linux).
cat /proc/cpuinfo → Display CPU information.
free -h → Display memory usage.
df -h → Display disk space usage.
lscpu → Display detailed CPU architecture information.
lspci → List all PCI devices.
lsusb → List all USB devices.
hostnamectl → Display system hostname and related information.
24.10 Quick Reference Table
| Action | Windows | Linux/macOS |
|---|---|---|
| Restart immediately | shutdown /r /t 0 | sudo reboot |
| Restart after delay | shutdown /r /t 300 | sudo shutdown -r +5 |
| Shutdown immediately | shutdown /s /t 0 | sudo shutdown -h now |
| Hibernate | shutdown /h | sudo systemctl hibernate |
| Log off | shutdown /l | logout or exit |
| Cancel shutdown | shutdown /a | sudo shutdown -c |
| Force kill process | taskkill /F /IM app.exe | kill -9 PID |
| Stop service | net stop service | sudo systemctl stop service |
| Start service | net start service | sudo systemctl start service |
| Check disk | chkdsk /f | sudo fsck /dev/sda1 |
| Repair system files | sfc /scannow | N/A (distro-specific) |
| System info | systeminfo | uname -a |
Quick Reference Summary
Most Common Commands by Category
File Operations: ls, cat, head, tail, cp, mv, rm, touch
Text Processing: grep, sed, awk, cut, tr, sort, uniq, wc
Pipes & Chaining: |, >, >>, <, 2>, &>
Search: find, grep -r, locate
Archives: tar, gzip, zip, unzip, rsync
Version Control: git status, git log, git diff, git stash
Package Management: npm install, npm outdated, npm audit, npx
Process Management: ps, kill, jobs, bg, fg, top, lsof, Ctrl+C, Ctrl+Z
Monitoring: tail -f, watch, time, df, du
Data Processing: jq, awk, sort | uniq -c
Network (Windows): ipconfig, netsh, curl
System Management: shutdown, reboot, systemctl, service, taskkill, kill
Tips and Best Practices
- Always preview destructive operations: Use `-n` or `--dry-run` flags when available
- Use aliases: Create shortcuts for frequently used commands in your `.bashrc` or `.zshrc`
- Pipe commands: Combine simple commands with `|` for powerful workflows
- Learn regex: Many commands (grep, sed, awk) are much more powerful with regex
- Use tab completion: Most shells support tab completion for commands and file paths
- Read man pages: Use `man command` to view detailed documentation
- Be careful with `rm -rf`: Double-check paths before executing destructive commands
- Use version control: Commit changes before running bulk file operations
- Test on sample data: Try commands on a small subset before running on all files
- Chain commands safely: Use `&&` to run the next command only if the previous one succeeds
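The difference between `&&` and `||` is easiest to see with a couple of throwaway commands (a minimal sketch):

```shell
# && runs the next command only if the previous one succeeded (exit status 0).
mkdir -p demo-dir && echo "created demo-dir"

# || is the opposite: the next command runs only if the previous one failed.
ls no-such-file 2>/dev/null || echo "ls failed, fallback ran"

# Combined, they make a compact success/failure report.
grep -q root /etc/passwd && echo "found root" || echo "no root entry"
```

One caveat: in `a && b || c`, the fallback `c` also runs if `b` fails, so prefer a full `if` statement when that distinction matters.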
Total Commands Covered: 120+
Platform Coverage:
- Unix/Linux/macOS: 90+ commands
- Windows-specific: 30+ commands (with Unix/Linux equivalents noted)
- Cross-platform system management commands included
Practice these commands to master the command line and become more efficient in your daily development tasks!
