Advanced Command Line Reference Guide for Developers

· 70 min read
Anand Raja
Senior Software Engineer

Mastering advanced shell commands can streamline your workflow when managing projects, inspecting files, and handling builds. This guide groups related commands by functionality, providing usage examples and explanations.


Table of Contents

  1. File Listing and Filtering
  2. Viewing File Contents
  3. File Creation and Basic Manipulation
  4. File Copying, Moving, and Deletion
  5. Text Replacement and Editing
  6. Searching Files and Content
  7. Advanced Text Processing
  8. Understanding Pipes and Command Chaining
  9. File Permissions and Ownership
  10. File Comparison and Differences
  11. File Compression and Archives
  12. File Linking
  13. NPM Package Management
  14. Git Version Control
  15. Process Management
  16. File Cleanup and Disk Usage
  17. Code Analysis and Documentation
  18. Monitoring and Debugging
  19. JSON and Data Processing
  20. Directory Visualization
  21. Build Commands
  22. Troubleshooting Common Issues
  23. Network and DNS Troubleshooting (Windows)
  24. System Management and Shutdown Commands

1. File Listing and Filtering

1.1 List all files with filtering

ls -la | grep pattern → List all files (including hidden) in long format and filter for lines containing the pattern.

  • Useful for quickly locating configuration or content-related files in a directory.
  • Note: grep matches the pattern anywhere in the line: in filenames, permissions, dates, and so on.

Basic example:

ls -la src/ | grep config
  • Explanation: Lists all files and directories in the src/ folder with detailed information (permissions, owner, size, date), then filters the output to show only lines that contain "config".
  • Sample output:
    -rwxrwxrwx 1 user user 2043 Oct  9 21:37 content.config.ts

Example showing both files and folders:

ls -la src/ | grep content
  • Sample output:
    drwxrwxrwx 1 user user  512 Oct 10 11:36 content
    -rwxrwxrwx 1 user user 4089 Oct 10 11:39 content.config.ts

1.2 Understanding ls -la output format

To determine if a matching line is a folder or a file, examine the very first character of the output line from ls -la:

  • d at the start → Directory (folder)

    • Example: drwxrwxrwx 1 user user 512 Oct 10 11:36 content
    • The d indicates this is a directory named "content"
  • - at the start → Regular file

    • Example: -rwxrwxrwx 1 user user 4089 Oct 10 11:39 content.config.ts
    • The - indicates this is a regular file named "content.config.ts"
  • Other first characters:

    • l → Symbolic link
    • b → Block device
    • c → Character device
    • p → Named pipe
    • s → Socket

Output format breakdown:

-rwxrwxrwx 1 user user 4089 Oct 10 11:39 content.config.ts

  • - → File type (first character)
  • rwxrwxrwx → Permissions
  • 1 → Number of hard links
  • user → Owner
  • user → Group owner
  • 4089 → File size (bytes)
  • Oct 10 11:39 → Modification date/time
  • content.config.ts → Filename
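The first-character test above can also be scripted. A minimal sketch using awk (a plain `ls` parse for illustration only; filenames containing spaces would need extra care, since $NF captures just the last whitespace-separated field):

```shell
# Print "dir", "file", or "other" next to each entry name,
# based on the first character of the mode string
ls -la | awk 'NR > 1 {
  t = substr($1, 1, 1)                              # file-type character
  kind = (t == "d") ? "dir" : (t == "-") ? "file" : "other"
  print kind, $NF                                   # $NF = last field (the name)
}'
```

NR > 1 skips the "total" line that ls -la prints first.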

1.3 Filter for only directories (folders)

ls -la | grep '^d' → Show only directories.

  • The ^d pattern means "lines that begin with 'd'"
  • ^ is a regex anchor meaning "start of line"

Example: Filter for directories containing "content"

ls -la src/ | grep '^d' | grep 'content'

Explanation:

  1. ls -la src/ - List all items in long format
  2. grep '^d' - Filter for all lines that begin with the character d (directories only)
  3. The output of the first grep is then piped (|) to the second grep
  4. Second grep 'content' - Filters for lines that also contain the word "content"

Sample output:

drwxrwxrwx 1 user user  512 Oct 10 11:36 content
drwxrwxrwx 1 user user 256 Oct 12 09:15 content-backup

Alternative (more concise):

ls -la src/ | grep '^d.*content'
  • This uses a single grep with regex: ^d.*content means "start with 'd', followed by any characters, then 'content'"

1.4 Filter for only regular files

Method 1: Using grep '^-' → Show only regular files.

ls -la src/ | grep '^-' | grep 'content'

Explanation:

  1. ls -la src/ - List all items in long format
  2. grep '^-' - Filter for lines beginning with '-' (regular files only)
  3. grep 'content' - Further filter for lines containing "content"

Sample output:

-rwxrwxrwx 1 user user 4089 Oct 10 11:39 content.config.ts
-rwxrwxrwx 1 user user 2156 Oct 11 14:22 content.types.ts

Method 2: Using grep -v '^d' → Show everything EXCEPT directories.

ls -la src/ | grep -v '^d' | grep 'content'

Explanation:

  • grep -v '^d' - This command inverts the search (-v flag)
  • Shows all lines that do NOT start with 'd'
  • This will show regular files, symlinks, and other file types (but not directories)

Sample output:

-rwxrwxrwx 1 user user 4089 Oct 10 11:39 content.config.ts
-rwxrwxrwx 1 user user 2156 Oct 11 14:22 content.types.ts
lrwxrwxrwx 1 user user 15 Oct 12 10:00 content-link -> ../content

Comparison of methods:

  • grep '^-' → Only shows regular files (excludes symlinks, devices, etc.)
  • grep -v '^d' → Shows everything except directories (includes symlinks, devices, etc.)

1.5 Advanced filtering combinations

Find only executable files (owner-execute bit set):

ls -la | grep '^-..x'

  • The pattern matches lines starting with - (regular file), any two characters, then x in the owner-execute position. A plain grep 'x' would also match any filename containing the letter x.

Find directories modified today:

ls -la | grep '^d' | grep "$(date '+%b %e')"

  • Note: %e pads single-digit days with a space (e.g. "Oct  9"), matching ls's default date column; %d would zero-pad the day and miss the 1st through the 9th.

Find files of 1 KB or larger (human-readable sizes):

ls -lah | grep '^-' | awk '$5 ~ /K|M|G/ {print $0}'

  • With -h, sizes of 1K and above carry a K, M, or G suffix, so the awk pattern keeps only those lines. ({print $0} is awk's default action and could be omitted.)

Count number of directories:

ls -la | grep '^d' | wc -l

Count number of files:

ls -la | grep '^-' | wc -l
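Counting by parsing ls output works fine interactively, but unusual filenames can skew the numbers. A more robust sketch uses find directly (the -mindepth/-maxdepth options are GNU and BSD extensions, though widely available):

```shell
# Count immediate subdirectories of the current directory
find . -mindepth 1 -maxdepth 1 -type d | wc -l

# Count regular files in the current directory only
find . -mindepth 1 -maxdepth 1 -type f | wc -l
```

-mindepth 1 excludes the starting directory . itself from the count.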

2. Viewing File Contents

Inspect file contents without opening them in an editor.

2.1 Display entire file

cat file → Display entire file contents.

  • Example: cat package.json to view the entire file.
    • Explanation: Prints all contents of package.json to the terminal.
    • Sample output: (Entire file contents displayed)

2.2 View first lines of a file

head -n <number> file → Display the first n lines of a file.

  • Default is 10 lines if number not specified.
  • Ideal for previewing imports, schemas, or headers in code files.
  • Example: head -5 content.config.ts to see initial lines of a TypeScript config.
    • Explanation: Shows the first 5 lines of content.config.ts, helping to quickly check the file's starting content like imports or definitions.
    • Sample output:
      import {defineCollection, z} from 'astro:content';

      const blog = defineCollection({
        type: 'content',
        schema: z.object({

Using pipes with head:

cat file | head -n <number> → Pipe file contents to head command.

  • Example: cat src/content.config.ts | head -10 to view first 10 lines.

    • Explanation: Reads the entire file with cat and pipes (|) the output to head, which then displays only the first 10 lines.
    • Sample output:
      import {defineCollection, z} from 'astro:content';

      const blog = defineCollection({
        type: 'content',
        schema: z.object({
          title: z.string(),
          description: z.string().optional().nullable(),
          date: z.date(),
          tags: z.array(z.string()).or(z.string()).optional().nullable(),
          category: z.array(z.string()).or(z.string()).default('uncategorized').nullable(),
  • Note: head file is more efficient than cat file | head because head can read the file directly without loading the entire file into memory first. However, cat | head is useful when chaining multiple commands together.

2.3 View last lines of a file

tail -n <number> file → Display the last n lines of a file.

  • Useful for checking recent log entries or end of files.
  • Example: tail -20 error.log to see the most recent errors.
    • Explanation: Shows the last 20 lines of the log file.
    • Sample output: (Last 20 lines of error.log)

Using pipes with tail:

  • Example: cat package.json | tail -5 to view last 5 lines.
    • Explanation: Pipes file contents to tail which shows only the last 5 lines.

2.4 Follow file in real-time

tail -f file → Follow a file in real-time (watch for new additions).

  • Essential for monitoring log files as they're being written.
  • Press Ctrl+C to stop watching.
  • Example: tail -f /var/log/nginx/access.log to monitor web server access.
    • Explanation: Continuously displays new lines as they're added to the log file.
    • Sample output: (New log entries appear in real-time)

Using pipes with grep for filtered monitoring:

  • Example: tail -f server.log | grep ERROR to watch only error messages.
    • Explanation: Combines real-time file following with pattern matching to show only lines containing "ERROR".

2.5 View file with navigation

less file → View file with navigation (scrollable, searchable).

  • Use arrow keys or Page Up/Down to navigate, / to search, q to quit.
  • Better than cat for large files.
  • Example: less large-log.txt to browse through a big file.
    • Explanation: Opens file in a pager that allows scrolling and searching.
    • Sample output: (Interactive viewer opens)

2.6 View file page by page

more file → View file page by page (simpler than less).

  • Press Space for next page, q to quit.
  • Example: more documentation.txt to read documentation.
    • Explanation: Displays file one screen at a time.
    • Sample output: (File displayed page by page)

3. File Creation and Basic Manipulation

3.1 Create or overwrite file from stdin

cat > file → Create or overwrite a file with input from stdin (end with Ctrl+D).

  • Creates a new file or completely replaces existing file content.
  • Example: cat > newfile.txt then type content and press Ctrl+D.
    • Explanation: Redirects standard input to newfile.txt, allowing you to type multi-line text until Ctrl+D is pressed.
    • Sample workflow:
      $ cat > newfile.txt
      This is my first line.
      This is my second line.
      [Press Ctrl+D to save]
    • Sample output: (Creates newfile.txt with the entered content)

3.2 Append to file from stdin

cat >> file → Append content to an existing file from stdin.

  • Adds new content to the end without overwriting existing content.
  • Example: cat >> existing.txt then type content and press Ctrl+D.
    • Explanation: Appends standard input to existing.txt without replacing its existing content.
    • Sample workflow:
      $ cat >> existing.txt
      This is my first line of sample text.
      This is the second line.
      [Press Ctrl+D to save]
    • Sample output: (Appends new lines to existing.txt)
    • Note: If existing.txt does not exist, it will be created.

3.3 Create or overwrite file with single line

echo "text" > file → Create or overwrite a file with a single line of text.

  • Uses the output redirection operator (>) to write text to file.

  • Example: echo "Hello World" > hello.txt

    • Explanation: Writes "Hello World" to hello.txt, creating or overwriting the file.
    • Sample output: (Creates hello.txt with "Hello World")
  • Example: echo "This is the text to write." > filename.txt

    • Explanation: Uses echo to output the string and redirects that output to filename.txt, creating or overwriting the file with the specified text.

3.4 Append single line to file

echo "text" >> file → Append a line of text to a file.

  • Uses the append redirection operator (>>).

  • Example: echo "New entry" >> log.txt

    • Explanation: Appends "New entry" as a new line to log.txt.
    • Sample output: (Adds "New entry" to log.txt)
  • Example: echo "This text will be appended." >> filename.txt

    • Explanation: Appends the specified text to the end of filename.txt without overwriting existing content.

3.5 Append formatted text

printf "text\n" >> file → Append formatted text to a file (more control than echo).

  • Example: printf "Name: %s\nAge: %d\n" "John" 25 >> user.txt
    • Explanation: Formats the string with placeholders and appends it to user.txt.
    • Sample output: (Appends "Name: John\nAge: 25\n" to user.txt)

3.6 Create empty file

touch file → Create an empty file or update its timestamp.

  • Example: touch empty.log
    • Explanation: Creates empty.log if it doesn't exist, or updates its modification time.
    • Sample output: (Creates empty.log; no output)

3.7 Empty existing file

truncate -s 0 file → Empty an existing file (set size to 0).

  • Example: truncate -s 0 debug.log
    • Explanation: Sets the size of debug.log to 0, effectively clearing its contents.
    • Sample output: (Clears debug.log; no output)
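Shell redirection alone can do the same job where truncate isn't available; a small sketch (debug.log is an illustrative filename):

```shell
# ':' is the shell's no-op builtin; redirecting its (empty) output
# truncates the file to zero bytes, creating it if necessary
: > debug.log

# In bash and zsh the command can be omitted entirely
> debug.log
```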

4. File Copying, Moving, and Deletion

4.1 Copy files

cp source destination → Copy a file or directory.

  • Use -r flag for recursive copying of directories.

  • Use -p to preserve file attributes (permissions, timestamps).

  • Example: cp config.json config.backup.json to create a backup.

    • Explanation: Creates a copy of config.json named config.backup.json.
    • Sample output: (File is copied; no output unless an error)
  • Example: cp -r src/ backup/ to copy entire directory.

    • Explanation: Recursively copies the src/ directory and all its contents to backup/.
    • Sample output: (Directory copied; no output unless an error)

4.2 Copy without overwriting

cp -n source destination → Copy only if destination doesn't exist (no-clobber).

  • Example: cp -n template.html index.html
    • Explanation: Copies template.html to index.html only if index.html doesn't already exist.
    • Sample output: (File copied if destination doesn't exist; no output)

4.3 Move or rename files

mv source destination → Move files to a directory or rename a file.

  • Example: mv old-component.js new-component.js to rename.

    • Explanation: Renames old-component.js to new-component.js in the same directory.
    • Sample output: (File renamed; no output unless an error)
  • Example: mv alipay.svg paypal.svg wechat.svg images/ to move multiple files.

    • Explanation: Moves the listed image files into the images/ directory.
    • Sample output: (Files moved; no output unless an error)

4.4 Delete files

rm file → Remove/delete a file.

  • Use -f to force deletion without prompts.

  • Use -r or -R for recursive directory deletion.

  • Example: rm old-file.txt to delete a single file.

    • Explanation: Permanently deletes old-file.txt from the filesystem.
    • Sample output: (File deleted; no output unless an error)
  • Example: rm -rf temp/ to force delete a directory and all contents.

    • Explanation: Recursively deletes the temp/ directory and everything inside without confirmation prompts.
    • ⚠️ Warning: Use with extreme caution - this cannot be undone!
    • Sample output: (Directory deleted; no output)

4.5 Delete with confirmation

rm -i file → Interactive deletion with confirmation prompt.

  • Example: rm -i important.txt
    • Explanation: Prompts "rm: remove regular file 'important.txt'?" before deletion.
    • Sample output:
      rm: remove regular file 'important.txt'? y

5. Text Replacement and Editing

Perform in-place string substitutions.

5.1 Basic find and replace

sed -i 's/old/new/g' file → Replace all occurrences of 'old' with 'new' in the file.

  • Note: On macOS, use sed -i '' 's/old/new/g' file (with empty string after -i)
  • The g flag means "global" (all occurrences on each line); without it, only first occurrence per line is replaced.
  • Use different delimiters (like | or #) when pattern contains slashes.

Understanding sed syntax:

  • s/old/new/g breakdown:
    • s = substitute command
    • /old/ = pattern to find (NO SPACE needed before 'old')
    • /new/ = replacement text (NO SPACE needed before 'new')
    • /g = global flag (replace all occurrences)
  • Spaces: Spaces are NOT needed between /old and /new. The / acts as the delimiter.
  • Example with spaces: sed -i 's/old text/new text/g' file - here spaces ARE part of the search/replace strings themselves

Basic examples:

  • Example 1: sed -i 's|/spinner\.gif|/images/spinner.gif|g' blog/display-pictures.md to update image paths.

    • Explanation: Replaces every instance of /spinner.gif with /images/spinner.gif. Note the | delimiter used instead of / to avoid escaping slashes in the path.
    • Sample output: (File is modified in-place; no output unless an error)
  • Example 2: sed -i 's/const /let /g' src/app.js to replace const with let.

    • Explanation: Changes all const declarations to let throughout the file. Note the space after const and let is part of the pattern.
    • Sample output: (File modified in-place)
  • Example 3: sed -i 's/http:/https:/g' config.json to update protocol.

    • Explanation: Replaces all http: with https: in the configuration file.
    • Note: No backslash is needed before : because it's not a special character in sed.
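Before running any in-place -i substitution, it can help to preview the pattern on stdin, where sed simply prints the transformed text without touching a file:

```shell
# Dry-run the protocol swap from Example 3 on sample text
echo "fetch http://example.com then http://api.example.com" \
  | sed 's/http:/https:/g'
# → fetch https://example.com then https://api.example.com
```

Once the output looks right, the same expression can be applied with -i.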

5.2 Delete lines matching pattern

sed -i '/pattern/d' file → Delete lines matching a pattern.

  • Example: sed -i '/console\.log/d' src/app.js to remove all console.log statements.
    • Explanation: Deletes every line containing console.log from the file. The . is escaped with \. because . is a special regex character.
    • Sample output: (Lines removed; no output)

5.3 Print specific line range

sed -n 'start,endp' file → Print a specific line range from a file.

  • Example: sed -n '10,20p' large-file.txt to view lines 10-20.
    • Explanation: Displays only lines 10 through 20 from the file (useful for large files).
    • Sample output: (Lines 10-20 displayed)

Using with pipes:

  • Example: cat large-file.txt | sed -n '10,20p' - same result using pipe.

5.4 Case-insensitive replacement

sed -i 's/old/new/gI' file → Case-insensitive replacement.

  • The I flag makes the pattern matching case-insensitive.
  • Example: sed -i 's/todo/FIXME/gI' notes.txt to replace TODO, todo, ToDo, etc.
    • Explanation: Replaces "todo" in any case combination with "FIXME".
    • Sample output: (File modified; no output)

5.5 Perl-based replacement

perl -pi -e 's/old/new/g' file → Alternative to sed with better regex support.

  • Example: perl -pi -e 's/\bcolor\b/colour/g' *.txt to replace "color" with "colour" (word boundaries).
    • Explanation: Uses word boundaries (\b) to match only the whole word "color", not "colorful".
    • Sample output: (Files modified; no output)

5.6 Viewing files with special characters (cat -A)

cat -A file → Display file contents with special characters visible (tabs, spaces, line endings).

  • This flag shows hidden characters, which is useful for debugging file formatting issues.
  • Flags breakdown:
    • -A (equivalent to -vET) → Shows all non-printing characters:
      • $ = end of line (newline character)
      • ^I = tab character
      • ^M = carriage return (often from Windows line endings)
      • Spaces are shown as spaces (no special symbol)
    • -n → Number all output lines (including blank lines)
    • -b → Number only non-blank lines
    • -s → Suppress repeated blank lines (shows only one blank line)
    • -E → Display $ at end of each line only
    • -T → Display tab characters as ^I
    • -v → Display non-printing characters using ^ and M- notations

Understanding the output symbols:

  • $ = newline (line ending)
  • ^I = tab character
  • ^M = carriage return (Windows-style \r\n line endings)
  • Spaces remain as spaces (not shown with special symbols)
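You can verify these symbols without touching any file by feeding printf output straight into cat -A (on macOS/BSD, where cat has no -A, cat -vet is the equivalent):

```shell
# A tab, then a Windows-style \r\n line ending, made visible
printf 'name\tvalue\r\n' | cat -A
# → name^Ivalue^M$
```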

Examples:

  • Example 1: cat -A src/app/components/todo.component.ts | sed -n '141,144p' to view lines 141-144 with special characters.

    • Explanation: Displays lines 141-144 from the TypeScript file, showing tabs as ^I and line endings as $. Useful for detecting formatting inconsistencies.
    • Sample output:
      ^Iconstructor(private todoService: TodoService) {$
      ^I^Ithis.todoService.getTodos().subscribe(todos => {$
      ^I^I^Ithis.todos = todos;$
      ^I^I});$
      • Each ^I represents a tab character used for indentation
      • Each $ shows the line ending
      • This reveals that the file uses tabs for indentation
  • Example 2: cat -n src/app/app.module.ts to display file with line numbers.

    • Explanation: Shows the entire file with line numbers on the left, useful for referencing specific lines during debugging or code review.
    • Sample output:
           1  import { NgModule } from '@angular/core';
      2 import { BrowserModule } from '@angular/platform-browser';
      3
      4 @NgModule({
      5 declarations: [AppComponent],
      6 imports: [BrowserModule],
      7 })
      8 export class AppModule { }
  • Example 3: cat -A config.json to check for Windows line endings in JSON file.

    • Explanation: If you see ^M$ instead of just $, the file has Windows line endings (\r\n). This can cause issues in Unix-like systems.
    • Sample output with Windows line endings:
      {^M$
      ^I"name": "my-app",^M$
      ^I"version": "1.0.0"^M$
      }^M$
    • Sample output with Unix line endings (correct):
      {$
      ^I"name": "my-app",$
      ^I"version": "1.0.0"$
      }$
  • Example 4: cat -A src/styles.css | grep '\^I' to find tab characters in a CSS file.

    • Explanation: cat -A renders each tab as the two-character sequence ^I, so grepping for the literal text ^I (with the caret escaped as \^) lists the lines that contain tabs. Useful for enforcing consistent indentation (spaces vs tabs).
    • Sample output: (Lines containing tab characters, shown as ^I)
  • Example 5: cat -ns large-file.txt to display the file with numbered lines and squeezed blanks.

    • Explanation: Combines -n (number all lines) with -s (collapse runs of consecutive blank lines into one). To number only non-blank lines instead, use cat -bs; when -b is present it overrides -n. Useful for large files with many blank lines.
    • Sample output: (More compact display of large files)

Fixing line ending issues:

  • Convert Windows to Unix line endings: sed -i 's/\r$//' file.txt (removes carriage return)
  • Verify fix: cat -A file.txt | tail -5 (check last 5 lines for proper line endings)

6. Searching Files and Content

Locate files by name and search for specific strings within them.

6.1 Find files by name

find . -name "pattern" → Find files by name pattern.

  • . represents the current directory. It tells find to start searching from the current working directory and include all subdirectories.

    • You can replace . with any path like /home/user/project or ~/Documents to search from a different location.
    • Using .. would search from the parent directory.
    • Using / would search from the root directory (entire filesystem).
  • Use -type f for files only, -type d for directories only.

  • Use -iname for case-insensitive search.

  • Example: find . -name "*.json" to find all JSON files.

    • Explanation: Recursively searches for files ending in .json from current directory (. = current directory).
    • Sample output:
      ./package.json
      ./config/database.json
      ./src/settings.json
  • Example: find /var/log -name "*.log" -mtime -7 to find logs modified in last 7 days.

    • Explanation: Finds log files modified within the past week.
    • Sample output: (List of recently modified log files)

find . -type f | sort → Find all files in current directory and subdirectories, then sort them alphabetically.

  • Explanation: Searches for all files (not directories) recursively from the current location and pipes the output to sort for alphabetical ordering.
  • Sample output:
    ./README.md
    ./package.json
    ./src/App.js
    ./src/components/Header.js
    ./src/index.js
    ./src/utils/helpers.js

find . -type f \( -name "*.md" -o -name "*.js" -o -name "*.jsx" \) | sort → Find all Markdown, JavaScript, and JSX files, then sort them.

  • Explanation: The escaped parentheses group the -o (OR) conditions so that -type f applies to all three extensions; without them, find's operator precedence would attach -type f to the *.md test only. The results are piped to sort for alphabetical ordering. Useful for listing specific file types in a project.
  • Sample output:
    ./README.md
    ./docs/guide.md
    ./src/App.js
    ./src/components/Button.jsx
    ./src/components/Header.js
    ./src/index.js

6.2 Find files containing pattern

find . -name "*.ext" | xargs grep -l "pattern" → Find files by extension and filter those containing a pattern.

  • xargs takes the list of files from find and passes them as arguments to grep. This is efficient for handling large numbers of files.
  • Example: find . \( -name "*.astro" -o -name "*.js" -o -name "*.ts" \) | xargs grep -l "import React" to find files importing React.
    • Sample output:
      src/pages/index.astro
      src/components/Header.js
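One caveat with this pipeline: xargs splits its input on whitespace, so filenames containing spaces break it. A sketch of the null-delimited variant (-print0 and -0 are GNU/BSD extensions, widely available):

```shell
# Null-delimited handoff: safe for names containing spaces or newlines
find . \( -name "*.astro" -o -name "*.js" -o -name "*.ts" \) -print0 \
  | xargs -0 grep -l "import React"
```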

grep -r "pattern" directory → Recursively search for a pattern in files within a directory.

  • Use -i for case-insensitive, -n for line numbers, -v for inverse match (lines NOT containing pattern).

  • Use -A n to show n lines after match, -B n for before, -C n for context (before and after).

  • Example: grep -r "load-mathjax.js" src/ to find script references.

    • Sample output:
      src/pages/blog.astro:    <script src="/load-mathjax.js"></script>
  • Example: grep -rn "TODO" src/ --exclude-dir=node_modules to find TODO comments with line numbers.

    • Sample output:
      src/app.js:42:// TODO: Refactor this function
      src/utils.js:18:// TODO: Add error handling

6.3a Search for multiple patterns with OR operator

grep -rn "pattern1\|pattern2\|pattern3" directory --include="*.ext" → Search recursively for multiple patterns using OR operator.

  • The \| (escaped pipe) acts as an OR operator in grep's basic regex syntax (a GNU grep feature; see the -E alternative below).

  • Use --include to filter by file extension.

  • Example: grep -rn "ERROR_CODE_404\|ERROR_CODE_500\|ERROR_CODE_503" /var/log/application/handlers/ --include="*.log" to find specific error codes in log files.

    • Explanation:
      1. grep -rn - recursive search with line numbers
      2. "ERROR_CODE_404\|ERROR_CODE_500\|ERROR_CODE_503" - searches for any of these three error codes
      3. /var/log/application/handlers/ - searches in this directory
      4. --include="*.log" - only searches files ending in .log
    • Sample output:
      /var/log/application/handlers/error_handler.log:142:ERROR_CODE_404: Resource not found at /api/users/123
      /var/log/application/handlers/error_handler.log:287:ERROR_CODE_500: Internal server error in database connection
      /var/log/application/handlers/api_handler.log:56:ERROR_CODE_503: Service temporarily unavailable
      /var/log/application/handlers/request_handler.log:991:ERROR_CODE_404: Endpoint /api/products/xyz not found
  • Example: grep -rn "useState\|useEffect\|useContext" src/components/ --include="*.jsx" to find React hooks usage.

    • Explanation: Searches for any of the three common React hooks in JSX component files.
    • Sample output:
      src/components/Header.jsx:5:import { useState, useEffect } from 'react';
      src/components/Auth.jsx:12: const [user, setUser] = useState(null);
      src/components/Theme.jsx:8: const theme = useContext(ThemeContext);
  • Example: grep -rn "api_key\|secret\|password" config/ --include="*.py" to find potential security issues in configuration files.

    • Explanation: Searches for sensitive credential keywords that should not be hardcoded.
    • Use case: Security audit to ensure no credentials are committed to repository.

Alternative syntax (Extended regex):

  • grep -Ern "pattern1|pattern2|pattern3" directory --include="*.ext" → Same as above but using -E flag for extended regex (no need to escape |).
  • Example: grep -Ern "ERROR_CODE_404|ERROR_CODE_500|ERROR_CODE_503" /var/log/application/ --include="*.log"
    • Explanation: -E enables extended regular expressions, so you can use | directly without backslash.
    • Same results as the basic syntax, but cleaner pattern definition.

6.4 List files containing pattern

grep -l "pattern" files → List filenames that contain the pattern (not the matching lines).

  • Example: grep -l "import React" src/**/*.js to find files importing React.
    • Explanation: Lists only the filenames that contain "import React". (The recursive ** glob requires shopt -s globstar in bash; zsh supports it by default.)
    • Sample output:
      src/App.js
      src/components/Header.js

6.5 Count matching lines

grep -c "pattern" file → Count how many lines match the pattern.

  • Example: grep -c "error" server.log to count error occurrences.
    • Explanation: Returns the number of lines containing "error".
    • Sample output:
      23

6.6 Count files containing pattern

grep -l "pattern" files | wc -l → Count how many files contain the pattern.

  • Combines grep -l (list matching files) with wc -l (count lines) to get total count.

  • Example: grep -l "created_date" src/models/model_builder_*.py | wc -l to count model files containing the "created_date" field.

    • Explanation:
      1. grep -l "created_date" - searches for files containing "created_date"
      2. src/models/model_builder_*.py - matches all Python files starting with "model_builder_"
      3. | wc -l - counts how many filenames were returned
    • Sample output:
      12
      • Meaning: 12 out of all model_builder files contain the "created_date" field
  • Example: grep -l "useState" src/components/*.jsx | wc -l to count React components using hooks.

    • Explanation: Counts how many JSX component files use the useState hook.
    • Sample output: 8 (8 components use useState)
  • Example: grep -l "TODO" src/**/*.ts | wc -l to count TypeScript files with TODO comments.

    • Explanation: Useful for tracking technical debt across your TypeScript codebase.
    • Sample output: 15 (15 TypeScript files contain TODO comments)

7. Advanced Text Processing

7.1 Extract columns with awk

awk '{print $column}' file → Extract and print specific columns from text.

  • Powerful for processing structured text data.

  • Example: ls -l | awk '{print $9}' to list only filenames.

    • Explanation: Takes the output of ls -l and prints only the 9th column (filename).
    • Sample output:
      config.json
      package.json
      README.md
  • Example: awk -F',' '{print $1,$3}' data.csv to extract columns 1 and 3 from CSV.

    • Explanation: Uses comma as field separator and prints first and third columns.
    • Sample output:
      John 25
      Jane 30

7.2 Extract fields with cut

cut -d'delimiter' -f field file → Extract specific fields from delimited text.

  • Example: cut -d':' -f1 /etc/passwd to extract usernames.

    • Explanation: Splits each line by : delimiter and extracts the first field.
    • Sample output:
      root
      daemon
      user
  • Example: echo "name,age,city" | cut -d',' -f2 to get the second field.

    • Explanation: Extracts "age" from the comma-separated string.
    • Sample output:
      age

7.3 Translate or delete characters

tr 'set1' 'set2' → Translate or delete characters from stdin.

  • Example: echo "hello" | tr 'a-z' 'A-Z' to convert to uppercase.

    • Explanation: Translates all lowercase letters to uppercase.
    • Sample output:
      HELLO
  • Example: cat file.txt | tr -d '\r' to remove carriage returns (Windows line endings).

    • Explanation: Deletes all \r characters, useful when converting Windows files to Unix format.
  • Example: echo "hello world" | tr -s ' ' to squeeze repeated spaces.

    • Explanation: Replaces multiple consecutive spaces with a single space.
    • Sample output:
      hello world

7.4 Sort lines

sort file → Sort lines of text alphabetically.

  • Use -n for numerical sort, -r for reverse order, -u for unique lines only.

  • Example: sort names.txt to sort a list alphabetically.

    • Sample output:
      Alice
      Bob
      Charlie
  • Example: ls -l | sort -k5 -n to sort files by size.

    • Explanation: Sorts ls -l output numerically by the 5th column (file size).

7.5 Remove duplicate lines

uniq file → Remove duplicate adjacent lines.

  • Usually used after sort to get truly unique lines.

  • Use -c to count occurrences, -d to show only duplicates.

  • Example: sort data.txt | uniq to get unique sorted lines.

    • Sample output:
      apple
      banana
      orange
  • Example: sort access.log | uniq -c | sort -rn to count and rank by frequency.

    • Explanation: Counts occurrences of each unique line and sorts by count (highest first).
    • Sample output:
      15 GET /api/users
      8 GET /api/posts
      3 POST /api/login

7.6 Count lines, words, characters

wc file → Count lines, words, and characters in a file.

  • Use -l for lines only, -w for words, -c for bytes, -m for characters.

  • Example: wc -l app.js to count lines of code.

    • Sample output:
      245 app.js
  • Example: wc -l src/app/components/viewer/viewer.component.ts to count lines in a specific component file.

    • Explanation: Displays the line count for the TypeScript component file at the specified path. Useful for tracking component size or complexity.
    • Sample output:
      387 src/app/components/viewer/viewer.component.ts
    • Use case: Check file size before refactoring, track code growth, or verify component complexity.
  • Example: find . -name "*.js" | xargs wc -l to count total lines in all JS files.

    • Sample output:
      245 ./app.js
      180 ./utils.js
      425 total

8. Understanding Pipes and Command Chaining

8.1 What is a Pipe?

The Pipe Operator | → Sends the output of one command as input to another command.

  • Allows you to chain multiple commands together
  • Each command processes the output from the previous command
  • Fundamental to Unix/Linux command-line philosophy: "do one thing well"

Basic syntax: command1 | command2 | command3

8.2 Common Pipe Usage Patterns

Pattern 1: View first/last lines of output

cat file | head -n → Display first n lines of a file.

  • Example: cat src/content.config.ts | head -10

    • Explanation:
      1. cat src/content.config.ts reads the entire file
      2. | pipes the output to the next command
      3. head -10 takes that output and shows only the first 10 lines
    • Sample output:
      import {defineCollection, z} from 'astro:content';

      const blog = defineCollection({
      type: 'content',
      schema: z.object({
      title: z.string(),
      description: z.string().optional().nullable(),
      date: z.date(),
      tags: z.array(z.string()).or(z.string()).optional().nullable(),
  • Direct vs Piped:

    • head -10 file - Direct (more efficient, reads only needed lines)
    • cat file | head -10 - Piped (reads entire file, useful for chaining)

command | tail -n → Display last n lines of output.

  • Example: cat package.json | tail -5
    • Shows the last 5 lines of package.json

Pattern 2: Filter output

command | grep pattern → Filter output to show only matching lines.

  • Example: ls -la | grep config

    • Explanation: Lists all files, then filters to show only lines containing "config"
    • Sample output:
      -rwxrwxrwx 1 user user 2043 Oct  9 21:37 content.config.ts
      -rwxrwxrwx 1 user user 1523 Oct 8 14:22 vite.config.js
  • Example: ps aux | grep node

    • Shows only Node.js processes from all running processes
  • Example: cat error.log | grep ERROR | head -20

    • Chained pipeline: get file → filter for errors → show first 20
  • Example: npm run start 2>&1 | grep -E "(Application bundle|compiled successfully|error)" | head -20

    • Explanation: Monitors development server startup and filters output for key information.
      1. npm run start - starts the development server
      2. 2>&1 - redirects stderr to stdout (captures both normal output and errors)
      3. grep -E "(pattern1|pattern2|pattern3)" - filters for bundle info, success messages, or errors using extended regex
      4. | head -20 - shows only first 20 matching lines to avoid output overflow
    • Use case: Quick verification that server started successfully, or catching build errors early without scrolling through verbose logs.
    • Sample output:
      ✓ Application bundle generation complete.
      ✓ Browser application bundle generation complete.
      ✓ compiled successfully.
      Local: http://localhost:4200/
      ** Angular Live Development Server is listening on localhost:4200
    • Variations:
      • React/Vite: npm start 2>&1 | grep -E "(compiled|Local:|Network:)" | head -15
      • Next.js: npm run dev 2>&1 | grep -E "(ready|started|compiled)" | head -10
      • Node.js/Express: npm start 2>&1 | grep -E "(listening|started|error)" | head -10

Pattern 3: Count and sort

command | wc -l → Count lines of output.

  • Example: ls -1 | wc -l to count files in directory

    • Explanation:
      1. ls -1 lists files (one per line)
      2. wc -l counts the number of lines
    • Sample output: 15 (meaning 15 files)
  • Example: cat app.js | grep "function" | wc -l to count functions

    • Counts how many lines contain the word "function"

ls -1 src/content/blog/*.md | head -10 → List Markdown files in blog directory and show first 10.

  • Explanation:
    1. ls -1 - lists files one per line
    2. src/content/blog/*.md - matches all .md files in the blog directory
    3. head -10 - displays only the first 10 results
  • Use case: Quickly preview the first 10 blog post files in your content directory
  • Sample output:
    src/content/blog/2024-01-15-getting-started.md
    src/content/blog/2024-02-03-react-hooks.md
    src/content/blog/2024-02-18-typescript-tips.md
    src/content/blog/2024-03-05-css-grid.md
    src/content/blog/2024-03-22-node-apis.md
    src/content/blog/2024-04-10-docker-basics.md
    src/content/blog/2024-04-28-git-workflow.md
    src/content/blog/2024-05-12-testing-jest.md
    src/content/blog/2024-06-01-web-performance.md
    src/content/blog/2024-06-20-security-best-practices.md

command | sort | uniq -c → Sort and count unique occurrences.

  • Example: cat access.log | cut -d' ' -f1 | sort | uniq -c | sort -rn
    • Explanation:
      1. cat access.log - read log file
      2. cut -d' ' -f1 - extract first field (IP addresses)
      3. sort - sort IPs alphabetically
      4. uniq -c - count unique IPs
      5. sort -rn - sort by count (highest first)
    • Use case: Find which IP addresses access your server most frequently
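The same pipeline can be tried without a real access.log by feeding a few sample lines from printf (the IP addresses are hypothetical):

```shell
# Count requests per IP address from inline sample data
printf '%s\n' \
  "10.0.0.1 GET /home" \
  "10.0.0.2 GET /about" \
  "10.0.0.1 GET /login" \
  | cut -d' ' -f1 | sort | uniq -c | sort -rn
# Top line:       2 10.0.0.1
```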

Pattern 4: Search and transform

command | sed 's/old/new/g' → Transform output inline.

  • Example: cat urls.txt | sed 's/http:/https:/g'
    • Converts all HTTP URLs to HTTPS in the output (doesn't modify file)

command | awk '{print $column}' → Extract specific columns.

  • Example: ls -l | awk '{print $9}' to list only filenames
    • Explanation:
      1. ls -l - long format listing
      2. awk '{print $9}' - print 9th column (filename)
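Beyond selecting columns, awk can aggregate across all lines of a pipeline; a minimal sketch that sums a column:

```shell
# Sum the numbers in the first column; the END block runs after all input is read
printf '%s\n' 10 20 30 | awk '{sum += $1} END {print sum}'
# → 60
```

The same pattern works on real output, e.g. `ls -l | awk '{total += $5} END {print total}'` to total file sizes.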

Pattern 5: Monitor and filter in real-time

tail -f file | grep pattern → Monitor log file for specific patterns.

  • Example: tail -f server.log | grep ERROR

    • Explanation:
      1. tail -f server.log - continuously read new lines from log
      2. grep ERROR - show only lines containing "ERROR"
    • Use case: Real-time error monitoring
  • Example: tail -f access.log | grep "404" | awk '{print $1, $7}'

    • Monitor 404 errors and show IP address and requested URL

Pattern 6: Process JSON data

command | jq '.field' → Parse and extract JSON fields.

  • Example: cat package.json | jq '.dependencies'

    • Extracts only the dependencies object from package.json
  • Example: npm list --json | jq '.dependencies | keys'

    • Lists all dependency names as an array
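A self-contained jq example, using inline JSON instead of a real package.json (requires jq to be installed; the dependency names are illustrative):

```shell
# Extract dependency names from a JSON object; -r prints raw strings
echo '{"dependencies":{"express":"^4.18.2","lodash":"^4.17.21"}}' \
  | jq -r '.dependencies | keys[]'
# → express
# → lodash
```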

8.3 Complex Pipeline Examples

Example 1: Find top 10 largest files

find . -type f -exec ls -lh {} \; | awk '{print $5, $9}' | sort -hr | head -10

Explanation:

  1. find . -type f - find all files
  2. -exec ls -lh {} \; - run ls -lh on each file found ({} is the filename placeholder)
  3. awk '{print $5, $9}' - extract size and filename
  4. sort -hr - sort by size (human-readable, reverse)
  5. head -10 - show top 10

Example 2: Count lines of code by file type

find . -name "*.js" | xargs wc -l | sort -n | tail -20

Explanation:

  1. find . -name "*.js" - find all JavaScript files
  2. xargs wc -l - count lines in each file
  3. sort -n - sort numerically
  4. tail -20 - show 20 largest files

Example 3: Extract and analyze TODO comments

grep -rn "TODO" src/ --exclude-dir=node_modules | cut -d':' -f1 | sort | uniq -c | sort -rn

Explanation:

  1. grep -rn "TODO" src/ - find all TODO comments with line numbers
  2. cut -d':' -f1 - extract just the filename
  3. sort - sort filenames
  4. uniq -c - count TODOs per file
  5. sort -rn - sort by count (most TODOs first)

Example 4: Batch file processing with pattern checking

for file in /home/user/project/backend/models/model_builder_*.py; do
  basename "$file" | sed 's/model_builder_//;s/.py//'
done | while read model_name; do
  if grep -q "created_date" "/home/user/project/backend/models/model_builder_${model_name}.py" 2>/dev/null; then
    echo "$model_name: HAS created_date"
  else
    echo "$model_name: NO created_date"
  fi
done | sort

Explanation: This complex pipeline processes multiple model files, extracts their base names, checks for a specific field, and reports the results sorted alphabetically.

Step-by-step breakdown:

  1. for file in /home/user/project/backend/models/model_builder_*.py; do

    • Loops through all Python files matching the pattern model_builder_*.py
    • Example matches: model_builder_user.py, model_builder_product.py, model_builder_order.py
  2. basename "$file" | sed 's/model_builder_//;s/.py//'

    • basename "$file" - extracts just the filename (removes directory path)
      • Example: /home/user/project/backend/models/model_builder_user.py → model_builder_user.py
    • sed 's/model_builder_//;s/.py//' - removes prefix and extension
      • First s/model_builder_// removes "model_builder_" prefix
      • Second s/.py// removes ".py" extension
      • Result: model_builder_user.py → user
  3. done | while read model_name; do

    • Pipes the cleaned names to a while loop
    • Each iteration, model_name contains the extracted name (e.g., "user", "product", "order")
  4. if grep -q "created_date" "/home/user/project/backend/models/model_builder_${model_name}.py" 2>/dev/null; then

    • grep -q - quiet mode (no output, just exit code: 0 if found, 1 if not found)
    • "created_date" - the pattern to search for
    • "${model_name}.py" - reconstructs the full filename using the variable
    • 2>/dev/null - redirects errors to null (suppresses "file not found" errors)
  5. echo "$model_name: HAS created_date" or echo "$model_name: NO created_date"

    • Prints whether the field was found in that model file
  6. done | sort

    • Sorts all results alphabetically by model name

Sample output:

order: HAS created_date
product: HAS created_date
user: HAS created_date
vendor: NO created_date
warehouse: NO created_date

Use cases:

  • Auditing which data models include specific fields (e.g., timestamps, audit fields)
  • Checking API endpoint handlers for authentication middleware
  • Verifying which components implement required interfaces
  • Finding which configuration files include specific settings

Variations:

  • Check multiple patterns: Replace single grep -q with multiple conditions:

    if grep -q "created_date" "$file" && grep -q "updated_date" "$file"; then
    echo "$model_name: HAS both timestamps"
    fi
  • Different file types: Adapt for other languages:

    for file in src/components/component_*.tsx; do
      basename "$file" | sed 's/component_//;s/.tsx//'
    done | while read comp_name; do
      if grep -q "useState" "src/components/component_${comp_name}.tsx" 2>/dev/null; then
        echo "$comp_name: Uses React hooks"
      else
        echo "$comp_name: Class component"
      fi
    done | sort
  • Count occurrences: Add counting logic:

    grep -o "created_date" "$file" | wc -l
    # Shows how many times the pattern appears in each file

Example 5: Clean and analyze CSV data

cat data.csv | tail -n +2 | cut -d',' -f2,3 | sort | uniq | wc -l

Explanation:

  1. cat data.csv - read CSV file
  2. tail -n +2 - skip header row (start from line 2)
  3. cut -d',' -f2,3 - extract columns 2 and 3
  4. sort | uniq - get unique combinations
  5. wc -l - count unique entries
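The same steps can be verified with inline data (the CSV rows below are hypothetical) instead of a data.csv file:

```shell
# Count unique (name,city) pairs, skipping the CSV header row
printf '%s\n' "id,name,city" "1,ann,nyc" "2,bob,sf" "3,ann,nyc" \
  | tail -n +2 | cut -d',' -f2,3 | sort | uniq | wc -l
# Prints 2 (the duplicate ann,nyc pair counts once; may be space-padded on BSD/macOS)
```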

Example 6: Inspect file content with character-level detail

head -30 docs/public/instructions.md | od -c | head -40

Explanation: This pipeline extracts the first portion of a file and displays its raw character representation, useful for debugging encoding issues, hidden characters, or file format problems.

Step-by-step breakdown:

  1. head -30 docs/public/instructions.md

    • Extracts first 30 lines from the markdown file
    • Limits the amount of data to inspect (prevents overwhelming output)
  2. | od -c

    • od = octal dump (displays file contents in various formats)
    • -c = character format (shows printable characters and escape sequences)
    • Displays special characters like:
      • \n = newline (line ending)
      • \t = tab character
      • \r = carriage return (Windows line endings)
      • \0 = null byte
      • Spaces shown as regular spaces
      • Non-printable characters shown as octal codes
  3. | head -40

    • Limits the octal dump output to first 40 lines
    • Prevents terminal overflow when analyzing large files

Sample output:

0000000   #       M   a   r   k   d   o   w   n       I   n   s   t   r
0000020   u   c   t   i   o   n   s  \n  \n   #   #       W   e   l   c
0000040   o   m   e  \n  \n   T   h   i   s       g   u   i   d   e
0000060   s   h   o   w   s       y   o   u       h   o   w       t   o
0000100       u   s   e       m   a   r   k   d   o   w   n   .  \n  \n
0000120   #   #   #       H   e   a   d   i   n   g   s  \n  \n   Y   o
0000140   u       c   a   n       c   r   e   a   t   e       h   e   a
0000160   d   i   n   g   s       u   s   i   n   g       `   #   `
0000200   s   y   m   b   o   l   s   .  \n  \n   -       #       H   e

Understanding the output:

  • Left column (e.g., 0000000, 0000020): Byte offset in octal
  • Middle columns: Characters in the file
    • Regular characters shown as-is: M, a, r, k
    • Spaces shown as spaces (harder to see, but present)
    • Newlines shown as \n
    • Tabs shown as \t
    • Carriage returns shown as \r (indicates Windows line endings)

Use cases:

  1. Detect Windows vs Unix line endings:

    head -10 file.txt | od -c | grep -c '\\r'
    # od prints each byte as its own token, so match the literal \r token
    # Count > 0 → Windows line endings (CRLF)
    # Count = 0 → Unix line endings (LF)
  2. Find hidden characters causing parsing errors:

    head -50 config.json | od -c | head -60
    # Look for unexpected \r, \t, or non-printable characters
  3. Debug CSV file encoding issues:

    head -5 data.csv | od -c | head -20
    # Check for BOM (Byte Order Mark), unexpected delimiters, or encoding problems
  4. Verify file is plain text (not binary):

    head -10 suspicious-file.txt | od -c | head -30
    # If you see lots of octal codes (\000, \377), it's likely binary

Alternative od flags:

  • od -x → Display in hexadecimal (useful for binary files)
  • od -a → Display as named characters (more readable than -c)
  • od -An → Remove address column (cleaner output)
  • od -c -An → Character format without addresses

Example comparing different line endings:

# Unix file (LF only)
$ echo -e "line1\nline2" | od -c
0000000   l   i   n   e   1  \n   l   i   n   e   2  \n
0000014

# Windows file (CRLF)
$ echo -e "line1\r\nline2\r\n" | od -c
0000000   l   i   n   e   1  \r  \n   l   i   n   e   2  \r  \n
0000016

Related commands:

  • hexdump -C file → More modern alternative to od, shows hex and ASCII side-by-side
  • cat -A file → Shows line endings and tabs (simpler, but less detailed than od)
  • file file.txt → Identifies file type and encoding

8.4 Pipe Best Practices

Efficiency considerations:

  • ✅ head -10 file is more efficient than cat file | head -10
  • ✅ grep pattern file is more efficient than cat file | grep pattern
  • ✅ Use pipes when chaining multiple operations or when the command doesn't accept file arguments

When to use pipes:

  • Combining multiple commands
  • Filtering or transforming command output (not file content directly)
  • Real-time monitoring with transformations
  • Processing data that comes from commands (not files)

Common pitfalls:

  • ❌ cat file | cat | cat - unnecessary pipes
  • ❌ Piping when a direct command exists
  • ✅ command1 | command2 | command3 - legitimate chaining

8.5 Other Redirection Operators

Output redirection:

  • > - Redirect output to file (overwrite)
    • Example: echo "test" > file.txt
  • >> - Redirect output to file (append)
    • Example: echo "more" >> file.txt
  • 2> - Redirect error output
    • Example: command 2> errors.log
  • &> - Redirect both output and errors
    • Example: command &> all-output.log

Input redirection:

  • < - Read input from file
    • Example: wc -l < file.txt

Command substitution:

  • `command` or $(command) - Use command output as argument
    • Example: echo "Today is $(date)"

8.6 Conditional Execution with && and ||

Bash provides operators to chain commands based on the success or failure of previous commands.

Operators:

  • && - AND operator: Run next command only if previous succeeded (exit code 0)
  • || - OR operator: Run next command only if previous failed (exit code non-zero)

Basic syntax:

command1 && command2  # command2 runs only if command1 succeeds
command1 || command2  # command2 runs only if command1 fails

File Existence Check with Conditional Execution

[ -f file ] && echo "exists" || echo "not found" → Check if a file exists and print result.

  • [ -f file ] - Test if file exists and is a regular file
  • && - If test succeeds (file exists), run echo "exists"
  • || - If previous command fails (file doesn't exist), run echo "not found"

Example: Check if .nojekyll file exists in public directory

[ -f public/.nojekyll ] && echo "exists" || echo "not found"

Explanation:

  1. [ -f public/.nojekyll ] - Tests if public/.nojekyll exists and is a regular file
  2. If test returns true (exit code 0): && triggers echo "exists"
  3. If test returns false (exit code 1): || triggers echo "not found"

Sample outputs:

  • If file exists: exists
  • If file doesn't exist: not found

Common file test operators:

  • [ -f file ] - True if file exists and is a regular file
  • [ -d dir ] - True if directory exists
  • [ -e path ] - True if path exists (file or directory)
  • [ -r file ] - True if file is readable
  • [ -w file ] - True if file is writable
  • [ -x file ] - True if file is executable
  • [ -s file ] - True if file exists and is not empty

More examples:

# Check if directory exists
[ -d node_modules ] && echo "node_modules found" || echo "run npm install"

# Check if script is executable
[ -x script.sh ] && echo "executable" || echo "not executable"

# Check if config file exists, create if not
[ -f .env ] && echo "Config exists" || { echo "Creating .env"; touch .env; }

# Chain multiple conditions
[ -f package.json ] && [ -d src ] && echo "Valid project structure" || echo "Invalid structure"
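These tests can be exercised safely in a throwaway directory (a self-contained sketch using mktemp):

```shell
# Demonstrate -f against files that do and do not exist
tmp=$(mktemp -d)
touch "$tmp/present.txt"

[ -f "$tmp/present.txt" ] && echo "exists" || echo "not found"   # exists
[ -f "$tmp/missing.txt" ] && echo "exists" || echo "not found"   # not found

rm -r "$tmp"   # clean up the temporary directory
```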

Use in CI/CD and build scripts:

# Create .nojekyll for GitHub Pages if it doesn't exist
[ -f public/.nojekyll ] || touch public/.nojekyll

# Verify build output before deployment
[ -d dist ] && [ -f dist/index.html ] && echo "Build OK, deploying..." || exit 1

# Check multiple required files
[ -f package.json ] && [ -f tsconfig.json ] && npm run build || echo "Missing config files"

Why .nojekyll is important:

  • GitHub Pages uses Jekyll by default, which ignores files/folders starting with _
  • Creating a .nojekyll file tells GitHub Pages to skip Jekyll processing
  • Essential for frameworks like Angular, React, or Next.js that use _next, _app, etc.
  • Common in build scripts: echo "" > public/.nojekyll

Alternative test syntax:

# Modern [[ ]] syntax (preferred in bash)
[[ -f file ]] && echo "exists" || echo "not found"

# Using test command (equivalent to [ ])
test -f file && echo "exists" || echo "not found"

# Explicit if-else (more verbose but clearer for complex logic)
if [ -f file ]; then
echo "exists"
else
echo "not found"
fi

Exit codes and conditions:

  • Exit code 0 = Success/True → && executes
  • Exit code non-zero = Failure/False → || executes
  • Commands like grep, [ test ], command, etc. return exit codes
  • Check last exit code: echo $? (0 = success, 1-255 = various failures)
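The exit-code behaviour is easy to observe directly with the true and false builtins:

```shell
true
echo "exit: $?"    # exit: 0  (an && chained here would run)

false
echo "exit: $?"    # exit: 1  (an || chained here would run)
```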

9. File Permissions and Ownership

9.1 Change file permissions

chmod permissions file → Change file permissions.

  • Use numeric (755) or symbolic (u+x) notation.

  • Example: chmod +x script.sh to make a script executable.

    • Explanation: Adds execute permission for all users to script.sh.
    • Sample output: (Permissions changed; no output)
  • Example: chmod 644 config.json to set read/write for owner, read-only for others.

    • Explanation: Sets permissions to -rw-r--r-- (owner: rw, group: r, others: r).
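The effect of both notations can be checked on a temporary file (the permission strings assume a typical Linux `ls`):

```shell
tmp=$(mktemp)

chmod 644 "$tmp"
ls -l "$tmp" | cut -c1-10    # -rw-r--r--

chmod u+x "$tmp"             # add execute for the owner only
ls -l "$tmp" | cut -c1-10    # -rwxr--r--

rm "$tmp"   # clean up
```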

9.2 Change file ownership

chown user:group file → Change file owner and group.

  • Requires sudo/root for files you don't own.
  • Example: sudo chown www-data:www-data /var/www/html/index.html
    • Explanation: Changes ownership of the file to www-data user and group.
    • Sample output: (Ownership changed; no output)

10. File Comparison and Differences

10.1 Compare two files

diff file1 file2 → Compare two files line by line.

  • Use -u for unified format (more readable), -y for side-by-side comparison.

  • Example: diff config.old.json config.new.json to see what changed.

    • Sample output:
      3c3
      < "port": 3000
      ---
      > "port": 8080
  • Example: diff -u old.js new.js > changes.patch to create a patch file.

    • Explanation: Creates a patch file that can be applied with the patch command.
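A quick way to see the default diff format end to end, using two throwaway files (the contents are hypothetical):

```shell
printf 'port: 3000\n' > old.cfg
printf 'port: 8080\n' > new.cfg

diff old.cfg new.cfg    # exits with status 1 when the files differ
# 1c1
# < port: 3000
# ---
# > port: 8080

rm old.cfg new.cfg   # clean up
```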

10.2 Compare sorted files

comm file1 file2 → Compare two sorted files column by column.

  • Shows lines unique to file1, unique to file2, and common to both.
  • Example: comm list1.txt list2.txt to compare sorted lists.
    • Explanation: Displays three columns: unique to file1, unique to file2, common.
    • Sample output:
      apple
      banana
      orange

11. File Compression and Archives

11.1 Compress with gzip

gzip file → Compress a file (creates file.gz).

  • Use -d to decompress, -k to keep original file.

  • Example: gzip large-log.txt to compress a log file.

    • Explanation: Compresses large-log.txt to large-log.txt.gz and removes original.
  • Example: gzip -k backup.sql to compress while keeping original.

    • Explanation: Creates backup.sql.gz while preserving backup.sql.

11.2 Decompress gzip files

gunzip file.gz → Decompress a gzip file.

  • Example: gunzip large-log.txt.gz to decompress.
    • Explanation: Decompresses large-log.txt.gz back to large-log.txt.

11.3 Create tar archive

tar -czf archive.tar.gz directory/ → Create a compressed tar archive.

  • -c for create, -z for gzip, -f for file, -v for verbose (optional).
  • Example: tar -czf project-backup.tar.gz src/ to archive source directory.
    • Explanation: Compresses src/ into a gzipped tar archive.

11.4 Extract tar archive

tar -xzf archive.tar.gz → Extract a compressed tar archive.

  • -x for extract, -z for gzip, -f for file.

  • Example: tar -xzf project-backup.tar.gz to extract backup.

    • Explanation: Extracts all files from the gzipped tar archive.
  • Example: tar -xzf archive.tar.gz -C /destination/path/ to extract to specific location.

    • Explanation: Extracts archive contents to the specified directory.
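A full create-and-extract round trip (throwaway paths) shows the -c and -x flags working together:

```shell
# Build a tiny tree, archive it, then extract it to a second location
mkdir -p demo/src && echo "hello" > demo/src/app.js
tar -czf demo.tar.gz -C demo src/
mkdir demo-out && tar -xzf demo.tar.gz -C demo-out

cat demo-out/src/app.js    # hello

rm -r demo demo-out demo.tar.gz   # clean up
```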

11.5 Create zip archive

zip -r archive.zip directory/ → Create a zip archive.

  • Example: zip -r project.zip src/ to archive source directory.
    • Sample output:
      adding: src/app.js
      adding: src/utils.js

11.6 Extract zip archive

unzip archive.zip → Extract a zip archive.

  • Use -l to list contents without extracting, -d to specify destination.
  • Example: unzip project.zip to extract all files.
    • Sample output:
      extracting: src/app.js
      extracting: src/utils.js

11.7 Sync files with rsync

rsync -av source/ destination/ → Sync files between directories or machines.

  • -a preserves permissions, -v for verbose, -z for compression over network.

  • Example: rsync -avz /local/path/ user@remote:/backup/ to backup to remote server.

    • Explanation: Efficiently copies only changed files to destination.
    • Sample output:
      sending incremental file list
      app.js
      sent 1.23K bytes received 45 bytes
  • Example: rsync -av --delete build/ /var/www/html/ to deploy website (mirror).

    • Explanation: Syncs files and removes any in destination not in source.

12. File Linking

ln -s target linkname → Create a symbolic link (symlink).

  • Example: ln -s /usr/local/node-v20 /usr/local/node to create version symlink.

    • Explanation: Creates a symbolic link node that points to node-v20.
  • Example: ln -s ../../shared/config.js ./config.js to link shared config.

    • Explanation: Creates a relative symlink to a shared configuration file.

ln target linkname → Create a hard link.

  • Hard links point to the same inode (data) as the original file.
  • Example: ln original.txt backup.txt to create a hard link.
    • Explanation: Both files reference the same data; changes to one affect both.
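Hard-link behaviour can be verified with the -ef test, which is true when two paths share an inode (supported by bash and GNU test):

```shell
echo "data" > original.txt
ln original.txt hardlink.txt

[ original.txt -ef hardlink.txt ] && echo "same inode"

echo "more" >> hardlink.txt
cat original.txt    # prints both lines — the two names share one data block
                    # data
                    # more

rm original.txt hardlink.txt   # clean up
```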

13. NPM Package Management

13.1 View package versions

npm view package versions --json → Retrieve all available versions of a package in JSON format.

  • Example: npm view astro-expressive-code versions --json to get version list.
    • Explanation: Fetches all published versions of the package from NPM.
    • Sample output:
      ["0.1.0","0.2.0","1.0.0"]

13.2 View package details

npm view package --json → Get detailed information about a package.

  • Example: npm view astro-expressive-code --json to get package details.
    • Sample output:
      {
      "name": "astro-expressive-code",
      "version": "1.0.0",
      "description": "Expressive code for Astro",
      "author": "user@example.com"
      }

13.3 Check for outdated packages

npm outdated → Check for outdated packages in your project.

  • Sample output:
    Package      Current  Wanted  Latest  Location
    lodash       4.17.20  4.17.21 4.17.21 node_modules/lodash

13.4 List installed packages

npm list --depth=0 → Show installed packages (top-level only).

  • Example: npm list --depth=0 to see direct dependencies.
    • Sample output:
      project@1.0.0
      ├── express@4.18.2
      └── lodash@4.17.21

13.5 Security audit

npm audit → Check for security vulnerabilities.

  • Sample output:
    found 3 vulnerabilities (1 low, 2 moderate)
    run `npm audit fix` to fix them

13.6 Clean install with npm ci

npm ci → Clean install dependencies (faster, stricter; intended for CI/CD pipelines). npm ci is shorthand for npm clean-install.

What npm ci does:

  1. Deletes node_modules/ completely (if it exists)
  2. Reads package-lock.json (or npm-shrinkwrap.json)
  3. Installs exact versions specified in the lock file
  4. Never modifies package.json or package-lock.json
  5. Fails if dependencies don't match the lock file

Key differences: npm ci vs npm install:

Feature               npm ci                         npm install
----------------------------------------------------------------------------------
Speed                 Faster (up to 2x)              Slower
Lock file             Must exist, strictly followed  Optional, can be updated
node_modules          Always deleted first           Preserved, updated incrementally
package.json changes  Fails if mismatch with lock    Updates lock file to match
Version ranges        Installs exact versions only   Resolves ranges (^, ~, etc.)
Use case              CI/CD, production builds       Local development
Reproducibility       100% reproducible              May vary based on timing

When to use npm ci:

  • ✅ Continuous Integration (CI/CD) pipelines
  • ✅ Production deployments
  • ✅ Docker builds
  • ✅ Testing environments
  • ✅ When you need guaranteed reproducible builds
  • ✅ Fresh clone of repository

When to use npm install:

  • ✅ Local development
  • ✅ Adding new dependencies
  • ✅ Updating existing dependencies
  • ✅ Resolving version conflicts

How npm ci validates:

  1. Compares package.json with package-lock.json:

    • Checks that all dependencies in package.json are present in lock file
    • Verifies that versions in lock file satisfy the ranges in package.json
  2. Fails if mismatch detected:

    npm ERR! `npm ci` can only install packages when your package.json and package-lock.json 
    npm ERR! or npm-shrinkwrap.json are in sync. Please update your lock file with `npm install`
    npm ERR! before continuing.
  3. Example scenario causing failure:

    • package.json has: "lodash": "^4.17.21"
    • package-lock.json has: "lodash": "4.17.20"
    • Result: npm ci FAILS because 4.17.20 doesn't satisfy ^4.17.21 range
    • Solution: Run npm install to update lock file, then commit it

Performance comparison:

# Fresh install with npm install
$ rm -rf node_modules
$ time npm install
real 0m45.234s

# Fresh install with npm ci
$ rm -rf node_modules
$ time npm ci
real 0m22.891s

Example GitHub Actions workflow:

steps:
  - uses: actions/checkout@v3
  - uses: actions/setup-node@v3
    with:
      node-version: '18'
  - name: Install dependencies
    run: npm ci   # Always use npm ci in CI/CD
  - name: Run tests
    run: npm test
  - name: Build
    run: npm run build

Example Dockerfile:

FROM node:18-alpine
WORKDIR /app

# Copy package files
COPY package*.json ./

# Use npm ci for reproducible builds; --omit=dev skips devDependencies
# (older npm versions used --only=production for the same effect)
RUN npm ci --omit=dev

# Copy application code
COPY . .

# Build and start
RUN npm run build
CMD ["npm", "start"]

Troubleshooting common npm ci errors:

  • Error: "package-lock.json not found"

    • Solution: Run npm install locally to generate the lock file, then commit it
  • Error: "package.json and package-lock.json out of sync"

    • Solution: Run npm install to update lock file, review changes, commit
  • Error: "Invalid or corrupted package-lock.json"

    • Solution: Delete lock file, run npm install, commit new lock file

Best practices:

  • ✅ Always commit package-lock.json to version control
  • ✅ Use npm ci in all automated environments (CI/CD, Docker)
  • ✅ Use npm install only for local development
  • ✅ Run npm ci before deploying to production
  • ✅ Never manually edit package-lock.json

13.7 Execute package without installing

npx package-name → Execute package without installing globally.

  • Example: npx create-react-app my-app to use without global install.
    • Explanation: Downloads and runs package temporarily.

14. Git Version Control

14.1 Restore file to last commit

git checkout HEAD -- file → Restore a file to its last committed state.

  • Example: git checkout HEAD -- public/toggle-language.js to reset a JavaScript file.
    • Explanation: Discards any uncommitted changes and restores it to the version from the last commit (HEAD).

14.2 View commit history

git log --oneline -10 → View the last 10 commits in a compact format.

  • Sample output:
    a1b2c3d Fix typo in README
    e4f5g6h Add new feature

14.3 View short status

git status -s → Show short status of working directory.

  • Sample output:
     M src/app.js
    ?? newfile.txt

14.4 View changes

git diff → Show unstaged changes.

  • Use git diff --staged for staged changes.
  • Example: git diff src/app.js to see what changed.
    • Sample output:
      -const port = 3000;
      +const port = 8080;

14.5 Visualize branch history

git log --graph --oneline --all → Visualize branch history.

  • Sample output:
    * a1b2c3d (HEAD -> main) Merge branch 'feature'
    |\
    | * e4f5g6h Add feature
    |/
    * h8i9j0k Initial commit

14.6 Temporarily save changes

git stash → Temporarily save uncommitted changes.

  • Use git stash pop to restore changes, git stash list to view stashes.
  • Example: git stash before switching branches.
    • Sample output:
      Saved working directory and index state WIP on main

14.7 List all branches

git branch -a → List all branches (local and remote).

  • Sample output:
    * main
      feature-x
      remotes/origin/main
      remotes/origin/develop

14.8 Remove untracked files

git clean -fd → Remove untracked files and directories.

  • Use -n flag first to preview what will be deleted.
  • Example: git clean -fdn to preview, then git clean -fd to delete.
    • Sample output:
      Removing temp/
      Removing debug.log

15. Process Management

15.1 Keyboard shortcuts for process control

Ctrl+C → Terminate the currently running process (sends SIGINT).

  • Immediately stops the foreground process.
  • Example: Press Ctrl+C while a server is running to stop it.
    • Explanation: Sends an interrupt signal (SIGINT) to the process, requesting graceful termination.
    • Use case: Stop a running web server, script, or command that's taking too long.
    • Sample output:
      $ npm start
      Server running on port 3000...
      ^C
      $

Ctrl+Z → Suspend (pause) the currently running process.

  • Pauses the process and puts it in the background (stopped state).
  • The process is NOT terminated, just suspended.
  • Example: Press Ctrl+Z while editing a file to temporarily return to shell.
    • Explanation: Sends SIGTSTP signal, suspending the process and returning you to the command prompt.
    • Use case: Temporarily pause a process to run other commands, then resume it later.
    • Sample output:
      $ vim myfile.txt
      [Editing file...]
      [Press Ctrl+Z]
      [1]+ Stopped vim myfile.txt
      $
    • Note: Use fg to resume in foreground or bg to resume in background.

15.2 List and filter processes

ps aux | grep node → List processes and filter for Node.js processes.

  • Sample output:
    user  1234  0.0  1.2  123456  7890 ?  S    10:00   0:00 node server.js

15.3 Terminate process by PID

kill PID → Terminate a process by its process ID.

  • Use kill -9 PID to force kill if process doesn't respond.
  • Example: kill 1234 to stop process with ID 1234.
    • Explanation: Sends SIGTERM signal to gracefully terminate the process.
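The effect is easy to observe with a throwaway background process, using the shell's $! variable to capture the PID:

```shell
sleep 100 &               # long-running throwaway process
pid=$!                    # $! = PID of the last background command
kill "$pid"               # send SIGTERM (graceful termination)
wait "$pid" 2>/dev/null   # reap it; exit status reflects the signal
if kill -0 "$pid" 2>/dev/null; then
  echo "still running"
else
  echo "terminated"
fi
```

`kill -0` sends no signal; it only checks whether the process still exists, which makes it a handy liveness test.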

15.4 Kill all processes by name

killall process_name → Kill all processes matching the name.

  • Example: killall node to stop all Node.js processes.
    • Explanation: Terminates all processes with "node" in their name.

15.5 Kill processes by pattern

pkill -f pattern → Kill processes matching a pattern.

  • Example: pkill -f "python.*server" to kill Python servers.
    • Explanation: Terminates processes where command line matches the regex pattern.

15.6 List background jobs

jobs → List background jobs in current shell.

  • Shows all jobs started in the current terminal session.
  • Each job has a job number (shown in brackets) and a status.
  • Sample output:
    [1]+  Running    npm start &
    [2]-  Stopped    vim file.txt
  • Explanation:
    • [1], [2] are job numbers
    • + indicates current job (most recently started or stopped)
    • - indicates previous job
    • Running: Job is executing in background
    • Stopped: Job is suspended (paused with Ctrl+Z)

15.7 Resume job in background

bg → Resume the most recent stopped job in background.

bg %N → Resume job N in background.

  • Resumes a suspended job and runs it in the background.

  • Example: bg %1 to continue job 1 in background.

    • Explanation: Takes job 1 (which was stopped with Ctrl+Z) and resumes it in the background, allowing you to continue using the terminal.
    • Workflow example:
      $ npm start
      [Press Ctrl+Z to suspend]
      [1]+ Stopped npm start
      $ bg %1
      [1]+ npm start &
      $ # Now the job runs in background, terminal is free
  • %1 notation: Refers to job number 1 (from jobs command output)

  • Alternative: bg without arguments resumes the most recent stopped job

15.8 Bring job to foreground

fg → Bring the most recent background job to foreground.

fg %N → Bring job N to foreground.

  • Example: fg %1 to bring job 1 to foreground.

    • Explanation: Brings background or stopped job back to foreground where you can interact with it.
    • Workflow example:
      $ jobs
      [1]+  Running    npm start &
      [2]-  Stopped    vim file.txt
      $ fg %2
      # vim returns to foreground, you can continue editing
  • % notation explanation:

    • %1, %2, etc. are job specifiers
    • %1 = job number 1
    • %% or %+ = current job (marked with +)
    • %- = previous job (marked with -)
    • %?string = job whose command contains "string"

15.9 Run command immune to hangups

nohup command & → Run command immune to hangups, with output to nohup.out.

  • Keeps process running even after you log out.
  • Example: nohup npm start & to run server that persists after logout.
    • Sample output:
      nohup: ignoring input and appending output to 'nohup.out'

16. File Cleanup and Disk Usage

16.1 Delete files by pattern

find . -type f -name "*.log" -delete → Find and delete all log files recursively.

  • Alternative: find . -type f -name "*.log" -exec rm {} \;
  • Note: The -delete option is simpler and more efficient than -exec rm
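A safe way to try this is on a hypothetical scratch directory, previewing with -print before deleting:

```shell
# Made-up layout for a safe dry run
mkdir -p log-demo/sub
touch log-demo/app.log log-demo/sub/debug.log log-demo/keep.txt
find log-demo -type f -name "*.log" -print    # preview what would be deleted
find log-demo -type f -name "*.log" -delete
find log-demo -type f                         # only keep.txt remains
```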

16.2 Delete backup files

find . \( -name "*.bak" -o -name "*~" \) -delete → Delete backup and temporary files.

  • Explanation: Finds and deletes files matching either backup pattern. The escaped parentheses are required: -o binds more loosely than find's implicit AND, so without grouping, -delete would apply only to the "*~" matches.
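The grouping matters enough to demonstrate (demo filenames are made up): with escaped parentheses, -delete applies to both patterns.

```shell
mkdir -p bak-demo
touch bak-demo/notes.bak bak-demo/draft~ bak-demo/real.txt
# Without \( \), -delete would bind only to the "*~" pattern
find bak-demo \( -name "*.bak" -o -name "*~" \) -delete
ls bak-demo    # only real.txt remains
```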

16.3 Find large files

find . -type f -size +100M → Find files larger than 100MB.

  • Use -size -100M for files smaller than 100MB.
  • Example: find /var/log -type f -size +100M to find large log files.
    • Sample output:
      /var/log/syslog.1
      /var/log/apache2/access.log
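A quick self-contained way to see the size filter in action, with a lower threshold (+500k) so it runs instantly; truncate (GNU coreutils) creates a sparse file that reports the requested size without using real disk space:

```shell
mkdir -p size-demo
truncate -s 1M  size-demo/big.bin    # sparse 1 MB file
truncate -s 10K size-demo/small.bin
find size-demo -type f -size +500k   # matches big.bin only
```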

16.4 Delete old files

find . -type f -mtime +30 -delete → Delete files older than 30 days.

  • Use -mtime -7 for files modified in last 7 days.
  • Example: find /tmp -type f -mtime +7 -delete to clean old temp files.
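Because -delete is destructive, it pays to backdate a test file and preview first; the sketch below fakes an "old" file with touch -t (timestamp format [CC]YY]MMDDhhmm):

```shell
mkdir -p tmp-demo
touch -t 202001010000 tmp-demo/old.txt   # backdate mtime to Jan 1, 2020
touch tmp-demo/fresh.txt
find tmp-demo -type f -mtime +30 -print  # preview: only old.txt
find tmp-demo -type f -mtime +30 -delete
ls tmp-demo                              # fresh.txt remains
```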

16.5 Check disk usage

du -sh * → Show disk usage of files and directories.

  • -s for summary, -h for human-readable sizes.
  • Example: du -sh * | sort -h to see what's taking up space.
    • Sample output:
      4.0K    README.md
      12M     node_modules
      250M    dist

16.6 Check free disk space

df -h → Display free disk space on all mounted filesystems.

  • Sample output:
    Filesystem      Size  Used Avail Use% Mounted on
    /dev/sda1       100G   45G   50G  48% /

17. Code Analysis and Documentation

17.1 Find TODO comments

grep -rn "TODO" . --exclude-dir=node_modules → Search for TODO comments in code, excluding dependencies.

  • Sample output:
    src/utils/helpers.js:15:  // TODO: Optimize this function

17.2 Find all code annotations

grep -rn "FIXME\|TODO\|HACK\|XXX" src/ → Find all code annotations.

  • Sample output:
    src/app.js:23:  // FIXME: Memory leak here
    src/utils.js:45: // TODO: Add validation
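A self-contained check of the pattern above (the demo file and its contents are made up); note that the `\|` alternation syntax shown is Basic Regular Expression alternation as implemented by GNU grep:

```shell
mkdir -p ann-demo/src
printf '// FIXME: memory leak here\nconst x = 1;\n// TODO: add validation\n' > ann-demo/src/app.js
grep -rn "FIXME\|TODO\|HACK\|XXX" ann-demo/src/
```

The -n flag prints the matching line numbers, which is what makes the output useful for jumping straight to the annotation in an editor.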

17.3 Count lines of code

find . -name "*.js" -exec wc -l {} + | sort -n → Count lines in JS files, sorted.

  • Sample output:
    50 ./utils.js
    120 ./app.js
    340 ./main.js
    510 total
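A miniature version of the same pipeline on made-up files shows why `{} +` matters: passing all files to one wc invocation produces the final "total" line, and sort -n orders the counts:

```shell
mkdir -p loc-demo
printf 'a\nb\n' > loc-demo/utils.js       # 2 lines
printf 'a\nb\nc\nd\n' > loc-demo/app.js   # 4 lines
find loc-demo -name "*.js" -exec wc -l {} + | sort -n
```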

18. Monitoring and Debugging

18.1 Run command repeatedly

watch -n 5 'npm run build' → Run a command repeatedly at intervals.

  • Note: Requires watch command (pre-installed on most Linux, brew install watch on macOS)
  • Explanation: Executes npm run build every 5 seconds.

18.2 Monitor logs for errors

tail -f file | grep "ERROR" → Monitor log file for errors in real-time.

  • Example: tail -f server.log | grep "ERROR" to watch for errors.
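One caveat when grep sits in the middle of a longer pipeline: GNU grep buffers its output when writing to a pipe, so matches can appear late; --line-buffered flushes each match immediately. A finite stand-in for the live stream (server.log and its contents are made up):

```shell
printf 'INFO  starting\nERROR disk full\nINFO  done\n' > server.log
# Live form: tail -f server.log | grep --line-buffered "ERROR"
# Finite demo of the same filter:
grep --line-buffered "ERROR" server.log
```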

18.3 Search command history

history | grep "command" → Search command history.

  • Example: history | grep "docker" to find Docker commands you've used.
    • Sample output:
      342  docker ps
      389  docker build -t myapp .

18.4 Measure command execution time

time command → Measure how long a command takes to execute.

  • Example: time npm run build to measure build time.
    • Sample output:
      real    0m12.345s
      user    0m10.123s
      sys     0m1.234s

18.5 Monitor system resources

top or htop → Monitor system resources in real-time.

  • Press q to quit.
  • Example: htop for better interactive monitoring.

18.6 Find process using a port

lsof -i tcp:PORT or lsof -i :PORT → List processes using a specific port.

  • Essential for troubleshooting EADDRINUSE: address already in use errors.

  • Example: lsof -i tcp:1668 to find what's using port 1668.

    • Explanation: Shows which process is listening on the specified port, displaying PID, user, and command.
    • Sample output:
      COMMAND    PID  USER  FD  TYPE             DEVICE SIZE/OFF NODE NAME
      node     44475 chen5 31u  IPv4 0x8b1721168764e4bf      0t0  TCP *:strexec-s (LISTEN)
    • Note the PID: In this example, the PID is 44475
  • Example: lsof -i :8080 to find what's using port 8080.

    • Sample output:
      COMMAND  PID  USER   FD   TYPE  DEVICE  SIZE/OFF  NODE NAME
      node     1234  user  23u  IPv4          0t0       TCP  *:8080 (LISTEN)

18.6a Kill process using a port

Kill a process by PID after finding it with lsof

  • Step 1: Find the PID using lsof -i tcp:PORT
  • Step 2: Kill the process using kill -9 PID

Example workflow for EADDRINUSE error:

# Step 1: Find the process
$ lsof -i tcp:1668
COMMAND    PID  USER  FD  TYPE             DEVICE SIZE/OFF NODE NAME
node     44475 chen5 31u  IPv4 0x8b1721168764e4bf      0t0  TCP *:strexec-s (LISTEN)

# Step 2: Kill the process
$ kill -9 44475

Explanation:

  • lsof -i tcp:1668 identifies the process (PID 44475) using port 1668
  • kill -9 44475 forcefully terminates that process, freeing the port
  • Now you can restart your server without the address-in-use error

18.7 Show network connections

netstat -tuln → Show network connections and listening ports.

  • Example: netstat -tuln | grep LISTEN to see listening ports.
    • Sample output:
      tcp    0    0 0.0.0.0:80      0.0.0.0:*      LISTEN
      tcp    0    0 0.0.0.0:443     0.0.0.0:*      LISTEN

18.8 Test CORS configuration

curl -H "Origin: http://origin-url" --head http://target-url → Test CORS headers.

  • Useful for verifying Cross-Origin Resource Sharing (CORS) is properly configured.
  • Example: curl -H "Origin: http://localhost:3000" --head http://localhost:5201/api/v1/your-name
    • Explanation: Sends a HEAD request with an Origin header to test if the server allows cross-origin requests from http://localhost:3000.
    • Sample output:
      HTTP/1.1 200 OK
      X-Powered-By: Express
      Access-Control-Allow-Origin: http://localhost:3000
      Vary: Origin
      server: Ts-Server
      Author: Anand Raja
      Content-Type: application/json; charset=utf-8
      Content-Length: 48
      ETag: W/"30-wCFITczWLjOV7yt7leOshObdFG4"
      Date: Wed, 09 Jun 2021 15:05:10 GMT
      Connection: keep-alive
    • What to look for: The Access-Control-Allow-Origin: http://localhost:3000 header confirms that your server allows requests from the specified origin.

Understanding Access-Control-Allow-Origin values:

  • Access-Control-Allow-Origin: http://localhost:3000 → Allows only requests from http://localhost:3000
  • Access-Control-Allow-Origin: * → Allows requests from ANY origin (all domains)
    • ⚠️ Security Warning: Using * is convenient for development but should be avoided in production for sensitive APIs
    • Use specific origins in production for better security
    • Note: When using credentials (cookies, auth headers), you CANNOT use *

Example with wildcard (allow all origins):

$ curl -H "Origin: http://example.com" --head http://localhost:5201/api
HTTP/1.1 200 OK
Access-Control-Allow-Origin: *

19. JSON and Data Processing

19.1 Extract JSON field

jq '.field' file.json → Parse and display JSON data (requires jq installed).

  • Example: jq '.dependencies' package.json
    • Sample output:
      {
        "lodash": "^4.17.20",
        "express": "^4.17.1"
      }

19.2 Extract raw JSON value

jq -r '.version' package.json → Extract raw value without quotes.

  • Sample output:
    1.0.0

19.3 Filter JSON arrays

jq '.[] | select(.age > 25)' users.json → Filter JSON arrays.

  • Example: Filter users over 25 years old.
    • Sample output:
      {"name": "John", "age": 30}
      {"name": "Jane", "age": 28}

20. Directory Visualization

20.1 Display directory tree

tree -I node_modules → Display directory tree structure, excluding node_modules.

  • Note: Requires tree command (apt install tree or brew install tree)
  • Sample output:
    .
    ├── src
    │   ├── app.js
    │   └── utils
    │       └── helpers.js
    ├── package.json
    └── README.md

20.2 Limit tree depth

tree -L 2 → Show directory tree up to 2 levels deep.

  • Example: tree -L 2 src/ to avoid deep nesting.

20.3 Recursive listing

ls -R → Recursively list all files and subdirectories.

  • Example: ls -R src/ to see all files in directory tree.

21. Build Commands

21.1 Run build with reduced output

npm run build --silent → Run the build script with minimal output.

  • Note: Use --silent (or -s) for suppressing npm logs, or --quiet for less verbose output
  • Alternative: npm run build 2>&1 | grep -v "^>" to filter npm's own messages
  • Ideal for CI/CD pipelines or when you want to suppress verbose logs.
  • Explanation: Executes the build script from package.json with reduced console output.

21.2 Run multiple commands concurrently

Using concurrently package to run frontend and backend simultaneously

  • Installation: npm install --save-dev concurrently
  • Setup: Add to package.json:
    "scripts": {
      "start": "ng serve",
      "server": "ts-node-dev server/server.ts",
      "dev": "concurrently \"npm start\" \"npm run server\""
    }

Example for Angular + Node.js backend:

"scripts": {
  "ng": "ng",
  "start": "ng serve",
  "build": "ng build",
  "server": "concurrently \"ng serve\" \"ts-server/node_modules/.bin/ts-node-dev server/server.ts\""
}

Explanation:

  • concurrently runs multiple npm scripts simultaneously
  • Running the combined script (npm run dev in the first setup, npm run server in the second example) spins up both the backend and the frontend
  • Both services support live reloading when you make changes
  • Output from both processes appears in the same terminal window

Benefits:

  • No need for multiple terminal windows
  • Automatic restart on file changes
  • Simplified development workflow
  • Single command to start entire stack

22. Troubleshooting Common Issues

22.1 Fixing "Address Already in Use" Error

Problem: EADDRINUSE: address already in use :::PORT

Solution:

Step 1: Find the process using the port

lsof -i tcp:1668

Output:

COMMAND    PID  USER  FD  TYPE             DEVICE SIZE/OFF NODE NAME
node     44475 chen5 31u  IPv4 0x8b1721168764e4bf      0t0  TCP *:strexec-s (LISTEN)

Step 2: Kill the process

kill -9 44475

Alternative one-liner (use with caution):

lsof -ti tcp:1668 | xargs kill -9

  • Explanation: lsof -ti returns only the PID, which is piped to kill -9

22.2 Testing CORS Configuration

Problem: Need to verify CORS is working between frontend and backend

Solution:

Test with cURL:

curl -H "Origin: http://localhost:3000" --head http://localhost:5201/api/v1/your-name

Expected Response:

HTTP/1.1 200 OK
X-Powered-By: Express
Access-Control-Allow-Origin: http://localhost:3000
Vary: Origin
Content-Type: application/json; charset=utf-8
Date: Wed, 09 Jun 2021 15:05:10 GMT
Connection: keep-alive

What to check:

  • Access-Control-Allow-Origin header should match your origin
  • ✅ Response status should be 200 OK
  • ✅ No CORS errors in browser console

Understanding Access-Control-Allow-Origin values:

  • Specific origin: Access-Control-Allow-Origin: http://localhost:3000
    • Only allows requests from http://localhost:3000
    • Most secure option for production
  • Wildcard: Access-Control-Allow-Origin: *
    • Allows requests from ANY origin (all domains)
    • ⚠️ Security Warning: Convenient for development/public APIs but risky for sensitive data
    • Important: Cannot be used with credentials (cookies, Authorization headers)
    • Use specific origins in production when authentication is required

Online testing tool: https://cors-test.codehappy.dev/

Common CORS headers to verify:

  • Access-Control-Allow-Origin: Specifies allowed origins (* or specific domain)
  • Access-Control-Allow-Methods: Allowed HTTP methods (GET, POST, PUT, DELETE, etc.)
  • Access-Control-Allow-Headers: Allowed request headers (Content-Type, Authorization, etc.)
  • Access-Control-Allow-Credentials: Whether credentials are allowed (true/false)

23. Network and DNS Troubleshooting (Windows)

23.1 Release IP address

ipconfig /release → Release the current IP address (Windows).

  • Releases the current DHCP-assigned IP address for all network adapters.
  • Use case: When experiencing network connectivity issues or before renewing IP.
  • Example: ipconfig /release
    • Explanation: Drops the current IP address, essentially disconnecting from the network temporarily.
    • Sample output:
      Windows IP Configuration

      Ethernet adapter Ethernet:
      Connection-specific DNS Suffix . :
  • Note: Run as Administrator for full functionality.
  • Linux equivalent: sudo dhclient -r

23.2 Flush DNS cache

ipconfig /flushdns → Clear the DNS resolver cache (Windows).

  • Removes all cached DNS records from memory.

  • Use case: When websites aren't loading properly or DNS changes aren't taking effect.

  • Example: ipconfig /flushdns

    • Explanation: Clears the local DNS cache, forcing Windows to request fresh DNS records from the DNS server.
    • Sample output:
      Windows IP Configuration

      Successfully flushed the DNS Resolver Cache.
  • When to use:

    • Website recently changed servers (new IP address)
    • Can't access a website that others can access
    • Experiencing DNS-related errors
    • After removing malware that may have poisoned DNS cache
  • macOS equivalent: sudo dscacheutil -flushcache; sudo killall -HUP mDNSResponder

  • Linux equivalent: sudo resolvectl flush-caches (systemd; older releases use sudo systemd-resolve --flush-caches) or sudo /etc/init.d/nscd restart (older systems)

23.3 Reset TCP/IP stack

netsh int ip reset all → Reset TCP/IP stack to default settings (Windows).

  • Resets the entire TCP/IP stack to its original installation state.

  • Use case: Fixing severe network issues, corrupted network settings, or persistent connectivity problems.

  • Example: netsh int ip reset all

    • Explanation: Rewrites registry keys related to TCP/IP, resetting all network configurations.
    • Sample output:
      Resetting Global, OK!
      Resetting Interface, OK!
      Restart the computer to complete this action.
  • ⚠️ Important Notes:

    • Requires Administrator privileges
    • Requires computer restart to take effect
    • This will reset all network adapters to default settings
    • You may need to reconfigure static IPs, DNS servers, and other custom network settings
    • Optionally writes a log of the reset if you supply a path: netsh int ip reset C:\resetlog.txt
  • Alternative commands:

    • netsh winsock reset → Reset Winsock catalog
    • netsh int ipv4 reset → Reset IPv4 settings only
    • netsh int ipv6 reset → Reset IPv6 settings only

23.4 Renew IP address

ipconfig /renew → Request a new IP address from DHCP server (Windows).

  • Requests a new IP address from the DHCP server for all network adapters.

  • Use case: After releasing IP or when connection is restored after network issues.

  • Example: ipconfig /renew

    • Explanation: Contacts the DHCP server and requests a new IP address assignment.
    • Sample output:
      Windows IP Configuration

      Ethernet adapter Ethernet:
      Connection-specific DNS Suffix . : home
      IPv4 Address. . . . . . . . . . . : 192.168.1.100
      Subnet Mask . . . . . . . . . . . : 255.255.255.0
      Default Gateway . . . . . . . . . : 192.168.1.1
  • Common workflow:

    ipconfig /release
    ipconfig /flushdns
    ipconfig /renew

    This sequence releases the old IP, clears DNS cache, and gets a fresh IP address.

  • Linux equivalent: sudo dhclient or sudo dhcpcd

23.5 Complete network reset sequence (Windows)

Comprehensive network troubleshooting workflow:

# Step 1: Release current IP
ipconfig /release

# Step 2: Flush DNS cache
ipconfig /flushdns

# Step 3: Reset TCP/IP stack
netsh int ip reset all

# Step 4: Reset Winsock catalog
netsh winsock reset

# Step 5: Renew IP address
ipconfig /renew

# Step 6: Restart computer (required after reset commands)
shutdown /r /t 0

When to use this sequence:

  • Internet connection problems after malware removal
  • Cannot connect to any websites
  • Network adapter showing "Limited connectivity"
  • DNS errors persist after normal troubleshooting
  • After changing ISP or router
  • Unusual network behavior or frequent disconnections

⚠️ Warning: This completely resets network configuration. Document any custom settings (static IPs, custom DNS servers) before proceeding.


24. System Management and Shutdown Commands

While this guide focuses primarily on Unix/Linux commands, this section provides a comprehensive reference for system management operations across both Windows and Unix/Linux platforms, especially useful for developers working in mixed environments.

24.1 System Restart (Reboot)

Windows

shutdown /r /t 0 → Restart the computer immediately without any delay.

  • Explanation: The shutdown command initiates a system shutdown or restart. /r specifies a restart (reboot) instead of shutdown, and /t 0 sets the timeout to 0 seconds, meaning immediate action.
  • What is /r?: The /r flag tells Windows to restart the system after shutdown, rather than just powering off.
  • Sample output: (No output in command prompt; system restarts immediately)

Other restart options:

  • shutdown /r /t 60 → Restart after 60 seconds with a warning.

    • Explanation: /t followed by seconds delays the action, giving users time to save their work.
    • Example: shutdown /r /t 300 → Restart after 5 minutes (300 seconds)
  • shutdown /r /f /t 0 → Force restart immediately, closing all applications.

    • Explanation: /f forces running applications to close without saving prompts.
    • ⚠️ Warning: Unsaved work will be lost.

Linux/macOS

sudo reboot → Restart the system immediately.

  • Explanation: Simple and straightforward reboot command.
  • Sample output: (System restarts immediately)

sudo shutdown -r now → Restart immediately.

  • Explanation: Alternative syntax using shutdown command with restart flag.
  • Sample output: (System restarts immediately)

Other restart options:

  • sudo shutdown -r +5 → Restart after 5 minutes.
    • Broadcasts a warning message to all logged-in users.
  • sudo shutdown -r 23:00 → Restart at 11:00 PM.
    • Schedule restart for a specific time.
  • sudo systemctl reboot → Restart using systemd (modern Linux distributions).

24.2 System Shutdown (Power Off)

Windows

shutdown /s /t 0 → Shutdown immediately (power off).

  • Explanation: /s shuts down the system completely, /t 0 means immediate action.
  • Sample output: (No output; system shuts down immediately)

Other shutdown options:

  • shutdown /s /t 60 → Shutdown after 60 seconds.
    • Gives users time to save work and close applications.
  • shutdown /s /f /t 0 → Force shutdown immediately.
    • Closes all applications without prompts.

Linux/macOS

sudo shutdown -h now → Shutdown immediately.

  • Explanation: -h means halt (shutdown and power off).
  • Sample output: (System shuts down immediately)

sudo poweroff → Shutdown the system.

  • Explanation: Direct power off command.

Other shutdown options:

  • sudo shutdown -h +10 → Shutdown after 10 minutes.
  • sudo shutdown -h 22:30 → Shutdown at 10:30 PM.
  • sudo systemctl poweroff → Shutdown using systemd.
  • sudo halt → Stop all CPU functions (halt the system).

24.3 Hibernate

Windows

shutdown /h → Hibernate the system.

  • Explanation: /h puts the computer into hibernation mode, saving the current state to disk and powering off.
  • Sample output: (System hibernates)
  • Difference from sleep: Hibernation saves to disk and uses no power; sleep keeps state in RAM and uses minimal power.

Linux

sudo systemctl hibernate → Hibernate the system (requires swap partition).

  • Note: Hibernation must be properly configured on Linux systems.

sudo pm-hibernate → Hibernate using pm-utils (older systems).

24.4 Log Off / Log Out

Windows

shutdown /l → Log off the current user.

  • Explanation: /l logs out the current user session without shutting down the computer.
  • Sample output: (User is logged out, login screen appears)

Linux/macOS

gnome-session-quit --logout --no-prompt → Log out (GNOME desktop).

pkill -KILL -u username → Force log out a specific user.

  • ⚠️ Warning: This forcefully terminates all processes for the user.

exit or logout → Log out of current terminal session.

24.5 Abort Scheduled Shutdown

Windows

shutdown /a → Abort a scheduled shutdown.

  • Explanation: Cancels a scheduled shutdown or restart, stopping it before it completes.
  • Use case: If you initiated shutdown /r /t 300 but changed your mind.
  • Sample output:
    The logoff is cancelled.
    The scheduled shutdown has been cancelled.

Linux/macOS

sudo shutdown -c → Cancel a scheduled shutdown.

  • Sample output:
    Shutdown cancelled.

24.6 Force Close Applications

Windows

shutdown /r /f /t 0 → Restart and force close all applications.

  • Explanation: /f forces applications to close without save prompts.
  • ⚠️ Warning: Any unsaved work will be lost.

taskkill /F /IM process.exe → Force kill a process by name.

  • Explanation: Equivalent to kill -9 in Linux. /F means force, /IM specifies image name (process name).
  • Example: taskkill /F /IM chrome.exe to force close Chrome.
  • Sample output:
    SUCCESS: The process "chrome.exe" with PID 1234 has been terminated.

taskkill /F /PID 1234 → Force kill a process by PID.

  • Example: taskkill /F /PID 1234

Linux/macOS

kill -9 PID → Force kill a process by PID.

  • Example: kill -9 1234

killall -9 process_name → Force kill all processes by name.

  • Example: killall -9 chrome to kill all Chrome processes.

pkill -9 process_name → Force kill processes matching pattern.

  • Example: pkill -9 node to kill all Node.js processes.

24.7 Windows Services Management

Windows

net stop service_name → Stop a Windows service.

  • Example: net stop "Windows Update" to stop Windows Update service.
  • Sample output:
    The Windows Update service is stopping.
    The Windows Update service was stopped successfully.

net start service_name → Start a Windows service.

  • Example: net start "Windows Update"

sc query service_name → Query service status.

  • Example: sc query wuauserv to check Windows Update service status.

sc stop service_name → Stop a service (alternative method).

sc start service_name → Start a service (alternative method).

Linux

sudo systemctl stop service_name → Stop a service.

  • Example: sudo systemctl stop nginx to stop Nginx web server.
  • Sample output:
    Stopping nginx service...

sudo systemctl start service_name → Start a service.

  • Example: sudo systemctl start nginx

sudo systemctl restart service_name → Restart a service.

  • Example: sudo systemctl restart nginx

sudo systemctl status service_name → Check service status.

  • Example: sudo systemctl status nginx
  • Sample output:
    ● nginx.service - A high performance web server
       Loaded: loaded (/lib/systemd/system/nginx.service; enabled)
       Active: active (running) since Mon 2023-10-16 10:00:00 UTC

sudo systemctl enable service_name → Enable service to start on boot.

sudo systemctl disable service_name → Disable service from starting on boot.

Older Linux systems (using service command):

  • sudo service nginx stop → Stop service.
  • sudo service nginx start → Start service.
  • sudo service nginx restart → Restart service.
  • sudo service nginx status → Check status.

24.8 System File and Disk Management

Windows

sfc /scannow → Scan and repair system files.

  • Explanation: System File Checker scans all protected system files and replaces corrupted files with cached copies.
  • Note: Must be run as Administrator.
  • Example: sfc /scannow
  • Sample output:
    Beginning system scan. This process will take some time.

    Beginning verification phase of system scan.
    Verification 100% complete.

    Windows Resource Protection found corrupt files and successfully repaired them.
  • Use case: Fix corrupted Windows system files after malware removal or system crashes.

chkdsk /f → Check disk for errors and fix them.

  • Explanation: Checks the file system and file system metadata of a volume for logical and physical errors. /f fixes errors on the disk.
  • Example: chkdsk C: /f to check and fix C: drive.
  • Sample output:
    The type of the file system is NTFS.

    Chkdsk cannot run because the volume is in use by another process.
    Would you like to schedule this volume to be checked the next time the system restarts? (Y/N)
  • Note: Usually requires restart if checking the system drive.

chkdsk /r → Check disk and recover readable information from bad sectors.

  • Explanation: /r locates bad sectors and recovers readable information. Implies /f.
  • Example: chkdsk D: /r

DISM /Online /Cleanup-Image /RestoreHealth → Repair Windows image.

  • Explanation: Deployment Image Servicing and Management tool repairs Windows component store corruption.
  • Use case: Run this before sfc /scannow if system files are severely corrupted.

Linux

sudo fsck /dev/sda1 → Check and repair filesystem.

  • Explanation: File System Consistency Check. Equivalent to chkdsk in Windows.
  • ⚠️ Warning: Filesystem must be unmounted or system must be in single-user mode.
  • Example: sudo fsck /dev/sda1 to check partition sda1.

sudo fsck -y /dev/sda1 → Automatically repair without prompts.

  • Explanation: -y answers "yes" to all prompts automatically.

sudo badblocks -v /dev/sda → Scan for bad blocks on disk.

  • Explanation: Tests disk for bad sectors.
  • Example: sudo badblocks -v /dev/sda

sudo e2fsck -f /dev/sda1 → Force check ext2/ext3/ext4 filesystem.

  • Explanation: Specifically for ext filesystems. -f forces check even if filesystem appears clean.

sudo xfs_repair /dev/sda1 → Repair XFS filesystem.

  • Explanation: For XFS filesystems specifically.

24.9 System Information

Windows

systeminfo → Display detailed system configuration information.

  • Sample output:
    Host Name:                 DESKTOP-ABC123
    OS Name:                   Microsoft Windows 11 Pro
    OS Version:                10.0.22000 Build 22000
    System Manufacturer:       Dell Inc.
    System Model:              XPS 15 9500
    Processor(s):              1 Processor(s) Installed.
                               [01]: Intel64 Family 6 Model 165
    Total Physical Memory:     16,384 MB

wmic cpu get name → Get CPU information.

wmic memorychip get capacity → Get RAM information.

Linux/macOS

uname -a → Display all system information.

  • Sample output:
    Linux hostname 5.15.0-58-generic #64-Ubuntu SMP x86_64 GNU/Linux

lsb_release -a → Display distribution information (Linux).

cat /proc/cpuinfo → Display CPU information.

free -h → Display memory usage.

df -h → Display disk space usage.

lscpu → Display detailed CPU architecture information.

lspci → List all PCI devices.

lsusb → List all USB devices.

hostnamectl → Display system hostname and related information.

24.10 Quick Reference Table

| Action | Windows | Linux/macOS |
| --- | --- | --- |
| Restart immediately | shutdown /r /t 0 | sudo reboot |
| Restart after delay | shutdown /r /t 300 | sudo shutdown -r +5 |
| Shutdown immediately | shutdown /s /t 0 | sudo shutdown -h now |
| Hibernate | shutdown /h | sudo systemctl hibernate |
| Log off | shutdown /l | logout or exit |
| Cancel shutdown | shutdown /a | sudo shutdown -c |
| Force kill process | taskkill /F /IM app.exe | kill -9 PID |
| Stop service | net stop service | sudo systemctl stop service |
| Start service | net start service | sudo systemctl start service |
| Check disk | chkdsk /f | sudo fsck /dev/sda1 |
| Repair system files | sfc /scannow | N/A (distro-specific) |
| System info | systeminfo | uname -a |

Quick Reference Summary

Most Common Commands by Category

File Operations: ls, cat, head, tail, cp, mv, rm, touch

Text Processing: grep, sed, awk, cut, tr, sort, uniq, wc

Pipes & Chaining: |, >, >>, <, 2>, &>

Search: find, grep -r, locate

Archives: tar, gzip, zip, unzip, rsync

Version Control: git status, git log, git diff, git stash

Package Management: npm install, npm outdated, npm audit, npx

Process Management: ps, kill, jobs, bg, fg, top, lsof, Ctrl+C, Ctrl+Z

Monitoring: tail -f, watch, time, df, du

Data Processing: jq, awk, sort | uniq -c

Network (Windows): ipconfig, netsh, curl

System Management: shutdown, reboot, systemctl, service, taskkill, kill


Tips and Best Practices

  1. Always preview destructive operations: Use -n or --dry-run flags when available
  2. Use aliases: Create shortcuts for frequently used commands in your .bashrc or .zshrc
  3. Pipe commands: Combine simple commands with | for powerful workflows
  4. Learn regex: Many commands (grep, sed, awk) are much more powerful with regex
  5. Use tab completion: Most shells support tab completion for commands and file paths
  6. Read man pages: Use man command to view detailed documentation
  7. Be careful with rm -rf: Double-check paths before executing destructive commands
  8. Use version control: Commit changes before running bulk file operations
  9. Test on sample data: Try commands on a small subset before running on all files
  10. Chain commands safely: Use && to run next command only if previous succeeds

Total Commands Covered: 120+

Platform Coverage:

  • Unix/Linux/macOS: 90+ commands
  • Windows-specific: 30+ commands (with Unix/Linux equivalents noted)
  • Cross-platform system management commands included

Practice these commands to master the command line and become more efficient in your daily development tasks!