TOP SQL Interview Queries
Interview Questions and Answers
Interview Preparation - Shell Scripting, Java, Autosys, SDLC
Table of Contents
Shell Scripts
AutoSys & Job Scheduling
Java Programming
Python Programs
Unix, Git, CI/CD, SDLC
Shell Scripts
1. Find Second Largest + Second Smallest and Add Them
#!/bin/bash
arr=(5 1 9 6 1 2 8)
sorted=($(printf "%s\n" "${arr[@]}" | sort -n | uniq))
second_smallest=${sorted[1]}
second_largest=${sorted[-2]}
sum=$((second_smallest + second_largest))
echo "Sum: $sum"
ALTER Keyword Usage

Rename table
Syntax: ALTER TABLE old_table_name RENAME TO new_table_name;
Example: ALTER TABLE employees RENAME TO staff;

Add column
Syntax: ALTER TABLE table_name ADD column_name datatype;
Example: ALTER TABLE employees ADD salary DECIMAL(10,2);

Drop column
Syntax: ALTER TABLE table_name DROP COLUMN column_name;
Example: ALTER TABLE employees DROP COLUMN salary;
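The ALTER statements above can be tried without a database server using Python's built-in sqlite3 module. This is a minimal sketch; the employees table and its columns are illustrative, and note that SQLite only added DROP COLUMN in version 3.35, so this sketch sticks to ADD and RENAME.

```python
import sqlite3

# In-memory database to exercise the ALTER syntax shown above.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE employees (id INTEGER, name TEXT)")

# Add column
cur.execute("ALTER TABLE employees ADD salary DECIMAL(10,2)")

# Rename table
cur.execute("ALTER TABLE employees RENAME TO staff")

# Verify: list the columns of the renamed table.
cols = [row[1] for row in cur.execute("PRAGMA table_info(staff)")]
print(cols)  # ['id', 'name', 'salary']
conn.close()
```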
SQL and Database Interview Questions and Answers
1. What is normalization?
Normalization is the process of organizing data in a database to reduce redundancy and improve data integrity.
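To make the idea concrete, here is a small sketch using Python's stdlib sqlite3. A denormalized orders table repeats the customer's city on every order row; splitting customers into their own table stores each fact once. The table and column names are made up for illustration.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Denormalized: the customer's city is repeated on every order row,
# so changing a city means updating many rows (update anomaly).
cur.execute("CREATE TABLE orders_flat (order_id INTEGER, customer TEXT, city TEXT)")
cur.executemany("INSERT INTO orders_flat VALUES (?, ?, ?)",
                [(1, "Asha", "Pune"), (2, "Asha", "Pune"), (3, "Ravi", "Delhi")])

# Normalized: each fact is stored once; orders reference customers by id.
cur.execute("CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT, city TEXT)")
cur.execute("CREATE TABLE orders (order_id INTEGER, customer_id INTEGER)")
cur.executemany("INSERT INTO customers VALUES (?, ?, ?)",
                [(1, "Asha", "Pune"), (2, "Ravi", "Delhi")])
cur.executemany("INSERT INTO orders VALUES (?, ?)", [(1, 1), (2, 1), (3, 2)])

# "Pune" is stored twice in the flat table, once after normalization.
flat_cities = cur.execute("SELECT COUNT(*) FROM orders_flat WHERE city = 'Pune'").fetchone()[0]
norm_cities = cur.execute("SELECT COUNT(*) FROM customers WHERE city = 'Pune'").fetchone()[0]
print(flat_cities, norm_cities)  # 2 1
conn.close()
```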
2. Difference between TRUNCATE and DELETE
TRUNCATE removes all rows at once with minimal logging and cannot take a WHERE clause; DELETE removes rows one at a time, logs each row, supports WHERE filtering, and fires row-level triggers. TRUNCATE is DDL and typically resets identity counters; in databases such as Oracle and MySQL it implicitly commits and cannot be rolled back, whereas DELETE is DML and can be rolled back within a transaction.
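The DELETE side of this comparison can be demonstrated with Python's stdlib sqlite3 (SQLite has no TRUNCATE statement, so this sketch only shows that DELETE is transactional and reversible):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE t (x INTEGER)")
conn.executemany("INSERT INTO t VALUES (?)", [(1,), (2,), (3,)])
conn.commit()

conn.execute("DELETE FROM t")  # row-by-row, logged, reversible
conn.rollback()                # undo the delete

remaining = conn.execute("SELECT COUNT(*) FROM t").fetchone()[0]
print(remaining)  # 3 -- the deleted rows came back
conn.close()
```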
All Interview Questions and Answers
Shell Script Questions
1. Find Second Largest and Second Smallest and Add
arr=(12 13 3 4 1 6 9 17 13)
unique_arr=($(echo "${arr[@]}" | tr ' ' '\n' | sort -n | uniq))
second_smallest=${unique_arr[1]}
second_largest=${unique_arr[-2]}
sum=$((second_smallest + second_largest))
echo "Sum: $sum"
AutoSys Job Status Cheat Sheet
PySpark Interview Preparation Guide
Day 1: PySpark Basics & Core Concepts
What is PySpark: Python API for Apache Spark used for large-scale data processing.
Spark Architecture: Consists of Driver, Executors, Cluster Manager.
RDD vs DataFrame vs Dataset: RDD is the low-level, untyped API; DataFrame organizes data into named columns and is optimized by the Catalyst optimizer; Dataset adds compile-time type safety but is available only in Scala and Java, not in PySpark.
PySpark Interview Questions and Answers
Question: Which method is used to create a temporary view on a DataFrame?
Answer: DataFrame.createOrReplaceTempView("view_name")
Explanation: This method registers the DataFrame as a temporary view that can be queried with Spark SQL.

Question: Which Spark Core operations are wide transformations?
Answer: Operations that require a shuffle across partitions, such as groupByKey, reduceByKey, join, distinct, and repartition.
Explanation: Wide transformations redistribute data between partitions (a shuffle), while narrow transformations such as map and filter operate within a single partition.