Stream operations fall mainly into 3 types: intermediate, terminal, and short-circuit operations.
The stream filter() method filters a stream's elements on the basis of a given lambda expression or predicate. The filter() method expects a lambda expression or predicate as an argument; the lambda expression returns a boolean (true/false) value, and based on that boolean value the filter method keeps or discards each element.
//Importing required classes
import java.util.*;
import java.io.*;
import java.util.stream.*;

public class FilterExample {
    public static void main(String args[]) {
        Integer[] numbers = { 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12 };
        //Filtering the even numbers from a collection
        List<Integer> evenList = Arrays.asList(numbers)
                .stream()
                //passing lambda expression in argument
                .filter(i -> i % 2 == 0)
                .collect(Collectors.toList());
        System.out.println("Filtered even numbers : " + evenList);
    }
}

Output:
Filtered even numbers : [2, 4, 6, 8, 10, 12]
The stream map() method is an intermediate operation. This method transforms elements of a stream by applying a mapping function to each of the elements and returns a new stream. It accepts a Function mapper object as an argument. This function describes how each element in the original stream is transformed into an element in the new stream.
//Importing required classes
import java.util.*;
import java.io.*;
import java.util.stream.*;

public class StreamMapExample {
    public static void main(String args[]) {
        Integer[] numbers = { 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12 };
        //Filtering the even numbers from a collection
        List<Integer> evenList = Arrays.asList(numbers).stream()
                //passing lambda expression in argument
                .filter(i -> i % 2 == 0)
                //multiplying each element by 2 after the filter
                .map(i -> i * 2)
                .collect(Collectors.toList());
        System.out.println("Filtered even numbers : " + evenList);
    }
}

Output:
Filtered even numbers : [4, 8, 12, 16, 20, 24]
Docker is an open-source container management platform used for developing, packaging, and deploying applications. It was first released in 2013.
You need to install the Docker Engine on your device. The Docker Engine allows you to create and manage Docker containers and images, and it works alongside tools such as Docker Hub and Docker Desktop.
A container is a runnable instance of an image: a lightweight, isolated package that contains the application and everything it needs to run.
Docker Compose is used to run multiple containers as a single service. You define and manage the multi-container application in a single YAML file, providing details for each service such as the image name, ports, username, and password (for example, for the database).
Example of docker-compose.yml:
version: '3.7'
services:
  app:
    image: openjdk:17-jdk-slim
    ports:
      - "8000:8000"
  redis:
    image: redis
    volumes:
      - ./data:/data
    ports:
      - "6379:6379"
  web:
    image: tomcat
    ports:
      - "8082:8080"
  db:
    image: postgres:14.1-alpine
    restart: always
    environment:
      - POSTGRES_USER=username
      - POSTGRES_PASSWORD=*******
    ports:
      - "5432:5432"
- Create a docker-compose.yaml file in the root directory of your application.
- Open a command prompt and go to the directory where you created the docker-compose.yaml file.
- Run the docker compose up command in the command prompt.
Redis stands for Remote Dictionary Server. It is an open-source NoSQL database. It does not have tables, rows, or columns, and it does not support statements such as SELECT, INSERT, or UPDATE.
It stores objects as key/value pairs and is mainly used as a cache to provide quick responses and increase application performance.
Various use cases of Redis
An in-memory database (IMDB) enables minimal response times, serving requests in microseconds. It is a good choice for applications that handle large amounts of traffic.
Advantages:
A cache provides data quickly, with low latency, to improve data retrieval performance. Redis is a good choice for caching API calls, session state, database query results, and so on.
Advantages:
When a request comes in, the logic first checks the Redis cache for the desired data. If the data is available in the cache, it is simply returned from Redis.
If the requested data is not found in the Redis cache, the logic falls back to the database (or the original data source) to retrieve the required information and return it. Subsequently, the fetched data is stored in the Redis cache for later requests.
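Below is a minimal cache-aside sketch of this flow. The RedisCache and EmployeeDao types are hypothetical stand-ins (a real application would use a Redis client such as Jedis or Spring's RedisTemplate and an actual DAO/repository); the in-memory map in main only simulates Redis for illustration.

import java.util.HashMap;
import java.util.Map;

public class CacheAsideExample {

    //Stand-in for a Redis client (e.g. Jedis or RedisTemplate in a real application)
    interface RedisCache { String get(String key); void put(String key, String value); }

    //Stand-in for the original data source (database)
    interface EmployeeDao { String findNameById(String id); }

    static String getEmployeeName(String id, RedisCache cache, EmployeeDao db) {
        //1. Check the Redis cache for the desired data
        String cached = cache.get(id);
        if (cached != null) {
            return cached;               //cache hit: return directly from Redis
        }
        //2. Cache miss: fall back to the database
        String fromDb = db.findNameById(id);
        //3. Store the fetched data in the cache for subsequent requests
        if (fromDb != null) {
            cache.put(id, fromDb);
        }
        return fromDb;
    }

    public static void main(String[] args) {
        Map<String, String> store = new HashMap<>();
        RedisCache cache = new RedisCache() {
            public String get(String key) { return store.get(key); }
            public void put(String key, String value) { store.put(key, value); }
        };
        EmployeeDao db = id -> "Employee-" + id;                 //pretend database lookup
        System.out.println(getEmployeeName("101", cache, db));   //miss -> database
        System.out.println(getEmployeeName("101", cache, db));   //hit -> cache
    }
}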
EhCache is an open-source, Java-based caching library used to cache data and increase performance. It is lightweight and flexible, and we can easily integrate it into an application.
A messaging system is the process of exchanging messages between two or more sources, endpoints, or servers. There are two common messaging models: point-to-point and publish/subscribe.
The stream collect() method accumulates the elements of a Stream into a Collection; it is a terminal operation. The Collectors class provides methods such as toList(), toSet(), toMap(), and toConcurrentMap() to collect the result of a Stream into a List, Set, Map, or ConcurrentMap in Java. The collect() method accepts a Collector as an argument to accumulate the elements of the Stream into the specified Collection, for example stream().collect(Collectors.toList()), stream().collect(Collectors.toSet()), stream().collect(Collectors.toMap(...)), or stream().collect(Collectors.counting()). It is used to perform different types of reduction operations, such as calculating the sum of numeric values, finding the minimum or maximum value, concatenating strings into a new string, or collecting elements into a new collection.
//Importing required classes
import java.util.ArrayList;
import java.util.List;
import java.util.stream.*;

public class CollectMethodExample {
    public static void main(String args[]) {
        //Declaring List
        List<Integer> intList = new ArrayList<Integer>();
        intList.add(6);
        intList.add(11);
        intList.add(14);
        intList.add(7);
        intList.add(3);
        intList.add(2);
        intList.add(16);
        intList.add(15);
        intList.add(19);
        List<Integer> evenNumList = intList.stream()
                .filter(i -> i % 2 == 0)
                //Converting filtered data to a List
                .collect(Collectors.toList());
        System.out.println("Even Num List " + evenNumList);
    }
}

Output:
Even Num List [6, 14, 2, 16]
The streaming process allows parallel execution of data, where one record is processed without waiting for the output of the previous record. In distributed systems we normally see streaming and parallel execution used to simplify tasks and improve performance: each thread/record executes without waiting for another.
Apache Kafka is an open-source, distributed messaging platform built around publish/subscribe events. It acts as a mediator between producer and consumer systems: producers send data to Kafka, and consumers consume data from Kafka.
Basically, Apache Kafka is a messaging system used to pass messages between applications; a messaging system is simply the exchange of messages between two or more sources, endpoints, or servers. Kafka can handle high volumes of data and enables us to send messages from one source to another.
Kafka was originally developed by LinkedIn and later it was donated to the Apache Software Foundation.
A Kafka cluster is a group of Kafka brokers that work together to handle the data or messages for the Kafka system. Each broker is a separate process that runs on a different machine and communicates with other brokers over the network.
A Kafka cluster uses ZooKeeper to maintain its cluster state.
A Kafka topic is a category or common name used to store and publish a stream of messages/data. Basically, data is stored in topics: messages are sent to a specific topic and read from a specific topic. Each topic has a name that must be unique across the entire Kafka cluster.
A topic can be split into several parts known as the partitions of the topic. We need to specify the number of partitions when creating a topic. Each message/data item stored in a partition gets an incremental id known as its offset.
A single Kafka server is called a Kafka broker. The broker is the container of topics; it holds several topics with their partitions. Each broker is a separate process that runs on a different machine and communicates with other brokers over the network.
A cluster can have multiple brokers; each broker is identified by a unique numeric ID and holds certain topic partitions. The topic partition data is distributed across all brokers (load balanced).
The main purpose of the Kafka consumer is to read data from a topic. The consumer fetches messages from the topic's partitions. The consumer requires a deserializer to transform bytes back into objects in order to read the key and value of each message.
If a consumer reads data from a single partition of a topic, the data/messages arrive in order: message 0, message 1, message 2, message 3, and so on.
If a consumer reads data from multiple partitions of a topic, there is no ordering guarantee across partitions.
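A minimal consumer sketch using the Apache Kafka client library (kafka-clients) is shown below; the bootstrap server address, topic name "demo-topic", and group id "demo-group" are illustrative assumptions.

import java.time.Duration;
import java.util.Collections;
import java.util.Properties;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;

public class SimpleConsumer {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092");
        props.put("group.id", "demo-group");
        props.put("key.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
        props.put("value.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");

        KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props);
        consumer.subscribe(Collections.singletonList("demo-topic"));
        while (true) {
            //poll() fetches messages from the assigned topic partitions
            ConsumerRecords<String, String> records = consumer.poll(Duration.ofMillis(500));
            for (ConsumerRecord<String, String> record : records) {
                System.out.println(record.partition() + " / " + record.offset() + " : " + record.value());
            }
        }
    }
}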
The Kafka producer is a client application that writes messages and sends events to a Kafka cluster. A producer sends messages to a topic, where messages are distributed to partitions according to a key-hashing process.
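A minimal producer sketch using the kafka-clients library is shown below; the bootstrap server address, topic name, key, and value are illustrative assumptions. The record key is what Kafka hashes to choose a partition.

import java.util.Properties;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;

public class SimpleProducer {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092");
        props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
        props.put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer");

        KafkaProducer<String, String> producer = new KafkaProducer<>(props);
        //The key is hashed to decide which partition of the topic receives the message
        producer.send(new ProducerRecord<>("demo-topic", "employee-101", "Hello Kafka"));
        producer.close();
    }
}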
The Stream API is designed to be efficient and can help improve the performance of programs. We can avoid unnecessary loops and iterations by using Stream functionality. Streams can be used for filtering, collecting, printing, and converting data from one structure to another, such as from one collection to another.
Example of filtering data using the Stream API:
import java.io.*;
import java.util.*;
import java.util.stream.*;

public class StreamFilterExample {
    public static void main(String args[]) {
        Integer[] numbers = { 1, 2, 3, 4, 5, 6, 7, 8, 9, 10 };
        //Filtering the even numbers from a collection
        List<Integer> evenList = Arrays.asList(numbers).stream()
                .filter(i -> i % 2 == 0)
                .collect(Collectors.toList());
        System.out.println("Even numbers after stream operation : " + evenList);
    }
}

Output:
Even numbers after stream operation : [2, 4, 6, 8, 10]
An interface that contains exactly one abstract method is known as a functional interface. It can have any number of default and static methods but only one abstract method. Functional interfaces are Java's approach to functional programming. They are also known as SAM (Single Abstract Method) interfaces.
/*@FunctionalInterface*/ //The annotation is optional
interface MultipleInterface {
    public int multiple(int x, int y);
}

public class Calculate {
    public static void main(String[] args) {
        //with lambda
        MultipleInterface m = (x, y) -> {
            return x * y;
        };
        int multipleVal = m.multiple(10, 30);
        System.out.println("Multiple Value is " + multipleVal);
    }
}

Output: Multiple Value is 300
The @FunctionalInterface annotation is optional, but it ensures that the functional interface cannot have more than one abstract method. If you add more than one abstract method, you get an "Unexpected @FunctionalInterface annotation" compile error. It is not mandatory to use this annotation.
Example of functional interface with @FunctionalInterface annotation
@FunctionalInterface //The annotation is optional
interface Area {
    int calculateArea(int x);
}

class Square {
    public static void main(String[] args) {
        //with lambda
        Area a = (x) -> {
            return x * x;
        };
        int areaVal = a.calculateArea(20);
        System.out.println("Area of Square is: " + areaVal);
    }
}

Output: Area of Square is: 400
The Consumer functional interface accepts a single argument and returns no value. It contains the abstract accept() method and a default andThen() method. It can be used as the assignment target for a lambda expression or method reference. An example of the Consumer interface is sketched below.
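A short sketch of java.util.function.Consumer showing accept() and andThen(); the printed strings are purely illustrative.

import java.util.function.Consumer;

public class ConsumerInterfaceExample {
    public static void main(String[] args) {
        //accept() consumes the argument without returning a value
        Consumer<String> print = s -> System.out.println("Hello " + s);
        //andThen() chains a second Consumer after the first one
        Consumer<String> printLength = s -> System.out.println("Length is " + s.length());
        print.andThen(printLength).accept("World");
    }
}

Output:
Hello World
Length is 5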
The Predicate functional interface accepts a single argument and returns a boolean (true/false) value. It provides filtering functionality: it filters stream elements on the basis of a given predicate. Examples of the Predicate interface methods appear further below.
The Function functional interface receives a single argument, processes it, and returns a value. In the example below, the Function takes an employee ID as the key and looks up the corresponding value in the map.
import java.util.HashMap;
import java.util.function.Function;

public class FunctionExample {
    private static HashMap<Integer, String> employeeMap = new HashMap<>();

    public static void main(String[] args) {
        Function<Integer, String> addFunc = (Integer id) -> {
            if (employeeMap.containsKey(id)) {
                return employeeMap.get(id);
            } else {
                return "Invalid ID";
            }
        };
        //Returns the employee name if the ID is present in the map, otherwise an "Invalid ID" message
        System.out.println(addFunc.apply(500));
    }
}

Output: Invalid ID
The Supplier functional interface is also a type of functional interface that does not take any input or argument and returns a single output. The Supplier interface takes only one generic type, the type of data it is going to return. get() is the abstract method of the Supplier.
import java.util.function.Supplier;

public class SupplierExample {
    public static void main(String[] args) {
        Supplier<String> supp = () -> "Hello World";
        System.out.println(supp.get());
    }
}

Output: Hello World
A lambda expression is similar to a Java method, but it does not need a name and it can be implemented directly in the body of a method.
It is an instance of a functional interface. An interface with a single abstract method is called a functional interface.
A lambda expression is a short block of code that takes parameters and returns a value.
Example of lambda expression
/*@FunctionalInterface*/ //The annotation is optional
interface MultipleInterface {
    public int multiple(int x, int y);
}

public class Calculate {
    public static void main(String[] args) {
        //with lambda
        MultipleInterface m = (x, y) -> {
            return x * y;
        };
        int multipleVal = m.multiple(10, 30);
        System.out.println("Multiple Value is " + multipleVal);
    }
}

Output: Multiple Value is 300
The accept() and andThen() methods are methods of the Consumer interface.
The Stream interface provides a sorted() method to sort a list. The sorted() method returns a stream sorted according to natural order. If the elements are not Comparable, it throws java.lang.ClassCastException.
Example of stream sorted method in Java.
//Importing required classes
import java.util.*;
import java.io.*;
import java.util.stream.*;

public class StreamSortedExample {
    public static void main(String args[]) {
        Integer[] numbers = { 100, 20, 13, 440, 5, 16, 7, 28, 2, 11, 1 };
        List<Integer> sortedList =
                Arrays.asList(numbers).stream()
                        .sorted()
                        .collect(Collectors.toList());
        System.out.println("Sorted list : " + sortedList);
    }
}
Output:
Sorted list : [1, 2, 5, 7, 11, 13, 16, 20, 28, 100, 440]
The reduce() method is a terminal operation in the Java Stream API. It produces a single value as the result by reduction, supporting operations such as sum, average, min, max, and count. The reduce() method applies a binary operator to each of the elements to reduce them to a single value. The return type of reduce() is T, or it can be an Optional<T>.
Optional<Integer> max = intList.stream().reduce((i, j) -> i > j ? i : j);
Example of Java Reduce Method
//Importing required classes
import java.util.ArrayList;
import java.util.List;
import java.util.Optional;
import java.util.stream.*;

public class ReduceMethodExample {
    public static void main(String args[]) {
        //Declaring List
        List<Integer> intList = new ArrayList<Integer>();
        intList.add(6);
        intList.add(11);
        intList.add(14);
        intList.add(7);
        intList.add(3);
        intList.add(2);
        intList.add(16);
        intList.add(15);
        intList.add(19);
        Optional<Integer> max = intList.stream()
                .reduce((i, j) -> i > j ? i : j);
        System.out.println(max);
        System.out.println(max.get());
    }
}

Output:
Optional[19]
19
The Predicate test() method returns true if the input argument matches the predicate, otherwise false.
Example of Predicate test() method.
import java.util.function.Predicate;

public class PredicateTestExample {
    public static void main(String[] args) {
        Predicate<Integer> pr = a -> (a > 10);
        //Calling Predicate method
        System.out.println(pr.test(20));
    }
}

Output: true
The Predicate isEqual() method is a static method of the Predicate interface that is used to test the equality of two objects. The object can be a string, integer, or a class object.
Example of Predicate isEqual() method.
import java.util.function.Predicate;

public class PredicateIsEqualExample {
    public static void main(String[] args) {
        Predicate<String> pr = Predicate.isEqual("MyAllText");
        //Calling Predicate method
        System.out.println(pr.test("AllText"));
        System.out.println(pr.test("MyAllText"));
        System.out.println(pr.test("MyAll"));
    }
}

Output:
false
true
false
The Predicate and() method composes two lambda expressions and represents a short-circuiting logical AND of the two predicates. It can be used to perform logical operations by combining two lambda expressions. See the example below, where and() is used to check whether a number lies within a numeric range.
Example of Predicate and() method
import java.util.function.Predicate;

public class PredicateAndExample {
    public static void main(String[] args) {
        Predicate<Integer> pr1 = x -> (x > 20);
        Predicate<Integer> pr2 = x -> (x < 80);
        //Calling Predicate method
        System.out.println(pr1.and(pr2).test(100));
        System.out.println(pr1.and(pr2).test(50));
        System.out.println(pr1.and(pr2).test(10));
    }
}

Output:
false
true
false
The Predicate negate() method inverts the result of a Predicate. It returns a predicate that represents the logical negation of this predicate.
import java.util.function.Predicate;

public class PredicateNegateExample {
    public static void main(String[] args) {
        Predicate<String> startsWithCharH = s -> s.startsWith("H");
        Predicate<String> hasLengthOfInt5 = s -> s.length() == 5;

        System.out.println("**Before Negating**");
        //Calling without negate method
        System.out.println(startsWithCharH.test("Horse"));
        System.out.println(hasLengthOfInt5.test("India"));

        //Here we are negating
        Predicate<String> negateStartsWithCharH = startsWithCharH.negate();
        Predicate<String> negateHasLengthOfInt5 = hasLengthOfInt5.negate();

        System.out.println("**After Negating**");
        //Calling the negated predicates
        System.out.println(negateStartsWithCharH.test("Horse"));
        System.out.println(negateHasLengthOfInt5.test("India"));
    }
}

Output:
**Before Negating**
true
true
**After Negating**
false
false
The Predicate or() method works similarly to and(), but it returns true if either of the predicates returns true.
Example of Predicate or() method
import java.util.function.Predicate;

public class PredicateOrExample {
    public static void main(String[] args) {
        Predicate<String> startsWithCharH = s -> s.startsWith("H");
        Predicate<String> hasLengthOfInt5 = s -> s.length() == 5;
        Predicate<String> startsWithCharHOrHasLengthOf5 = startsWithCharH.or(hasLengthOfInt5);

        System.out.println(startsWithCharHOrHasLengthOf5.test("Hero"));
        System.out.println(startsWithCharHOrHasLengthOf5.test("India"));
        System.out.println(startsWithCharHOrHasLengthOf5.test("Method"));
    }
}

Output:
true
true
false
The forEach() method is a default method defined in the Iterable and Stream interfaces. It is a terminal operation in the Stream API. Its purpose is to iterate over the elements of a collection one by one. The forEach() method accepts a lambda expression (a Consumer) as an argument.
//Importing required classes
import java.util.ArrayList;
import java.util.List;

public class ForEachExample {
    public static void main(String args[]) {
        //Declaring List
        List<String> countryList = new ArrayList<String>();
        countryList.add("Afghanistan");
        countryList.add("Albania");
        countryList.add("Algeria");
        countryList.add("Australia");
        countryList.add("India");
        countryList.add("Brazil");
        countryList.add("Japan");
        countryList.add("Italy");
        //forEach method to print the country names
        countryList.forEach(country -> System.out.println(country));
    }
}

Output:
Afghanistan
Albania
Algeria
Australia
India
Brazil
Japan
Italy
The stream count() method is a terminal operation in the Java Stream API. The count() method returns the number of elements in the stream; its return type is long. It is a reduction operation, and it may traverse the stream to produce the result.
//Importing required classes
import java.util.ArrayList;
import java.util.List;

public class CountMethodExample {
    public static void main(String args[]) {
        //Declaring List
        List<Integer> intList = new ArrayList<Integer>();
        intList.add(6);
        intList.add(11);
        intList.add(14);
        intList.add(7);
        intList.add(3);
        intList.add(2);
        intList.add(16);
        intList.add(15);
        intList.add(19);
        //calling count method
        long count = intList.stream().count();
        System.out.println(count);
    }
}

Output: 9
The findFirst() method is a short-circuit operation in the Java Stream API. It returns an Optional describing the first element of the stream, or an empty Optional if the stream is empty. Its signature is Optional<T> findFirst().
//Importing required classes
import java.util.ArrayList;
import java.util.List;
import java.util.Optional;
public class FindFirstExample {
public static void main(String args[]) {
List<String> strList = new ArrayList<String>();
strList.add("One");
strList.add("Two");
strList.add("Three");
strList.add("Four");
strList.add("Five");
Optional<String> firstElement = strList.stream().findFirst();
System.out.println(firstElement);
System.out.println(firstElement.get());
}
}
Output:
Optional[One]
One
ConcurrentHashMap is an enhancement of HashMap. It helps improve performance when multiple threads are used in an application. ConcurrentHashMap is thread-safe: multiple threads can access it simultaneously without synchronization issues. ConcurrentHashMap was introduced in JDK 1.5 and is available in the java.util.concurrent package.
Example of ConcurrentHashMap in Java
//Importing required classes
import java.util.concurrent.ConcurrentHashMap;

public class ConcurrentHashMapExample {
    public static void main(String args[]) {
        ConcurrentHashMap<String, Integer> map = new ConcurrentHashMap<String, Integer>();
        map.put("A1", 100);
        map.put("A2", 200);
        map.put("A3", 300);
        System.out.println("Map size is " + map.size());
        int valueA1 = map.get("A1");
        System.out.println("Value of A1 is " + valueA1);
        map.remove("A3");
        System.out.println("Map size " + map.size());
    }
}

Output:
Map size is 3
Value of A1 is 100
Map size 2
HashMap was introduced in JDK 1.2. HashMap in Java stores objects as key-value pairs, and values are accessed by their keys. If you insert a duplicate key, the old value for that key is replaced with the new value.
HashMap contains an array of Nodes, where Node is a class with the following attributes: the key's hash, the key, the value, and a reference (next) to the next Node.
HashMap uses a hashing process to store objects in buckets. First, you need to understand the hashing process and buckets.
Hashing is the process of converting an object into an integer form by using the hashCode() method.
The array of nodes is called the buckets. Each node has a data structure like a LinkedList, and more than one node can be in the same bucket. Every object stored in the HashMap is held in a node, and each node belongs to a bucket. So, basically, a bucket is a container of nodes.
hashCode() is a method of the Object class. It returns an integer value (by default derived from the object's memory reference), and this value is used to decide which bucket the object is stored in within the HashMap.
When we put an object into a HashMap and hashCode() returns a value whose bucket already contains an entry, the equals() method is used to compare the new key with the existing entry's key. If they are equal, the HashMap simply replaces the old value with the new value for that key. Otherwise, both objects are stored in the same bucket using a LinkedList data structure, where each node's next pointer refers to the other node.
Index = hashCode(key) & (n - 1), where n is the number of buckets. Internally, HashMap uses the bitwise AND operator; the binary result is then read as the decimal bucket index, as in the example below.
Example:
Suppose the binary value of hashCode(key) & (n-1) is 1010
1. length of 1010 = 4
2. decimal = 1*(2*2*2) + 0*(2*2) + 1*(2) + 0
3. decimal = 8 + 0 + 2 + 0
4. decimal = 10
The object will be stored in the 10th bucket.
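A small sketch of this index calculation is shown below; the key, the bucket count, and the class name are illustrative and not HashMap's actual source code.

public class BucketIndexExample {
    public static void main(String[] args) {
        int n = 16;                             //number of buckets (HashMap's default capacity)
        String key = "India";
        int index = key.hashCode() & (n - 1);   //bitwise AND keeps only the low-order bits
        System.out.println("Bucket index: " + index);
    }
}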
A Hashtable is an array of lists, where each list is known as a bucket. The Hashtable class contains unique keys and does not allow a null key or null values. The Hashtable class is synchronized. The default capacity of the Hashtable class is 11 and the default load factor is 0.75.
An enum is a special class that represents a group of constants.
import java.util.*;
enum months {
JANUARY, FEBRUARY, MARCH, APRIL, MAY, JUNE, JULY, AUGUST, SEPTEMBER, OCTOBER, NOVEMBER, DECEMBER
}
Map m = Collections.synchronizedMap(hashMap);
EnumSet provides a Set implementation for use with enum types.
import java.util.*;

enum months {
    JANUARY, FEBRUARY, MARCH, APRIL, MAY, JUNE, JULY, AUGUST, SEPTEMBER, OCTOBER, NOVEMBER, DECEMBER
}

public class EnumSetExample {
    public static void main(String[] args) {
        Set<months> set = EnumSet.of(months.JANUARY, months.FEBRUARY, months.MARCH);
        Iterator<months> iter = set.iterator();
        while (iter.hasNext())
            System.out.println(iter.next());
    }
}

Output:
JANUARY
FEBRUARY
MARCH
EnumMap provides a Map implementation for use with enum keys.
import java.util.*;

enum Months {
    JANUARY, FEBRUARY, MARCH, APRIL, MAY, JUNE, JULY, AUGUST, SEPTEMBER, OCTOBER, NOVEMBER, DECEMBER
}

public class EnumMapExample {
    public static void main(String[] args) {
        EnumMap<Months, String> map = new EnumMap<Months, String>(Months.class);
        map.put(Months.JANUARY, "1");
        map.put(Months.FEBRUARY, "2");
        map.put(Months.MARCH, "3");
        map.put(Months.APRIL, "4");
        map.put(Months.MAY, "5");
    }
}
Immutable class in java means, once an object is created we cannot change its value.
The wrapper classes such as Integer, Boolean, Byte, and Short, as well as the String class, are immutable.
Rules for creating a custom immutable class in Java:
- Declare the class as final so it cannot be extended.
- Make all fields private and final.
- Do not provide setter methods.
- Initialize all fields in the constructor, making deep copies of mutable objects.
- Return copies (not references) of mutable fields from getter methods.
Example of custom Immutable class in Java.
import java.util.Date;
import java.util.HashMap;
import java.util.Map;

final class BookAuthor {
    private final Integer authorId;
    private final String authorName;
    private final Date dob;
    private final Map<Integer, String> bookMap = new HashMap<Integer, String>();

    BookAuthor(Integer authorId, String authorName, Map<Integer, String> bookMap, Date dob) {
        this.authorId = authorId;
        this.authorName = authorName;
        //Cloning copy of the mutable Date
        this.dob = (Date) dob.clone();
        //Deep copy of the mutable Map
        for (var entry : bookMap.entrySet()) {
            this.bookMap.put(entry.getKey(), entry.getValue());
        }
    }

    public Integer getAuthorId() {
        return authorId;
    }

    public String getAuthorName() {
        return authorName;
    }

    public Date getDob() {
        //Return a copy so the internal Date cannot be modified from outside
        return (Date) dob.clone();
    }

    public Map<Integer, String> getBookMap() {
        //Return a copy so the internal Map cannot be modified from outside
        Map<Integer, String> tempMap = new HashMap<Integer, String>();
        for (var entry : this.bookMap.entrySet()) {
            tempMap.put(entry.getKey(), entry.getValue());
        }
        return tempMap;
    }
}
public class ComparableExample implements Comparable<ComparableExample> {
    private Integer id;
    private String name;
    private String email;
    private Integer age;

    @Override //Override compareTo method
    public int compareTo(ComparableExample obj) {
        if (age.equals(obj.age))
            return 0;
        else if (age > obj.age)
            return 1;
        else
            return -1;
    }
}
import java.util.Comparator;

public class Employee {
    private Integer id;
    private String name;
    private String email;
    private Integer age;

    public Employee(Integer id, String name, String email, Integer age) {
        this.id = id;
        this.name = name;
        this.email = email;
        this.age = age;
    }

    public Integer getId() {
        return id;
    }

    public String getName() {
        return name;
    }

    public String getEmail() {
        return email;
    }

    public Integer getAge() {
        return age;
    }
}

//A separate class needs to be created
class DemoComparatorExample implements Comparator<Employee> {
    @Override //Override the compare method
    public int compare(Employee obj1, Employee obj2) {
        if (obj1.getAge().equals(obj2.getAge()))
            return 0;
        else if (obj1.getAge() > obj2.getAge())
            return 1;
        else
            return -1;
    }
}
LinkedList is a linear data structure where the elements are not stored in contiguous locations; every element is a separate node holding the data and the address of the next node. The LinkedList class uses a doubly linked list to store the elements.
import java.io.IOException;
import java.util.LinkedList;
public class LinkedListExample {
public static void main(String[] args) throws IOException {
LinkedList<String> ll = new LinkedList<String>();
ll.add("A1");
ll.add("A2");
ll.addLast("A3");
ll.addFirst("A4");
ll.add(1, "A5");
System.out.println(ll);
//ll.pop(); pops the first element
//ll.poll(); retrieves and removes the head element
ll.add("A6");
ll.remove("A2");
//ll.remove(2);
ll.removeFirst();
ll.removeLast();
System.out.println(ll);
}
}
Output:
[A4, A5, A1, A2, A3]
[A5, A1, A3]
Java Spring Boot is an open-source, lightweight framework that helps develop applications easily because it requires minimal, annotation-based configuration. For example, to create a datasource you only need to add a few values in the properties file.
It has an embedded HTTP server, so we don't need to set up a separate server to run the application.
Spring Boot Actuator is a module that provides production-ready features to monitor and manage a Spring Boot application. It exposes many endpoints that are used for monitoring, health checks, auditing, and managing the application.
Swagger is used for the detailing and documenting of RESTful APIs.
JPA stands for Java Persistence API. JPA is used to persist data between Java objects and a relational database. JPA is only a specification; it needs ORM tools such as Hibernate, EclipseLink, or TopLink that implement the JPA specification to perform these tasks.
Java Persistence API is a collection of classes and methods to persist or store a vast amount of data in a database using ORM.
Spring Data JPA is part of the Spring Data family and builds on top of JPA; it reduces the amount of boilerplate code needed for common database operations such as create, read, update, and delete.
A Spring Data JPA repository is an extension of the core Spring Data Repository; it contains APIs for basic CRUD operations, pagination, and sorting.
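A minimal Spring Data JPA repository sketch is shown below; the Employee entity and the findByCity derived query method are illustrative assumptions (on Spring Boot 2.x the persistence annotations come from javax.persistence instead of jakarta.persistence).

import java.util.List;
import jakarta.persistence.Entity;
import jakarta.persistence.Id;
import org.springframework.data.jpa.repository.JpaRepository;

@Entity
class Employee {
    @Id
    private Long id;
    private String name;
    private String city;
}

interface EmployeeRepository extends JpaRepository<Employee, Long> {
    //Derived query method: Spring Data JPA generates the query from the method name
    List<Employee> findByCity(String city);
}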
HQL stands for Hibernate Query Language. It allows you to express database queries using entity and property names rather than table and column names.
Session session = sessionFactory.openSession();
try {
    String hql = "FROM Employee WHERE department.id = :deptID";
    Query<Employee> query = session.createQuery(hql, Employee.class);
    query.setParameter("deptID", 1);
    List<Employee> empList = query.list();
    for (Employee employee : empList) {
        System.out.println("Employee Name: " + employee.getName());
    }
} catch (Exception e) {
    e.printStackTrace();
} finally {
    session.close();
}
Session session = sessionFactory.openSession();
try {
    Criteria criteria = session.createCriteria(Employee.class);
    criteria.add(Restrictions.eq("city", "City Name"));
    criteria.add(Restrictions.between("age", 30, 40));
    List<Employee> empList = criteria.list();
    for (Employee emp : empList) {
        System.out.println("Employee Name: " + emp.getName());
    }
} catch (Exception e) {
    e.printStackTrace();
} finally {
    session.close();
}
sessionFactory.close();
Hibernate is an open-source, lightweight object-relational mapping (ORM) tool. It provides the functionality to map entity objects to database tables and automates database operations such as create, read, update, and delete.
A Hibernate Session is the interface between the application and the database; a SessionFactory is required to create a Session.
The SessionFactory in Hibernate is used to create and manage Session instances; it holds the database connection and configuration details and is typically created once per application.
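A minimal bootstrap sketch is shown below; it assumes a hibernate.cfg.xml on the classpath and a mapped Employee entity, both of which are illustrative assumptions rather than part of the original text.

import org.hibernate.Session;
import org.hibernate.SessionFactory;
import org.hibernate.cfg.Configuration;

public class HibernateBootstrapExample {
    public static void main(String[] args) {
        //SessionFactory is heavyweight: build it once and reuse it across the application
        SessionFactory sessionFactory = new Configuration().configure().buildSessionFactory();
        //Session is lightweight: open one per unit of work and close it afterwards
        Session session = sessionFactory.openSession();
        try {
            Employee emp = session.get(Employee.class, 1);  //load by primary key
            System.out.println(emp);
        } finally {
            session.close();
            sessionFactory.close();
        }
    }
}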
The main purpose of indexing is to improve database performance. An index is a database structure used to speed up database operations. A database table can have one or more indexes.
The database engine can use indexes to find records quickly. While an index speeds up data retrieval queries (SELECT statements), it slows down data modification queries (UPDATE and INSERT statements).
Unique indexes are used for data integrity as well as for performance.
Example of Unique Index.
CREATE UNIQUE INDEX index_name
on table_name (column_name);
A single-column index is created only on one table column.
Example of Single-Column Index.
CREATE INDEX index_name
ON table_name (column_name);
A composite index is an index that can be created on two or more columns of a table.
Example of Composite Index.
CREATE INDEX index_name
on table_name (column1, column2);
Implicit indexes are indexes that are automatically created by the database server when an object is created, for example for primary key and unique constraints.
Views in SQL are a kind of virtual table. A view also has rows and columns as they are on a real table in the database. We can create a view by selecting fields from one or more tables present in the database. A View can either have all the rows of a table or specific rows based on certain conditions.
Normalization reduces data redundancy and enhances data integrity in tables. It also helps organize the data in the database by removing duplicated data from the relational tables.
Column should contain only a single value, and each column should have a unique name.
Each row should be associated with a primary key.
BCNF is a stricter version of 3NF; the 3NF rules are followed in BCNF.
Denormalization is used for optimization, i.e. to increase performance. In denormalization, redundant data is added to one or more tables.
A constraint in SQL is a rule that applies restrictions to the data in a database. The rule is checked before data is inserted into the database, for example NOT NULL, UNIQUE, PRIMARY KEY, and FOREIGN KEY.
Data integrity is the process of ensuring the accuracy, completeness, consistency, and validity of an organization's data.
SELECT * FROM Employee WHERE name like 'M%';
SELECT upper(NAME) as EMPLOYEE_NAME from employee;
SELECT distinct city from EMPLOYEE;
OR
SELECT CITY FROM EMPLOYEE GROUP BY(CITY);
SELECT SUBSTRING(NAME, 1, 2) from EMPLOYEE;
SELECT * FROM EMPLOYEE ORDER BY NAME, CITY DESC;
SELECT * FROM EMPLOYEE where name like '%v';
SELECT CITY, COUNT(*) AS COUNT_CITY FROM EMPLOYEE WHERE CITY='Mumbai' GROUP BY CITY;
SELECT CONCAT(first_name, ' ', last_name) as FULL_NAME FROM EMPLOYEE;
SELECT CITY, COUNT(city) FROM EMPLOYEE group by CITY order by COUNT(city);
SELECT EMPLOYEE.FIRST_NAME, EMPLOYEE.LAST_NAME,
SALARY.SALARY_AMOUNT, SALARY.SALARY_DATE
FROM EMPLOYEE
INNER JOIN
SALARY ON EMPLOYEE.ID=SALARY.ID;
SELECT * FROM Employee ORDER BY salary DESC LIMIT 2, 1;
CREATE TABLE new_table_name AS SELECT * FROM Employee;
SELECT AVG(salary) FROM employee;
SELECT CITY, AVG(salary) FROM employee GROUP BY CITY;
SELECT CITY FROM employee GROUP BY CITY HAVING COUNT(CITY) = 1;
The Singleton design pattern ensures that only one instance of a class exists. The application can access that single instance from multiple places.
Example of Singleton class in Java.
public class SingletonDesignPatternExample {

    private static volatile SingletonDesignPatternExample obj;

    private SingletonDesignPatternExample() {}

    public static SingletonDesignPatternExample getInstance() {
        if (obj == null) {
            synchronized (SingletonDesignPatternExample.class) {
                //Double-checked locking: re-check inside the synchronized block
                if (obj == null) {
                    obj = new SingletonDesignPatternExample();
                }
            }
        }
        return obj;
    }
}
The SOLID design principles are below.
- Single Responsibility Principle
- Open/Closed Principle
- Liskov Substitution Principle
- Interface Segregation Principle
- Dependency Inversion Principle
Jenkins is a CI/CD (continuous integration and continuous delivery) tool that helps with the delivery of projects. Jenkins is installed on a server where we can run the central build process.