Friday, 27 December 2024

List and change into a different shell.

List all shells on macOS:

cat /etc/shells

Print the current shell:

echo $SHELL

Switch to the bash shell:

chsh -s /bin/bash

Then close your terminal and reopen it.


Monday, 23 December 2024

MVEL rule engine

 https://medium.com/@er.rameshkatiyar/implement-your-own-rule-engine-java8-springboot-mvel-5928474e1ba5
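The article wires MVEL expressions into a rule engine. The core idea (rules as data, each pairing a condition with an action, evaluated against a fact context) can be sketched in plain Java without the MVEL dependency; all names below are made up for illustration, and a Predicate stands in for a compiled MVEL expression:

```java
import java.util.List;
import java.util.Map;
import java.util.function.Predicate;

// Minimal rule-engine sketch: each rule pairs a condition with an action name.
// In the MVEL version the condition would be a string like "order.amount > 500"
// evaluated by MVEL; here a Predicate over the fact map stands in for it.
public class MiniRuleEngine {

    record Rule(String name, Predicate<Map<String, Object>> condition, String action) {}

    // Returns the action of the first rule whose condition matches the facts.
    static String fire(List<Rule> rules, Map<String, Object> facts) {
        return rules.stream()
                .filter(r -> r.condition().test(facts))
                .map(Rule::action)
                .findFirst()
                .orElse("NO_MATCH");
    }

    public static void main(String[] args) {
        List<Rule> rules = List.of(
                new Rule("bigOrder", f -> (int) f.get("amount") > 500, "APPLY_DISCOUNT"),
                new Rule("anyOrder", f -> true, "NO_DISCOUNT"));
        System.out.println(fire(rules, Map.of("amount", 700))); // APPLY_DISCOUNT
    }
}
```

Rule order matters here: the first matching rule wins, which is why the catch-all rule comes last.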



Thursday, 19 December 2024

Project code execution flow

Step 1: .idl file -> create the method -> then generate structs like the request and response body.

Step 2: event file -> declare the method name, like ignore or full.

Step 3: Barrister will then generate the interface and the request and response files in the code base.

Step 4: Implement the interface method and write your own logic.

Step 5: Pack your logic's response into the Barrister-generated response payload.

Step 6: Exception handling using Barrister.

Step 7: Write test cases.
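Step 1 above starts from an .idl file. A rough sketch of what one might look like, from memory of the Barrister docs (the struct, field, and method names here are hypothetical):

```idl
struct EchoRequest {
    message string
}

struct EchoResponse {
    message string
}

interface EchoService {
    echo(req EchoRequest) EchoResponse
}
```

Barrister reads this IDL and generates the interface plus request/response types that steps 3-5 refer to.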


Tuesday, 10 December 2024

JDBI JOIN example

import org.jdbi.v3.core.Jdbi;
import org.jdbi.v3.core.Handle;
import org.jdbi.v3.sqlobject.SqlObjectPlugin;

import java.util.List;

public class Main {

    public static void main(String[] args) {
        // Create a Jdbi instance using your database connection
        Jdbi jdbi = Jdbi.create("jdbc:h2:mem:test;DB_CLOSE_DELAY=-1");
        jdbi.installPlugin(new SqlObjectPlugin());

        // Use a handle to execute queries
        try (Handle handle = jdbi.open()) {
            // Create tables
            handle.execute("CREATE TABLE users (id INTEGER PRIMARY KEY, name VARCHAR)");
            handle.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, user_id INTEGER, order_date DATE, FOREIGN KEY (user_id) REFERENCES users(id))");
            handle.execute("CREATE TABLE products (id INTEGER PRIMARY KEY, name VARCHAR, price DECIMAL)");
            handle.execute("CREATE TABLE order_details (id INTEGER PRIMARY KEY, order_id INTEGER, product_id INTEGER, quantity INTEGER, FOREIGN KEY (order_id) REFERENCES orders(id), FOREIGN KEY (product_id) REFERENCES products(id))");

            // Insert test data
            handle.execute("INSERT INTO users (id, name) VALUES (1, 'John Doe')");
            handle.execute("INSERT INTO orders (id, user_id, order_date) VALUES (1, 1, '2023-01-01')");
            handle.execute("INSERT INTO products (id, name, price) VALUES (1, 'Product A', 10.0)");
            handle.execute("INSERT INTO order_details (id, order_id, product_id, quantity) VALUES (1, 1, 1, 2)");

            // Fetch order summaries by joining all four tables
            String sql = "SELECT u.name AS userName, o.order_date AS orderDate, p.name AS productName, od.quantity " +
                         "FROM users u " +
                         "JOIN orders o ON u.id = o.user_id " +
                         "JOIN order_details od ON o.id = od.order_id " +
                         "JOIN products p ON od.product_id = p.id";

            List<OrderSummary> summaries = handle.createQuery(sql)
                .mapToBean(OrderSummary.class)
                .list();

            for (OrderSummary summary : summaries) {
                System.out.println("User: " + summary.getUserName() +
                                   ", Order Date: " + summary.getOrderDate() +
                                   ", Product: " + summary.getProductName() +
                                   ", Quantity: " + summary.getQuantity());
            }
        }
    }
}
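The example maps rows with mapToBean(OrderSummary.class) but never shows the bean. A minimal version might look like this, assuming property names that match the column aliases in the SQL (userName, orderDate, productName, quantity):

```java
import java.sql.Date;

// Minimal bean for mapToBean(): JDBI matches the SQL column aliases
// to these properties through the getters/setters.
public class OrderSummary {
    private String userName;
    private Date orderDate;
    private String productName;
    private int quantity;

    public String getUserName() { return userName; }
    public void setUserName(String userName) { this.userName = userName; }

    public Date getOrderDate() { return orderDate; }
    public void setOrderDate(Date orderDate) { this.orderDate = orderDate; }

    public String getProductName() { return productName; }
    public void setProductName(String productName) { this.productName = productName; }

    public int getQuantity() { return quantity; }
    public void setQuantity(int quantity) { this.quantity = quantity; }
}
```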


Spring 4 tutorial

 https://www.dineshonjava.com/spring-4-tutorials-step-to-new-spring/

Bazel cheat sheet

 https://www.hackingnote.com/en/cheatsheets/bazel/


# Single target.
$ bazel build //go/util:util

# All targets under a directory and any subdirectories.
$ bazel build //go/...

# All targets in the repository.
$ bazel build //...

Monday, 9 December 2024

JDBI tutorial, an advanced alternative to JDBC

 https://jdbi.org/

https://jdbi.org/jdbi2/

Barrister RPC

 http://barrister.bitmechanic.com/docs.html

Java program on string manipulation

import java.util.HashMap;
import java.util.Map;
import java.util.stream.Collectors;

String name = "pawanpawan";

// Count occurrences of each character using streams
Map<Character, Long> charCount = name.chars()
        .mapToObj(c -> (char) c)
        .collect(Collectors.groupingBy(c -> c, Collectors.counting()));

// Print the string with duplicate characters removed
System.out.println(name.chars()
        .distinct()
        .mapToObj(c -> (char) c)
        .map(String::valueOf)
        .collect(Collectors.joining()));

charCount.entrySet().forEach(System.out::println);

// The same count done imperatively
Map<Character, Integer> characterCount = new HashMap<>();
for (char ch : name.toCharArray()) {
    if (characterCount.containsKey(ch)) {
        characterCount.put(ch, characterCount.get(ch) + 1);
    } else {
        characterCount.put(ch, 1);
    }
}
System.out.println("char count: " + characterCount);

Sunday, 29 September 2024

Singleton class, lazy and unbreakable

Lazy initialization singleton class:

package com.example.practice.student.service;

public class SingletonClassExample {
    private static volatile SingletonClassExample singletonClassExample;

    private SingletonClassExample() {
    }

    public static synchronized SingletonClassExample getInstance() {
        if (singletonClassExample == null) {
            singletonClassExample = new SingletonClassExample();
        }
        return singletonClassExample;
    }

    public void displayMsg() {
        System.out.println("call from singleton class");
    }

    public static void displayMsg1() {
        System.out.println("call from singleton class");
    }
}


For eager initialization, assign the object at the point of declaration of the static field:

private static final SingletonClassExample singletonClassExample = new SingletonClassExample();



Unbreakable singleton class:

package com.example.practice.student.service;

import java.io.Serializable;

public class UnbreakableSingClass implements Serializable, Cloneable {

    private static volatile UnbreakableSingClass instance;
    private static boolean objectCreated = false;

    // Guards against reflection: a second constructor call throws
    private UnbreakableSingClass() {
        if (objectCreated) {
            throw new RuntimeException("Object already created; use getInstance()");
        }
        objectCreated = true;
    }

    // Double-checked locking for lazy, thread-safe initialization
    public static UnbreakableSingClass getInstance() {
        if (instance == null) {
            synchronized (UnbreakableSingClass.class) {
                if (instance == null) {
                    instance = new UnbreakableSingClass();
                }
            }
        }
        return instance;
    }

    // Guards against cloning
    @Override
    protected Object clone() throws CloneNotSupportedException {
        throw new CloneNotSupportedException("Cloning of this singleton is not allowed.");
    }

    // Guards against deserialization creating a second instance
    protected Object readResolve() {
        return getInstance();
    }
}
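An even simpler unbreakable option, not shown above, is the enum-based singleton: the JVM guarantees exactly one instance and gives serialization, cloning, and reflection safety for free.

```java
// Enum-based singleton: the JVM guarantees a single INSTANCE,
// and it is inherently safe against serialization, cloning, and reflection.
public enum EnumSingleton {
    INSTANCE;

    public String displayMsg() {
        return "call from enum singleton";
    }
}
```

Usage: EnumSingleton.INSTANCE.displayMsg();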

Thursday, 19 September 2024

Java Collection methods.

Collection framework 

List Interface:-

List Interface extends Collection Interface
Collection Interface extends Iterable Interface

1 Iterable Interface Methods

iterator():-
forEach():-



Collection interface methods:-

List Interface methods:- 





Set interface methods:-



Map interface methods:-
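The headings above are placeholders; as a quick tour, here are some of the most commonly used methods from each interface (a runnable sketch, not an exhaustive list):

```java
import java.util.ArrayList;
import java.util.HashMap;
import java.util.HashSet;
import java.util.List;
import java.util.Map;
import java.util.Set;

public class CollectionMethodsTour {
    public static void main(String[] args) {
        // List: ordered, allows duplicates
        List<String> list = new ArrayList<>();
        list.add("a");                           // Collection.add
        list.add("b");
        list.add(0, "z");                        // List.add at an index
        System.out.println(list.get(0));         // element access by index
        System.out.println(list.contains("a"));  // Collection.contains
        list.remove("z");                        // Collection.remove

        // Set: no duplicates
        Set<String> set = new HashSet<>(list);
        set.add("a");                            // duplicate, ignored
        System.out.println(set.size());

        // Map: key/value pairs (not a Collection, but part of the framework)
        Map<String, Integer> map = new HashMap<>();
        map.put("a", 1);
        System.out.println(map.getOrDefault("missing", 0));
        System.out.println(map.containsKey("a"));

        // Iterable.forEach works on every Collection
        list.forEach(System.out::println);
    }
}
```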





Thursday, 12 September 2024

Class and inner class example

public class OuterClass2 {

    private String outerField = "Outer Class Field";

    public void outerMethod() {
        // Local inner class inside a method
        class LocalInnerClass {
            public void display() {
                System.out.println("Accessing outer class field from local inner class: " + outerField);
            }
        }

        // Creating an instance of the local inner class inside the method
        LocalInnerClass localInner = new LocalInnerClass();
        localInner.display();
    }

    public static void main(String[] args) {
        OuterClass2 outer = new OuterClass2();
        outer.outerMethod();
    }
}

Thursday, 22 August 2024

lambda java

 https://www.javaprogramto.com/2019/06/java8-lambda-expressions.html

Wednesday, 21 August 2024

Interview preparation topics

Object class and its methods.

Class and object

equals() and hashCode()

String vs new String()

StringBuilder vs StringBuffer


OOPs pillars

Interface

Abstraction


Functional interface interview questions

Consumer

Supplier

Function

BiFunction

Predicate<Integer> isEven = x -> x % 2 == 0;
Predicate<Integer> isNotNull = x -> x != null;
System.out.println(isNotNull.and(isEven).test(null)); // false (short-circuits before isEven)
System.out.println(isEven.test(2)); // true
System.out.println(isEven.test(3)); // false

Supplier<String> supplier = () -> String.valueOf(UUID.randomUUID());
System.out.println(supplier.get());

Consumer<String> toUpperCase = x -> System.out.println(x.toUpperCase());
toUpperCase.accept("Hello, world!");

Function<String, Integer> function = x -> x.length();
System.out.println(function.apply("test")); // 4

BiFunction<String, String, Integer> biFunction = (a, b) -> a.length() + b.length();
System.out.println(biFunction.apply("ravi", "bhushan")); // 11

BiConsumer<String, String> biConsumer = (a, b) -> System.out.println(a + b);
biConsumer.accept("lll", "kkk");

// BiSupplierWithReturn is a custom functional interface, e.g.:
// @FunctionalInterface interface BiSupplierWithReturn { List<Integer> apply(int a, int b); }
BiSupplierWithReturn biSupplierWithReturn = (a, b) -> List.of(a, b);
System.out.println(biSupplierWithReturn.apply(1, 3)); // [1, 3]


Friday, 9 August 2024

WebSocket programming

 https://www.toptal.com/java/stomp-spring-boot-websocket

https://spring.io/guides/gs/messaging-stomp-websocket

Thursday, 8 August 2024

Monday, 5 August 2024

strategy design pattern example in java

 

The Strategy Design Pattern is a behavioral pattern that enables us to select an algorithm’s behavior at runtime. This pattern lets us define a set of algorithms, place them in different classes, and makes them interchangeable [1].

This is just a definition but let’s get a better understanding by knowing the problem that we are trying to solve.

The Problem

Let’s say you are working on a feature called File Parser. You need to write an API where you can upload a file and our system should be able to extract the data from it and persist them in the database. Currently we are asked to support CSV, JSON and XML files. Our immediate solution would look something like below.

 

@Service
public class FileParserService {

    public void parse(File file, String fileType) {
        if (Objects.equals(fileType, "CSV")) {
            // TODO : a huge implementation to parse CSV file and persist data in db
        } else if (Objects.equals(fileType, "JSON")) {
            // TODO : a huge implementation to parse JSON file and persist data in db
        } else if (Objects.equals(fileType, "XML")) {
            // TODO : a huge implementation to parse XML file and persist data in db
        } else {
            throw new IllegalArgumentException("Unsupported file type");
        }
    }
}
 

Everything looks good now from the business perspective but things will start getting uglier when we want to support more file types in the future. We start adding multiple else if blocks and the size of the class will quickly grow which will eventually become too hard to maintain. Any change to one of the implementations of the file parser will affect the whole class thereby increasing the chance of introducing a bug in an already working functionality.

Not only that, but there is another problem. Let's say we now need to additionally support sqlite and parquet file types. Two developers will step in and start working on the same huge class. It is highly likely that they will get merge conflicts, which is not only irritating for any developer but also time-consuming to resolve. Most importantly, even after the conflict resolution, there would be decreased confidence in the feature working as a whole.

 

The Solution

This is where the Strategy Design Pattern comes to our rescue. We will move all the file parser implementations to separate classes called strategies. In the current class, we shall dynamically fetch the appropriate implementation based on file type and execute the strategy.

Here’s a UML diagram to provide a high-level overview of the design pattern that we are about to implement.

 

Now, let’s just dive into the code.

We will need a class to maintain different file types supported. Later we will use this to create spring beans (i.e. strategies) with custom names.

public class FileType {
    public static final String CSV = "CSV";
    public static final String XML = "XML";
    public static final String JSON = "JSON";
}

Create an interface for our File Parser

public interface FileParser {
    void parse(File file);
}

Now that we have created an interface, let’s create different implementations for different file types i.e. strategies

@Service(FileType.CSV)
public class CsvFileParser implements FileParser {

    @Override
    public void parse(File file) {
        // TODO : impl to parse csv file
    }
}

@Service(FileType.JSON)
public class JsonFileParser implements FileParser {

    @Override
    public void parse(File file) {
        // TODO : impl to parse json file
    }
}

@Service(FileType.XML)
public class XmlFileParser implements FileParser {

    @Override
    public void parse(File file) {
        // TODO : impl to parse xml file
    }
}

Notice that we have given custom names for the above beans which will help us inject all these three beans to our required class.

Now we need to find a way to choose one of the above implementations based on file type during runtime.

Let’s create a FileParserFactory class. This class is responsible for deciding which implementation to choose, given a file type. We will leverage Spring Boot's dependency injection to fetch the appropriate strategy during runtime. (Refer to the comments in the code block below for more details, or [2].)

@Component
@RequiredArgsConstructor
public class FileParserFactory {

    /**
     * Spring Boot's dependency injection will construct this map for us,
     * including every FileParser implementation available, keyed by bean name.
     * Logically, the map looks like:
     * {
     *   "CSV": CsvFileParser,
     *   "XML": XmlFileParser,
     *   "JSON": JsonFileParser
     * }
     */
    private final Map<String, FileParser> fileParsers;

    /**
     * Returns the appropriate FileParser impl for a given file type
     * @param fileType one of the file types declared in class FileType
     * @return FileParser
     */
    public FileParser get(String fileType) {
        FileParser fileParser = fileParsers.get(fileType);
        if (Objects.isNull(fileParser)) {
            throw new IllegalArgumentException("Unsupported file type");
        }
        return fileParser;
    }
}

Now, let’s make changes to our FileParserService. We will use our FileParserFactory to fetch the appropriate FileParser based on the fileType and call the parse method.

@Service
@RequiredArgsConstructor
public class FileParserService {

    private final FileParserFactory fileParserFactory;

    public void parse(File file, String fileType) {
        FileParser fileParser = fileParserFactory.get(fileType);
        fileParser.parse(file);
    }
}

That’s it. We are done!
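Outside Spring, the same factory idea can be wired by hand with a plain map; this sketch (with hypothetical names and toy parsers) shows the pattern without the DI container, where the map plays the role of Spring's injected Map<String, FileParser>:

```java
import java.util.Map;

// Plain-Java strategy factory: strategies live in a map keyed by file type,
// and the factory looks one up at runtime instead of branching with if/else.
public class ParserFactoryDemo {

    interface Parser {
        String parse(String content);
    }

    // Toy strategies; in the real article these would be the CSV/JSON/XML parsers.
    static final Map<String, Parser> PARSERS = Map.<String, Parser>of(
            "CSV", content -> "parsed-csv:" + content,
            "JSON", content -> "parsed-json:" + content);

    static Parser get(String fileType) {
        Parser parser = PARSERS.get(fileType);
        if (parser == null) {
            throw new IllegalArgumentException("Unsupported file type: " + fileType);
        }
        return parser;
    }

    public static void main(String[] args) {
        System.out.println(get("CSV").parse("a,b,c")); // parsed-csv:a,b,c
    }
}
```

Adding a new file type means adding one map entry and one strategy class; the dispatch code never changes.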

 

 

 

Flexbox tutorial

 https://css-tricks.com/snippets/css/a-guide-to-flexbox/

Thursday, 1 August 2024

@JsonProperty at the DAO layer

@JsonProperty at the DAO layer will sometimes not work.

Spring's BeanPropertyRowMapper ignores Jackson's @JsonProperty: it maps result-set columns to bean properties by name, and it already knows how to match a snake_case column to the corresponding camelCase property.

If your column name is player_name in the table, and in the bean you write

@JsonProperty("player_name")

private String playerFirstName;


then the data will not map, because the property name playerFirstName does not correspond to player_name.

Name the field after the column: just replace the underscores with camelCase (playerName).



@NoArgsConstructor
@AllArgsConstructor
@Getter
@Setter
@Builder
public class DummyClass {

    @JsonProperty("leaderboard_requested_to_setcurrent")
    private String leaderboardRequestedToSetCurrent;

    @JsonProperty("workos_conn_id")
    private String workConnId;

    @JsonProperty("workos_org_id")
    private String workOrgId;

    @JsonProperty("workos_sso_enable")
    private Integer workSsoEnable;
}
When I am using @JsonProperty with

jdbcTemplate.queryForObject(sql, new BeanPropertyRowMapper<>(DummyClass.class), companyId);

I am getting null, but it works when I am using:

@NoArgsConstructor
@AllArgsConstructor
@Getter
@Setter
@Builder
public class DummyClass {

    private String leaderboard_requested_to_setcurrent;

    private String workos_conn_id;

    private String workos_org_id;

    private Integer workos_sso_enable;
}

 

 

 

 

Working class
@NoArgsConstructor
@AllArgsConstructor
@Getter
@Setter
@Builder
public class DummyClass {

    @JsonProperty("leaderboard_requested_to_setcurrent")
    private String leaderboardRequestedToSetCurrent;

    @JsonProperty("workos_conn_id")
    private String workosConnId;

    @JsonProperty("workos_org_id")
    private String workosOrgId;

    @JsonProperty("workos_sso_enable")
    private Integer workosSsoEnable;

    @JsonProperty("leaderboard_scheduled_on")
    private String leaderboardScheduledOn;
}
Thursday, 25 July 2024

Java JWT example

 

package com.codewalla.hrms.authentication.controller;

import io.jsonwebtoken.Claims;
import io.jsonwebtoken.Jws;
import io.jsonwebtoken.Jwts;
import io.jsonwebtoken.SignatureAlgorithm;

import java.security.SecureRandom;
import java.util.Base64;
import java.util.Date;

public class JWTExample {
    public static void main(String[] args) {
        // Generate a random 64-byte signing key, Base64-encoded
        int keyLengthBytes = 64;
        SecureRandom secureRandom = new SecureRandom();
        byte[] keyBytes = new byte[keyLengthBytes];
        secureRandom.nextBytes(keyBytes);
        String secretKey = Base64.getUrlEncoder().withoutPadding().encodeToString(keyBytes);

        // Build and sign the token
        String jws = Jwts.builder()
                .setSubject("user123")
                .setIssuer("example.com")
                .setIssuedAt(new Date())
                .setExpiration(new Date(System.currentTimeMillis() + 3600000)) // 1 hour expiration
                .signWith(SignatureAlgorithm.HS256, secretKey)
                .compact();

        System.out.println("Generated JWT: " + jws);

        // Parse and verify the token with the same key
        try {
            Jws<Claims> claimsJws = Jwts.parser()
                    .setSigningKey(secretKey)
                    .parseClaimsJws(jws);
            Claims claims = claimsJws.getBody();
            System.out.println("Subject: " + claims.getSubject());
            System.out.println("Issuer: " + claims.getIssuer());
            System.out.println("Expiration: " + claims.getExpiration());
        } catch (Exception e) {
            System.out.println("Invalid JWT: " + e.getMessage());
        }
    }
}

communication between two docker container

 

To enable communication between two Docker containers, you can use Docker networks. Docker networks allow containers to discover and communicate with each other. Here are the steps to set up communication between two Docker containers:

1. Create a Docker Network

First, create a custom Docker network. This will allow your containers to communicate with each other using their container names as hostnames.

sh
docker network create my_network
 

2. Run Containers on the Same Network

Run your containers and attach them to the network you just created.

For example, if you have two containers named container1 and container2:

sh
docker run -d --name container1 --network my_network my_image1
docker run -d --name container2 --network my_network my_image2

3. Verify Network Connectivity

You can verify that the containers are on the same network by running:

docker network inspect my_network
This command will show you details about the network, including the containers attached to it.

 

4. Test Communication

To test the communication between the containers, you can use the docker exec command to enter one container and ping the other.

For example, from container1, you can ping container2:

docker exec -it container1 ping container2
 

 

version: '3'
services:
  container1:
    image: my_image1
    networks:
      - my_network

  container2:
    image: my_image2
    networks:
      - my_network

networks:
  my_network:
    driver: bridge
 

 

 

Friday, 12 July 2024

Python Docker deployment

Dockerfile and build command

 

 FROM python:3.9-slim
WORKDIR /app
COPY ai-quiz-service/requirements.txt .
COPY config/dev.env /app/.env
RUN pip install --no-cache-dir -r requirements.txt
COPY app/ app/
ENV PYTHONPATH=/app
EXPOSE 8088
CMD ["uvicorn", "app.main:app", "--host", "0.0.0.0", "--port", "8088"]

 

docker build --build-arg ENV_FILE=app/config/dev.env -t myfastapi:dev .


# Use an official Python runtime as a parent image
FROM python:3.10

# Set the working directory in the container
WORKDIR /app

# Copy the current directory contents into the container at /app
COPY . /app

# Install any needed packages specified in requirements.txt
RUN pip install --no-cache-dir -r requirements.txt

# Define build argument for the environment file
ARG ENV_FILE

# Copy the environment file from the build context to the container
COPY ${ENV_FILE} /app/.env

# Make port 80 available to the world outside this container
EXPOSE 8088

# Ensure the PYTHONPATH includes the /app directory
ENV PYTHONPATH=/app

# Run app.py when the container launches
CMD ["uvicorn", "app.main:app", "--host", "0.0.0.0", "--port", "8088"]
 

 

 

 

Friday, 21 June 2024

git pull not working

 hint: You have divergent branches and need to specify how to reconcile them.
hint: You can do so by running one of the following commands sometime before
hint: your next pull:
hint:
hint:   git config pull.rebase false  # merge (the default strategy)
hint:   git config pull.rebase true   # rebase
hint:   git config pull.ff only       # fast-forward only
hint:
hint: You can replace "git config" with "git config --global" to set a default
hint: preference for all repositories. You can also pass --rebase, --no-rebase,
hint: or --ff-only on the command line to override the configured default per
hint: invocation.
fatal: Need to specify how to reconcile divergent branches.

git config pull.rebase false

Wednesday, 19 June 2024

Maximum number of requests handled by a Spring Boot application

 https://medium.com/@haiou-a/spring-boot-how-many-requests-can-spring-boot-handle-simultaneously-a57b41bdba6a

 

 

Concurrent Request Handling

Therefore, we conclude: by default, the number of requests that Spring Boot (with embedded Tomcat) can handle simultaneously = maximum connections (8192) + maximum accept-queue length (100) = 8292.
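Those embedded-Tomcat limits can be tuned in application.properties; a sketch with the standard Spring Boot property names (values shown are the defaults):

```properties
# Embedded Tomcat tuning (defaults shown)
server.tomcat.max-connections=8192
server.tomcat.accept-count=100
server.tomcat.threads.max=200
server.tomcat.threads.min-spare=10
```

Note that the worker-thread pool (200 by default) bounds how many requests are processed concurrently; the connection and accept-queue limits bound how many can be held open or queued.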

 

Java equals() and hashCode() example

 

public class TrophyCalculationPayload {
    private List<Integer> playerIds;
    private Integer companyId;
    private Integer gameId;
    private String gameType;
    private Integer point;
    private Integer time;
    private Integer highScore;
    private Integer gameSessionId;
    private String timeZone;
    private Integer servedQuestionCount;
    private Integer correctAnswerCount;

    // An all-args constructor (e.g. Lombok's @AllArgsConstructor) is assumed by main() below.

    @Override
    public boolean equals(Object other) {
        if (this == other) {
            return true;
        }
        if (other == null || getClass() != other.getClass()) {
            return false;
        }
        TrophyCalculationPayload otherPayload = (TrophyCalculationPayload) other;
        // Two payloads are equal when companyId and gameId match
        return Objects.equals(companyId, otherPayload.companyId)
                && Objects.equals(gameId, otherPayload.gameId);
    }

    @Override
    public int hashCode() {
        return Objects.hash(companyId, gameId);
    }
}
 
public static void main(String[] args) {
    TrophyCalculationPayload payload = new TrophyCalculationPayload(Arrays.asList(1, 2, 3), 1, 1, "game", 1, 1, 1, 1, "tz", 1, 1);
    TrophyCalculationPayload payload1 = new TrophyCalculationPayload(Arrays.asList(1, 2, 3), 1, 1, "game", 1, 1, 1, 1, "tz", 1, 1);
    TrophyCalculationPayload payload3 = payload1;
    String name = "pawan";
    String name1 = "pawan";
    String name2 = new String("pawan");
    String name3 = new String("pawan");

    System.out.println("--------" + (name == name1));     // true: both refer to the same interned literal
    System.out.println("--------" + name.equals(name1));  // true: content comparison
    System.out.println("--------" + name2.equals(name3)); // true: content comparison
    System.out.println("--------" + name.equals(name1));
    System.out.println("--------" + name1.equals(name));

    System.out.println("--------" + payload.equals(payload1));
    System.out.println("--------" + payload1.equals(payload3));
    System.out.println("----payload----" + payload.hashCode());
    System.out.println("----payload1----" + payload1.hashCode());
}
 
Note: the String class overrides equals() to compare content,
but Object's default equals() compares references.

So name.equals(name1) returns true; but for a user-defined class that
does not override equals(), comparing two equal-looking objects returns false,
because the default behavior inherited from Object is to compare object references.
Saturday, 25 May 2024

Interview question java practical

Take a string and remove duplicate chars.


import java.util.stream.Collectors;

public class Main {

    public static void main(String[] args) {
        String myName = "pawan";
        String removedName = removeDuplicates(myName);
        System.out.println(removedName); // pawn
    }

    // Convert the string to a stream of characters, remove duplicates, and collect back to a string
    public static String removeDuplicates(String str) {
        return str.chars()
                  .distinct()
                  .mapToObj(c -> String.valueOf((char) c))
                  .collect(Collectors.joining());
    }
}

Tuesday, 21 May 2024

MongoDB lookup example

 

To join three tables (or collections) in MongoDB, you can use the $lookup aggregation stage multiple times within an aggregation pipeline. This allows you to perform multiple left outer joins. Here’s an example scenario and how you can implement it.

Example Scenario

Let's assume we have three collections: orders, customers, and products.

orders Collection:

json
[ { "_id": 1, "order_id": "ORD001", "customer_id": 1, "product_id": 101, "amount": 500 }, { "_id": 2, "order_id": "ORD002", "customer_id": 2, "product_id": 102, "amount": 200 }, { "_id": 3, "order_id": "ORD003", "customer_id": 1, "product_id": 103, "amount": 300 }, { "_id": 4, "order_id": "ORD004", "customer_id": 3, "product_id": 101, "amount": 700 } ]

customers Collection:

json
[ { "_id": 1, "name": "John Doe", "email": "john@example.com" }, { "_id": 2, "name": "Jane Smith", "email": "jane@example.com" }, { "_id": 3, "name": "Mike Johnson", "email": "mike@example.com" } ]

products Collection:

json
[ { "_id": 101, "product_name": "Laptop", "price": 1000 }, { "_id": 102, "product_name": "Phone", "price": 500 }, { "_id": 103, "product_name": "Tablet", "price": 300 } ]

Joining the Collections

We want to create a report that includes orders along with the customer details and product details.

Aggregation Pipeline with Multiple $lookup Stages

Here’s how to use multiple $lookup stages to join the three collections:

javascript
db.orders.aggregate([
  {
    $lookup: {
      from: "customers",          // First join with the 'customers' collection
      localField: "customer_id",  // Field from 'orders' collection
      foreignField: "_id",        // Field from 'customers' collection
      as: "customer_info"         // Name of the array field to add
    }
  },
  { $unwind: "$customer_info" },  // Unwind the 'customer_info' array
  {
    $lookup: {
      from: "products",           // Second join with the 'products' collection
      localField: "product_id",   // Field from 'orders' collection
      foreignField: "_id",        // Field from 'products' collection
      as: "product_info"          // Name of the array field to add
    }
  },
  { $unwind: "$product_info" },   // Unwind the 'product_info' array
  {
    $project: {
      order_id: 1,
      amount: 1,
      "customer_info.name": 1,
      "customer_info.email": 1,
      "product_info.product_name": 1,
      "product_info.price": 1
    }
  }
])

Explanation:

  • First $lookup: Joins the orders collection with the customers collection using the customer_id field.
  • First $unwind: Deconstructs the resulting customer_info array so that each document contains a single customer object.
  • Second $lookup: Joins the resulting documents with the products collection using the product_id field.
  • Second $unwind: Deconstructs the resulting product_info array so that each document contains a single product object.
  • $project: Specifies the fields to include in the final output, creating a concise and readable result.

Result:

The result of the above aggregation will be:

json
[ { "_id": 1, "order_id": "ORD001", "amount": 500, "customer_info": { "name": "John Doe", "email": "john@example.com" }, "product_info": { "product_name": "Laptop", "price": 1000 } }, { "_id": 3, "order_id": "ORD003", "amount": 300, "customer_info": { "name": "John Doe", "email": "john@example.com" }, "product_info": { "product_name": "Tablet", "price": 300 } }, { "_id": 2, "order_id": "ORD002", "amount": 200, "customer_info": { "name": "Jane Smith", "email": "jane@example.com" }, "product_info": { "product_name": "Phone", "price": 500 } }, { "_id": 4, "order_id": "ORD004", "amount": 700, "customer_info": { "name": "Mike Johnson", "email": "mike@example.com" }, "product_info": { "product_name": "Laptop", "price": 1000 } } ]

In this result:

  • Each order document now includes the customer details (customer_info) and the product details (product_info).
  • The project stage ensures that only the relevant fields are included in the output.

Additional Tips:

  • Complex Joins: For more complex joins, you can use additional aggregation stages like $group, $match, or nested $lookup stages.
  • Optimizing Performance: Ensure that the fields used in joins are indexed to improve performance.
  • Pipeline Customization: Customize the pipeline stages to fit specific requirements, such as filtering, sorting, or additional calculations.

By using multiple $lookup stages in the aggregation pipeline, you can effectively join multiple collections and create comprehensive reports or views in MongoDB.