Wednesday, 18 October 2023
https://howtodoinjava.com/spring-webflux/webclient-get-post-example/
Tuesday, 17 October 2023
Java email send issue
https://stackoverflow.com/questions/21856211/javax-activation-unsupporteddatatypeexception-no-object-dch-for-mime-type-multi/21898970#21898970
Thread.currentThread().setContextClassLoader( getClass().getClassLoader() );
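In context, that line goes right before the message is sent, so the mail/activation data handlers resolve against the application's ClassLoader. A rough sketch (the message setup around it is my assumption, not part of the note):
import javax.mail.Transport;
import javax.mail.internet.MimeMessage;

public class MailSender {
    public void send(MimeMessage message) throws Exception {
        // work around "javax.activation.UnsupportedDataTypeException: no object DCH for MIME type multipart/*"
        // by resolving the mail/activation data handlers with this class's ClassLoader
        Thread.currentThread().setContextClassLoader(getClass().getClassLoader());
        Transport.send(message);
    }
}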
Tuesday, 12 September 2023
stream tutorial
https://stackabuse.com/search/?q=stream
https://www.logicbig.com/how-to/code-snippets/jcode-java-8-streams-collectors-groupingby.html
Tuesday, 22 August 2023
java8 collectors group
Grouping collectors are part of the Java Stream API's Collectors utility class. They allow you to group elements from a stream based on a specific criterion and perform various aggregation operations on the grouped elements. These collectors are particularly useful for creating maps or other data structures that organize data into groups.
Here are some commonly used grouping collectors:
Collectors.groupingBy(): This collector groups elements based on a classifier function and creates a map where the keys are the results of the classifier function and the values are lists of elements that match each key.
Example:
Map<String, List<Person>> peopleByCity = people.stream()
.collect(Collectors.groupingBy(Person::getCity));
Collectors.groupingByConcurrent(): similar to groupingBy(), this collector groups elements concurrently, making it suitable for parallel processing.
Collectors.partitioningBy(): This collector divides elements into two groups (true and false) based on a predicate and creates a map with Boolean keys.
Example:
Map<Boolean, List<Person>> adultsAndMinors = people.stream()
.collect(Collectors.partitioningBy(p -> p.getAge() >= 18));
Collectors.toMap(): while not a grouping collector in the traditional sense, toMap() can be used to group elements based on keys and values derived from the elements.
Example:
Map<String, Integer> nameToAge = people.stream()
.collect(Collectors.toMap(Person::getName, Person::getAge));
Collectors.groupingBy with downstream collectors: you can combine grouping collectors with downstream collectors like Collectors.counting(), Collectors.summingInt(), Collectors.mapping(), etc., to perform more complex aggregations on the grouped elements.
Example:
Map<String, Long> countByCity = people.stream()
.collect(Collectors.groupingBy(Person::getCity, Collectors.counting()));
These grouping collectors help you efficiently organize and analyze data based on specific criteria, making complex data manipulations more manageable. They are particularly useful when working with large datasets and performing aggregations on grouped data.
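If it helps to see them side by side, here is a small self-contained sketch; the Person class and the sample data are placeholders added for illustration, not from the original note:
import java.util.*;
import java.util.stream.*;

public class GroupingDemo {
    // hypothetical Person type used only to illustrate the collectors above
    static class Person {
        final String name; final String city; final int age;
        Person(String name, String city, int age) { this.name = name; this.city = city; this.age = age; }
        String getName() { return name; }
        String getCity() { return city; }
        int getAge() { return age; }
    }

    public static void main(String[] args) {
        List<Person> people = Arrays.asList(
            new Person("Ram", "Delhi", 25),
            new Person("Shyam", "Mumbai", 17),
            new Person("Mohan", "Delhi", 40));

        // groupingBy: city -> list of people
        Map<String, List<Person>> peopleByCity =
            people.stream().collect(Collectors.groupingBy(Person::getCity));

        // partitioningBy: true -> adults, false -> minors
        Map<Boolean, List<Person>> adultsAndMinors =
            people.stream().collect(Collectors.partitioningBy(p -> p.getAge() >= 18));

        // toMap: name -> age
        Map<String, Integer> nameToAge =
            people.stream().collect(Collectors.toMap(Person::getName, Person::getAge));

        // groupingBy + counting downstream collector: city -> number of people
        Map<String, Long> countByCity =
            people.stream().collect(Collectors.groupingBy(Person::getCity, Collectors.counting()));

        System.out.println(peopleByCity);
        System.out.println(adultsAndMinors);
        System.out.println(nameToAge);
        System.out.println(countByCity);
    }
}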
java8 classic code optimization example
public boolean changeEmployeeProject(ChangeEmployeeProjectPayload changeEmployeeProjectPayload) {
List<EmployeesProjectRowMapper> employeesProjectRowMappers = employeeDao.fetchEmployeeProjectAssignment(changeEmployeeProjectPayload.getEmployee_ids());
// Classic approach: group rows by employee id, then loop again to pull out the project ids.
if (!employeesProjectRowMappers.isEmpty()) {
Map<Integer, List<EmployeesProjectRowMapper>> employeeWithProjects = employeesProjectRowMappers.stream().collect(Collectors.groupingBy(EmployeesProjectRowMapper::getEmployee_id));
Map<Integer, List<Integer>> employeeWithProject = new HashMap<>();
employeeWithProjects.forEach((employeeId, rows) ->
employeeWithProject.put(employeeId, rows.stream().map(EmployeesProjectRowMapper::getProject_id).collect(Collectors.toList())));
System.out.println("grouped rows---" + employeeWithProjects);
System.out.println("employee to project ids---" + employeeWithProject);
}
// Optimized approach: one pass with groupingBy plus a mapping downstream collector.
if (!employeesProjectRowMappers.isEmpty()) {
Map<Integer, List<Integer>> employeeWithProject = employeesProjectRowMappers.stream()
.collect(Collectors.groupingBy(
EmployeesProjectRowMapper::getEmployee_id,
Collectors.mapping(EmployeesProjectRowMapper::getProject_id, Collectors.toList())
));
System.out.println("employeeWithProjects: " + employeeWithProject);
}
return true;
}
Tuesday, 27 June 2023
java8 stream tutorial
https://www.logicbig.com/how-to/code-snippets/jcode-java-8-streams-stream-allmatch.html
Thursday, 27 April 2023
Redis template
https://www.tabnine.com/code/java/methods/org.springframework.data.redis.core.RedisTemplate/opsForSet
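A quick sketch of what opsForSet usage from that link looks like in practice; the key name and the injected template are placeholders I've added:
import java.util.Set;
import org.springframework.data.redis.core.RedisTemplate;
import org.springframework.data.redis.core.SetOperations;
import org.springframework.stereotype.Component;

@Component
public class RedisSetExample {
    private final RedisTemplate<String, String> redisTemplate; // assumed to be configured elsewhere

    public RedisSetExample(RedisTemplate<String, String> redisTemplate) {
        this.redisTemplate = redisTemplate;
    }

    public void demo() {
        SetOperations<String, String> setOps = redisTemplate.opsForSet();
        setOps.add("online:users", "pawan", "ravi");                  // SADD
        Boolean isMember = setOps.isMember("online:users", "pawan");  // SISMEMBER
        Set<String> members = setOps.members("online:users");         // SMEMBERS
        System.out.println(isMember + " " + members);
    }
}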
Wednesday, 5 April 2023
Exception handling ways in springboot
https://springframework.guru/exception-handling-in-spring-boot-rest-api/
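The short version of the approach described there is a @RestControllerAdvice with @ExceptionHandler methods; a minimal sketch (the exception types and response bodies are placeholders, not from the article):
import org.springframework.http.HttpStatus;
import org.springframework.http.ResponseEntity;
import org.springframework.web.bind.annotation.ExceptionHandler;
import org.springframework.web.bind.annotation.RestControllerAdvice;

@RestControllerAdvice
public class GlobalExceptionHandler {

    // hypothetical mapping: bad input -> 400
    @ExceptionHandler(IllegalArgumentException.class)
    public ResponseEntity<String> handleBadRequest(IllegalArgumentException ex) {
        return ResponseEntity.status(HttpStatus.BAD_REQUEST).body(ex.getMessage());
    }

    // catch-all -> 500
    @ExceptionHandler(Exception.class)
    public ResponseEntity<String> handleGeneric(Exception ex) {
        return ResponseEntity.status(HttpStatus.INTERNAL_SERVER_ERROR).body("Unexpected error");
    }
}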
Thursday, 23 March 2023
spring batch example
Steps to create a Spring Batch application
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-batch</artifactId>
</dependency>
public class MyFirstJob implements Tasklet {
@Override
public RepeatStatus execute(StepContribution stepContribution, ChunkContext chunkContext) throws Exception {
System.out.println("MyTaskOne start..");
List<Integer> mylist= Arrays.asList(1,2,3,4,5,6,6,67,7,7,7,7);
for(Integer number :mylist){
System.out.println("-------------"+number);
}
// ... your code
System.out.println("MyTaskOne done..");
return RepeatStatus.FINISHED;
}
}
public class MySecondJob implements Tasklet {
@Override
public RepeatStatus execute(StepContribution stepContribution, ChunkContext chunkContext) throws Exception {
List<String>mylist= Arrays.asList("pawan","ravi","mynme");
for(String name: mylist){
System.out.println("name---"+name);
}
return RepeatStatus.FINISHED;
}
}
@Configuration
@EnableBatchProcessing
public class BatchConfig {
@Autowired
private JobBuilderFactory jobs;
@Autowired
private StepBuilderFactory steps;
@Bean
public Step stepOne() {
return steps.get("stepOne")
.tasklet(new MyFirstJob())
.build();
}
@Bean
public Step stepTwo() {
return steps.get("stepTwo")
.tasklet(new MySecondJob())
.build();
}
@Bean
public Job demoJob() {
return jobs.get("demoJob").incrementer(new RunIdIncrementer())
.start(stepOne())
.next(stepTwo())
.build();
}
}
@Configuration
public class AvoidMetadataConfiguration extends DefaultBatchConfigurer {
@Override
protected JobRepository createJobRepository() throws Exception {
MapJobRepositoryFactoryBean factoryBean = new MapJobRepositoryFactoryBean();
factoryBean.afterPropertiesSet();
return factoryBean.getObject();
}
}
@Component
public class SpringJobSchedular {
@Autowired
JobLauncher jobLauncher;
@Autowired
Job job;
@Scheduled(cron = "* 0/1 * * * ?")
public void perform() throws Exception
{
System.out.println("------------------myjob in ----------------------");
JobParameters params = new JobParametersBuilder()
.addString("JobID", String.valueOf(System.currentTimeMillis()))
.toJobParameters();
jobLauncher.run(job, params);
System.out.println("------------------myjob in ----------------------");
}
}
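One thing the scheduler above relies on: @Scheduled methods only fire if scheduling is enabled, so the main application class needs @EnableScheduling. A minimal sketch (the class name is a placeholder):
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.scheduling.annotation.EnableScheduling;

@SpringBootApplication
@EnableScheduling // required for the @Scheduled cron in SpringJobSchedular to run
public class BatchDemoApplication {
    public static void main(String[] args) {
        SpringApplication.run(BatchDemoApplication.class, args);
    }
}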
Creating a Docker image from scratch
1. Create docker file
Dockerfile
FROM openjdk
COPY ./target/myservice-0.0.1-SNAPSHOT.jar /usr/app/
WORKDIR /usr/app
RUN sh -c 'touch myservice-0.0.1-SNAPSHOT.jar'
EXPOSE 8089
ENV spring.profiles.active="local"
2. Create an image from the Dockerfile
sudo docker build -t mylocalapp:1.0 .
mylocalapp is the image name
1.0 is the tag
3. Create a container from the image
sudo docker run -d -p 9090:8089 --name pawancontainer mylocalapp:1.0
pawancontainer is the container name; host port 9090 maps to container port 8089 exposed above
mylocalapp:1.0 is the image name and 1.0 is the tag
4. sudo docker ps -a
lists all containers
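Note: the Dockerfile above copies the jar but never defines a command to start it, so the container has nothing to run the service with. Assuming the same jar name as above, the usual final line would be something like:
ENTRYPOINT ["java", "-jar", "myservice-0.0.1-SNAPSHOT.jar"]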
Monday, 20 March 2023
Create a docker image
https://devopscube.com/build-docker-image/
FROM ubuntu:18.04
LABEL maintainer="contact@devopscube.com"
RUN apt-get -y update && apt-get -y install nginx
COPY files/default /etc/nginx/sites-available/default
COPY files/index.html /usr/share/nginx/html/index.html
EXPOSE 80
CMD ["/usr/sbin/nginx", "-g", "daemon off;"]
Sunday, 19 March 2023
elastic search setup and query
https://reflectoring.io/spring-boot-elasticsearch
docker run -p 9200:9200 \
-e "discovery.type=single-node" \
docker.elastic.co/elasticsearch/elasticsearch:7.10.0
https://www.elastic.co/guide/en/elasticsearch/client/java-api-client/current/connecting.html
Documentation for elastic search
Elastic search code implementation
import co.elastic.clients.elasticsearch.ElasticsearchClient;
import co.elastic.clients.json.jackson.JacksonJsonpMapper;
import co.elastic.clients.transport.ElasticsearchTransport;
import co.elastic.clients.transport.rest_client.RestClientTransport;
import lombok.Builder;
import org.apache.http.Header;
import org.apache.http.HttpHost;
import org.apache.http.message.BasicHeader;
import org.elasticsearch.client.RestClient;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
@Builder
@Configuration
public class ElasticSearchConfig {
@Bean
public RestClient getRestClient() {
Header[] defaultHeaders = new Header[]{new BasicHeader("Content-type", "application/json"),new BasicHeader("X-Elastic-Product", "Elasticsearch")};
RestClient restClient = RestClient.builder(
new HttpHost("localhost", 9200)).setDefaultHeaders(defaultHeaders).build();
return restClient;
}
@Bean
public ElasticsearchTransport getElasticsearchTransport() {
return new RestClientTransport(
getRestClient(), new JacksonJsonpMapper());
}
@Bean
public ElasticsearchClient getElasticsearchClient() {
ElasticsearchClient client = new ElasticsearchClient(getElasticsearchTransport());
return client;
}
}
import co.elastic.clients.elasticsearch.ElasticsearchClient;
import co.elastic.clients.elasticsearch.core.*;
import co.elastic.clients.elasticsearch.core.search.Hit;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.stereotype.Repository;
import java.io.IOException;
import java.util.ArrayList;
import java.util.List;
import java.util.Objects;
@Repository
public class ElasticSearchQuery {
@Autowired
private ElasticsearchClient elasticsearchClient;
private final String indexName = "messages";
public String createOrUpdateDocument(Product product) throws IOException {
IndexResponse response = elasticsearchClient.index(i -> i
.index(indexName)
.id(product.getId())
.document(product)
);
if (response.result().name().equals("Created")) {
return "Document has been successfully created.";
} else if (response.result().name().equals("Updated")) {
return "Document has been successfully updated.";
}
return "Error while performing the operation.";
}
public Product getDocumentById(String productId) throws IOException {
Product product = null;
GetResponse<Product> response = elasticsearchClient.get(g -> g
.index(indexName)
.id(productId),
Product.class
);
if (response.found()) {
product = response.source();
System.out.println("Product name " + product.getName());
} else {
System.out.println("Product not found");
}
return product;
}
public String deleteDocumentById(String productId) throws IOException {
DeleteRequest request = DeleteRequest.of(d -> d.index(indexName).id(productId));
DeleteResponse deleteResponse = elasticsearchClient.delete(request);
if (Objects.nonNull(deleteResponse.result()) && !deleteResponse.result().name().equals("NotFound")) {
return new StringBuilder("Product with id " + deleteResponse.id() + " has been deleted.").toString();
}
System.out.println("Product not found");
return new StringBuilder("Product with id " + deleteResponse.id() + " does not exist.").toString();
}
public List<Product> searchAllDocuments() throws IOException {
SearchRequest searchRequest = SearchRequest.of(s -> s.index(indexName));
SearchResponse<Product> searchResponse = elasticsearchClient.search(searchRequest, Product.class);
List<Hit<Product>> hits = searchResponse.hits().hits();
List<Product> products = new ArrayList<>();
for (Hit<Product> hit : hits) {
System.out.print(hit.source());
products.add(hit.source());
}
return products;
}
}
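The repository above indexes a Product document that these notes never show; a minimal version, assuming fields that match the getters used above, might look like:
// Hypothetical Product document used by ElasticSearchQuery above
public class Product {
    private String id;
    private String name;
    private double price;

    public Product() {}
    public Product(String id, String name, double price) {
        this.id = id; this.name = name; this.price = price;
    }
    public String getId() { return id; }
    public void setId(String id) { this.id = id; }
    public String getName() { return name; }
    public void setName(String name) { this.name = name; }
    public double getPrice() { return price; }
    public void setPrice(double price) { this.price = price; }
}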
Dependencies required
<dependency>
<groupId>org.elasticsearch.client</groupId>
<artifactId>elasticsearch-rest-client</artifactId>
<version>8.6.2</version>
</dependency>
<!-- https://mvnrepository.com/artifact/org.springframework.data/spring-data-elasticsearch -->
<dependency>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-elasticsearch</artifactId>
</dependency>
<dependency>
<groupId>co.elastic.clients</groupId>
<artifactId>elasticsearch-java</artifactId>
<version>8.6.2</version>
</dependency>
Thursday, 16 March 2023
Spring AOP for logging
@Aspect
@Component
public class LogMyMethodAspect {
private static final Logger logger = LoggerFactory.getLogger(LogMyMethodAspect.class);
@Around("@annotation(LogMyMethod)")
public Object logMyMethod(ProceedingJoinPoint joinPointpawan) throws Throwable {
System.out.println("-----------------pawan----------------");
logger.info("service_identifier={},event={},game_session_id={},company_id={},player_id={},game_id={},action={},reason={},status={},data={}",
"CDS",
joinPointpawan.getSignature().getName(),
"",
"",
"",
"",
"refusing to start downstream processing",
"gameSessionID already processed",
"failed",
Arrays.stream(joinPointpawan.getArgs()).findFirst()
);
logger.info("service_identifier={},event={},game_session_id={},company_id={},player_id={},game_id={},action={},reason={},status={},data={}",
"CDS",
"submitEndGameDetails",
"",
"",
"",
"",
"refusing to start downstream processing",
"gameSessionID already processed",
"completed",
Arrays.stream(joinPointpawan.getArgs()).findFirst()
);
return joinPointpawan.proceed();
}
@Pointcut(value = "execution(* com.huddle.gameservice.service.impl.GameProfileServiceImpl.*(..))")
private void logDisplay()
{
System.out.println("---------------controller method called--------------------");
}
@Around("execution(* com.huddle.gameservice.service.impl.GameProfileServiceImpl.*(..))")
public void
logBefore(ProceedingJoinPoint joinPointpawan)
{
System.out.println(
".............I WILL EXECUTE BEFORE EACH AND EVERY METHOD.............");
logger.info("service_identifier={},event={},game_session_id={},company_id={},player_id={},game_id={},action={},reason={},status={},data={}",
(Object) "CDS",
joinPointpawan.getSignature().getName(),
"",
"",
"",
"",
"refusing to start downstream processing",
"gameSessionID already processed",
"failed",
Arrays.stream(joinPointpawan.getArgs()).collect(Collectors.toList()).toString()
);
// without calling proceed(), the advised service method would never run
return joinPointpawan.proceed();
}
@Around(value ="execution(* com.huddle.gameservice.controller.GameProfileController.*(..))")
public Object logBeforeController(ProceedingJoinPoint joinPointpawan) throws Throwable {
System.out.println(".............Logging controller request.............");
logger.info("service_identifier={},event={},game_session_id={},company_id={},player_id={},game_id={},action={},reason={},status={},data={}",
(Object) "CDS",
joinPointpawan.getSignature().getName(),
"",
"",
"",
"",
"refusing to start downstream processing",
"gameSessionID already processed",
"failed",
Arrays.stream(joinPointpawan.getArgs()).collect(Collectors.toList()).toString()
);
return joinPointpawan.proceed();
}
}
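The @annotation(LogMyMethod) pointcut in the first advice assumes a marker annotation that isn't shown in these notes; declared the same way as LogMyMethodAnnotation in the next entry, it would look like:
import java.lang.annotation.ElementType;
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.lang.annotation.Target;

// hypothetical marker annotation matched by @Around("@annotation(LogMyMethod)")
@Target(ElementType.METHOD)
@Retention(RetentionPolicy.RUNTIME)
public @interface LogMyMethod {
}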
Tuesday, 14 March 2023
Create own annotation in springboot
https://fullstackdeveloper.guru/2021/06/15/how-to-create-a-custom-annotation-in-spring-boot/
@Target(ElementType.METHOD)
@Retention(RetentionPolicy.RUNTIME)
public @interface LogMyMethodAnnotation {
}
@Aspect
@Component
public class LogMyMethodAnnotationAspect {
@Around("@annotation(LogMyMethodAnnotation)")
public void logMyController(ProceedingJoinPoint joinPoint){
System.out.println("pawankumar--------");
System.out.println("joinPoint"+joinPoint.getArgs().toString());
}
}
@LogMyMethodAnnotation
@RequestMapping(method = RequestMethod.GET, value = "game/get_game_profile_data")
Friday, 10 March 2023
how to install docker in linux
1. Open the terminal on Ubuntu.
2. Remove any Docker files that are running in the system, using the following command:
$ sudo apt-get remove docker docker-engine docker.io
After entering the above command, you will need to enter the password of the root and press enter.
3. Check if the system is up-to-date using the following command:
$ sudo apt-get update
4. Install Docker using the following command:
$ sudo apt install docker.io
You'll then get a prompt asking you to choose between y/n - choose y
5. Install all the dependency packages using the following command:
$ sudo snap install docker
6. Before testing Docker, check the version installed using the following command:
$ docker --version
7. Pull an image from the Docker hub using the following command:
$ sudo docker run hello-world
Here, hello-world is the docker image present on the Docker hub.
8. Check if the docker image has been pulled and is present in your system using the following command:
$ sudo docker images
9. To display all the containers pulled, use the following command:
$ sudo docker ps -a
10. To check for containers in a running state, use the following command:
$ sudo docker ps
Monday, 6 March 2023
Redis Stream
1. Create: XADD mystream * mydata '{"name":"pawan"}'
2. Read: XREAD COUNT 100 STREAMS mystream 0
3. Length: XLEN mystream
4. List data within a range: XRANGE mystream - +
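The same commands from Java, as a rough sketch using Spring Data Redis StreamOperations; the injected RedisTemplate and the key name are assumptions on my part:
import java.util.Collections;
import java.util.List;
import org.springframework.data.domain.Range;
import org.springframework.data.redis.connection.stream.*;
import org.springframework.data.redis.core.RedisTemplate;
import org.springframework.data.redis.core.StreamOperations;

public class StreamExample {
    private final RedisTemplate<String, String> redisTemplate; // assumed to be configured elsewhere

    public StreamExample(RedisTemplate<String, String> redisTemplate) {
        this.redisTemplate = redisTemplate;
    }

    public void demo() {
        StreamOperations<String, Object, Object> ops = redisTemplate.opsForStream();

        // XADD mystream * mydata '{"name":"pawan"}'
        RecordId id = ops.add("mystream", Collections.singletonMap("mydata", "{\"name\":\"pawan\"}"));

        // XLEN mystream
        Long length = ops.size("mystream");

        // XRANGE mystream - +
        List<MapRecord<String, Object, Object>> all = ops.range("mystream", Range.unbounded());

        // XREAD STREAMS mystream 0 (the COUNT option is omitted in this simple overload)
        List<MapRecord<String, Object, Object>> read = ops.read(StreamOffset.fromStart("mystream"));

        System.out.println(id + " " + length + " " + all.size() + " " + read.size());
    }
}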
Sunday, 5 March 2023
Rest client
https://howtodoinjava.com/spring-webflux/webclient-get-post-example/
https://www.baeldung.com/java-reactor-flux-vs-mono
https://www.callicoder.com/spring-5-reactive-webclient-webtestclient-examples/
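A small WebClient GET/POST sketch along the lines of those posts; the base URL, paths, and payload are placeholders, not anything from these notes:
import org.springframework.web.reactive.function.client.WebClient;
import reactor.core.publisher.Mono;

public class RestClientExample {
    private final WebClient webClient = WebClient.create("http://localhost:8080"); // placeholder base URL

    public Mono<String> getExample() {
        // GET /api/items/1 and map the response body to a String
        return webClient.get()
            .uri("/api/items/{id}", 1)
            .retrieve()
            .bodyToMono(String.class);
    }

    public Mono<String> postExample(String jsonBody) {
        // POST a JSON body and read the response as a String
        return webClient.post()
            .uri("/api/items")
            .header("Content-Type", "application/json")
            .bodyValue(jsonBody)
            .retrieve()
            .bodyToMono(String.class);
    }
}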
Friday, 24 February 2023
Factory design pattern
public class CSVFactory {
public static void main(String arg[]) {
// GenerateCSVFile filetype= CSVFactoryClass
CSVFactoryClass factorydata = new CSVFactoryClass();
factorydata.callCSVfactory("detail").generateCSVFile();
CompanyWideReportCSV abc= new CompanyWideReportCSV();
abc= (CompanyWideReportCSV) factorydata.callCSVfactory("company");
abc.generateCSVFile();
}
}
interface GenerateCSVFile {
public String generateCSVFile();
}
class DetailReportCSV implements GenerateCSVFile{
public String generateCSVFile(){
System.out.println("pawan kumar------");
return "pawan";
}
public String generateCSVFilea(){
System.out.println("pawan kumar1111111111----------");
return "pawan";
}
}
class CompanyWideReportCSV implements GenerateCSVFile{
public String generateCSVFile(){
System.out.println("----------kumar----------");
return "kumar";
}
}
class CSVFactoryClass {
// factory method: returns a GenerateCSVFile implementation based on the requested csvType
public GenerateCSVFile callCSVfactory(String csvType) {
if(csvType.equalsIgnoreCase("detail")){
return new DetailReportCSV();
}else if(csvType.equalsIgnoreCase("company")){
return new CompanyWideReportCSV();
} else {
return new DetailReportCSV();
}
}
}
Friday, 17 February 2023
Stream practical example
public static void main (String arg[]) {
// Stream.of("a2", "a1", "b1", "b3", "c2")
// .filter(s -> {
// System.out.println("filter: " + s);
// //return true;
// if(s.startsWith("a")){
// return true;
// }
// return false;
// })
// .forEach(s -> System.out.println("forEach: " + s));
// SimpleDateFormat formatter = new SimpleDateFormat("dd/MM/yyyy HH:mm:ss");
// Date date = new Date();
// long startTime = Instant.now().toEpochMilli();
// log.info("shop-game-usage-report_scheduler Start time" + startTime);
// Stream.of("ananas", "oranges", "apple", "pear", "banana")
// .map(String::toUpperCase) // 1. Process
// .sorted() // 2. Sort
// .filter(s -> s.startsWith("A")) // 3. Reject
// .forEach(System.out::println);
// long totalTimeTaken = Instant.now().toEpochMilli() - startTime;
// log.info("shop-game-usage-report_scheduler time taken" + totalTimeTaken);
// System.out.println("-----------------------------------------------");
// log.info("shop-game-usage-report_scheduler Start time" + startTime);
// Stream.of("ananas", "oranges", "apple", "pear", "banana").filter(s->s.startsWith("a")).map(s->s.toUpperCase()).sorted((l,r)->l.compareTo(r)).forEach(s-> System.out.println(s));
// log.info("shop-game-usage-report_scheduler time taken" + totalTimeTaken);
// Stream.of(1,2,3,4,5,6,7,8,9,10,11,12,13,14,15,16).filter(number->number%2==0).forEach(evenNumber-> System.out.println(evenNumber));
List<Integer> number= Arrays.asList(1,2,3,4,5,6,7,8,9,10,1,2,3,4,5,6,7,7,8,9);
//Set<Integer> duplicate= new HashSet<>();
//Set<Integer>ghy=number.stream().filter(n->!duplicate.add(n)).collect(Collectors.toSet());
//System.out.println(ghy);
//System.out.println(number.stream().findFirst().get());
//System.out.println(number.stream().findAny().get());
//System.out.println(number.stream().count());
// List<Integer> myList = Arrays.asList(10,15,8,49,25,98,98,32,15);
// Set<Integer> set = new HashSet();
// myList.stream()
// .filter(n -> !set.add(n))
// .forEach(System.out::println);
// System.out.println(number.stream().max(Integer::compare).get());
// System.out.println(number.stream().min(Integer::compare).get());
// Map<String,String> charkey= new HashMap();
// String name="pawankumar";
// name.chars().mapToObj(c->(char)c).forEach(c->{
// charkey.put(c.toString(),c.toString());
// });
// String input = "Java Hungry jBlog Alive is Awesome";
//
// Character result = input.chars() // Stream of String
// .mapToObj(s -> Character.toLowerCase(Character.valueOf((char) s))) // First convert to Character object and then to lowercase
// .collect(Collectors.groupingBy(Function.identity(), LinkedHashMap::new, Collectors.counting())) //Store the chars in map with count
// .entrySet()
// .stream()
// .filter(entry -> entry.getValue() == 1L)
// .map(entry -> entry.getKey())
// .findFirst()
// .get();
// System.out.println(result);
// List<Integer> myList = Arrays.asList(10,15,8,49,25,98,32);
// myList.stream()
// .map(s -> s + "") // Convert integer to String
// .forEach(System.out::println);
// List<Integer> myList = Arrays.asList(10,15,8,49,25,98,98,32,15);
// myList.stream().collect(Collectors.toSet()).stream().sorted(Collectors.reverseOrder()).forEach(s-> System.out.println(s));
// myList.stream()
// .collect(Collectors.toSet()).stream().sorted()
// .forEach(System.out::println);
class Student {
// Instance Variables
// Properties of a student
String rollNo;
String name;
// Constructor
Student(String rollNo, String name) {
this.rollNo = rollNo;
this.name = name;
}
// To test the equality of two Student objects
@Override
public boolean equals(Object o) {
if (this == o) return true;
if (o == null || getClass() != o.getClass()) return false;
Student student = (Student) o;
return rollNo.equals(student.rollNo);
}
@Override
public int hashCode() {
return Objects.hash(rollNo);
}
}
// Create three Student objects
Student s1 = new Student("pawan", "Ram");
Student s2 = new Student("kumar", "Shyam");
Student s3 = new Student("pawan", "pawanrt");
System.out.println("-----"+s1.hashCode());
System.out.println("-----"+s2.hashCode());
System.out.println("-----"+s3.hashCode());
// Create a HashSet to store Student instances
HashSet<Student> studentData = new HashSet<Student>();
// Adding Student objects
studentData.add(s1);
// s2 WILL be inserted: its rollNo ("kumar") differs from s1's,
// so s1.equals(s2) == false and the hash codes differ
studentData.add(s2);
// s3 will NOT be inserted: it has the same rollNo ("pawan") as s1,
// so s1.equals(s3) == true and the hash codes match
studentData.add(s3);
// Print the elements of studentData HashSet
for (Student s : studentData) {
System.out.println(s.rollNo + " " + s.name);
}
}