
Posts

Spring JDBC performing batch update example

There are many different variants of the #batchUpdate methods available in JdbcTemplate. We will specifically look at those that use BatchPreparedStatementSetter and ParameterizedPreparedStatementSetter.

What is BatchPreparedStatementSetter? It is an interface used by JdbcTemplate to execute batch updates. It has a method to determine the batch size and a method to set parameters on the PreparedStatement. Using this, JdbcTemplate will execute a single batch whose size is returned by the implementation of this interface.

How to use BatchPreparedStatementSetter? Let's create a ProductBatchPreparedStatementSetter which can set parameters on the statement.

```java
public class ProductBatchPreparedStatementSetter implements BatchPreparedStatementSetter {

    private final List products;

    public ProductBatchPreparedStatementSetter(List products) {
        Objects.requireNonNull(products);
        // Ideally you should do a defensive copy of this list.
        // this.products = ne
```
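The excerpt above is cut off, but a complete usage with JdbcTemplate#batchUpdate might look like the following sketch. The Product getters and the product table columns are assumptions based on the post's other examples, and a jdbcTemplate wired to a real DataSource is required.

```java
// Sketch: batch-inserting products with a BatchPreparedStatementSetter.
// Assumes a configured JdbcTemplate, a product(name, category, description)
// table, and a simple Product POJO with getters (illustrative names).
List<Product> products = List.of(
        new Product("Lenovo Bag", "bag", "Handcrafted bags by Lenovo"));

int[] updateCounts = jdbcTemplate.batchUpdate(
        "insert into product(name, category, description) values(?,?,?)",
        new BatchPreparedStatementSetter() {
            @Override
            public void setValues(PreparedStatement ps, int i) throws SQLException {
                Product p = products.get(i);
                ps.setString(1, p.getName());
                ps.setString(2, p.getCategory());
                ps.setString(3, p.getDescription());
            }

            @Override
            public int getBatchSize() {
                // JdbcTemplate runs exactly one batch of this size
                return products.size();
            }
        });
```

The returned int array holds the update count for each statement in the batch.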

Spring JDBC PreparedStatementSetter example

What is PreparedStatementSetter? It is a callback interface used by JdbcTemplate, after the PreparedStatement has been created, to set values on the statement object.

How to use it? PreparedStatementSetter is also a functional interface, so we will use a lambda expression in this example to demonstrate its usage. We will use it in the #update method of JdbcTemplate.

```java
int updateCount = jdbcTemplate.update(
        "insert into product(name, category, description) values(?,?,?)",
        ps -> {
            ps.setString(1, "Lenovo Bag");
            ps.setString(2, "bag");
            ps.setString(3, "Handcrafted bags by Lenovo");
        });

log.info(() -> String.format("Product inserted: %d", updateCount));
```

You can get the full code of this example from here.
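PreparedStatementSetter is not limited to #update; the #query methods accept one as well. A small sketch, assuming the ProductRowMapper from the post's RowMapper example and a configured jdbcTemplate:

```java
// Sketch: binding a parameter with a PreparedStatementSetter lambda while a
// RowMapper (ProductRowMapper, assumed from the post) maps each row.
List<Product> bags = jdbcTemplate.query(
        "select * from product where category = ?",
        ps -> ps.setString(1, "bag"),
        new ProductRowMapper());
```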

Spring JDBC ResultSetExtractor example

What is ResultSetExtractor? It is an interface used by the #query methods of JdbcTemplate. It is better suited when you want to map an entire ResultSet to a single result object; otherwise, RowMapper is the simpler choice for mapping one row of the ResultSet to one object.

How to use it? Let's first create a ResultSetExtractor which maps all the rows of the ResultSet to a single object. For this we will create a ProductResultSetExtractor which returns a ProductResponse.

```java
public class ProductResultSetExtractor implements ResultSetExtractor {

    private final RowMapper productRowMapper;

    public ProductResultSetExtractor(RowMapper productRowMapper) {
        super();
        this.productRowMapper = productRowMapper;
    }

    @Override
    public ProductResponse extractData(ResultSet rs) throws SQLException {
        final List products = new ArrayList<>();
        int rowNum = 0;
        while (rs.next()) {
            products.add(productRowMapper.mapRow(rs, rowNum));
            rowNum++;
        }
        return ProductResponse.of(prod
```
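The excerpt stops mid-class, but once such an extractor exists, passing it to #query is a one-liner. A sketch, assuming the ProductResultSetExtractor and ProductRowMapper names from the post and a configured jdbcTemplate:

```java
// Sketch: the extractor consumes the whole ResultSet and returns a single
// ProductResponse (an assumed wrapper around the mapped product list).
ProductResponse response = jdbcTemplate.query(
        "select * from product",
        new ProductResultSetExtractor(new ProductRowMapper()));
```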

Spring JDBC RowMapper example

In this post, we will discuss what RowMapper is and how to use it when writing JDBC code with the Spring JDBC module.

What is RowMapper? It is an interface of the Spring JDBC module used by JdbcTemplate to map rows of a java.sql.ResultSet. It is typically used when you query data.

Example usage of RowMapper. Let's first create a RowMapper which can map products.

```java
class ProductRowMapper implements RowMapper {

    @Override
    public Product mapRow(ResultSet rs, int rowNum) throws SQLException {
        Product product = new Product();
        product.setId(rs.getInt("id"));
        product.setName(rs.getString("name"));
        product.setDescription(rs.getString("description"));
        product.setCategory(rs.getString("category"));
        return product;
    }
}
```

Now, we will use this ProductRowMapper in #queryForObject of JdbcTemplate.

```java
Product product = jdbcTemplate.queryForObject("select * from product where id=1", new Prod
```
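The final call above is truncated; it likely completes along these lines. A sketch, assuming the same product table and a configured jdbcTemplate, with a parameterized variant added for comparison:

```java
// Sketch: mapping a single row, and mapping all rows, with ProductRowMapper.
Product product = jdbcTemplate.queryForObject(
        "select * from product where id = ?", new ProductRowMapper(), 1);

List<Product> allProducts = jdbcTemplate.query(
        "select * from product", new ProductRowMapper());
```

Binding the id with a `?` placeholder, rather than concatenating it into the SQL, avoids SQL injection and lets the driver cache the prepared statement.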

Spring basic JdbcTemplate example

What is Spring JdbcTemplate? JdbcTemplate is the core class of Spring JDBC. It hides the low-level, error-prone details of JDBC access: you only pass the SQL statement to execute, its parameters, and the processing logic for the returned data, and the rest (opening the Connection, transaction handling, error handling, and closing the Connection, Statement and ResultSet) is handled for you.

How to create an object of JdbcTemplate?

1. By calling the no-args constructor:

```java
JdbcTemplate jdbcTemplate = new JdbcTemplate();
// You need to set the DataSource at a later point and also call afterPropertiesSet.
jdbcTemplate.setDataSource(ds);
jdbcTemplate.afterPropertiesSet();
```

2. By calling the constructor with a DataSource:

```java
JdbcTemplate jdbcTemplate = new JdbcTemplate(ds);
```

3. By calling the constructor with a DataSource and a lazyInit flag:

```java
JdbcTemplate jdbcTemplate = new JdbcTemplate(ds, lazyInit);
```

Querying with JdbcTemplate. There are ma
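The querying section is cut off, but a few of the most common JdbcTemplate query methods can be sketched as follows. The product table is assumed from the other posts, and `dataSource` must be a configured javax.sql.DataSource:

```java
// Sketch: common query shapes with JdbcTemplate.
JdbcTemplate jdbcTemplate = new JdbcTemplate(dataSource);

// A single scalar value
Integer count = jdbcTemplate.queryForObject(
        "select count(*) from product", Integer.class);

// One row as a column-name -> value Map
Map<String, Object> row = jdbcTemplate.queryForMap(
        "select * from product where id = ?", 1);

// All rows as a List of Maps
List<Map<String, Object>> rows = jdbcTemplate.queryForList(
        "select * from product");
```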

Data Analytics: Watching and Alerting on real-time changing data in Elasticsearch using Kibana and SentiNL

In the previous post, we set up the ELK stack and ran data analytics on application events and logs. In this post, we will discuss how you can watch real-time application events as they are persisted in the Elasticsearch index, and raise alerts if a watcher's condition is breached, using SentiNL (a Kibana plugin).

A few examples of alerting on application events (see previous posts) are:

- The same user logged in from different IP addresses.
- Different users logged in from the same IP address.
- PermissionFailures in the last 15 minutes.
- A particular kind of exception in the last 15 minutes/hour/day.

Watching and alerting on an Elasticsearch index in Kibana. There are many plugins available for watching and alerting on an Elasticsearch index in Kibana, e.g. X-Pack and SentiNL. X-Pack is a paid extension provided by elastic.co which provides security, alerting, monitoring, reporting and graph capabilities. SentiNL is a free extension provided by siren.io which provides alerting and reporting function
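As an illustration of what a watcher evaluates, the "PermissionFailures in the last 15 minutes" case above could be expressed as an Elasticsearch query like the sketch below. The field names (`event_type`, `@timestamp`) are assumptions about the event mapping, not taken from the post; a watcher would fire when the hit count crosses a threshold.

```json
{
  "query": {
    "bool": {
      "must": [
        { "match": { "event_type": "PermissionFailure" } },
        { "range": { "@timestamp": { "gte": "now-15m" } } }
      ]
    }
  }
}
```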

Running data analytics on application events and logs using Elasticsearch, Logstash and Kibana

In this post, we will learn how to use Elasticsearch, Logstash and Kibana for running analytics on application events and logs. Firstly, I will install all these applications on my local machine.

Installations. You can read my previous posts on how to install Elasticsearch, Logstash, Kibana and Filebeat on your local machine.

Basic configuration. I hope by now you have installed Elasticsearch, Logstash, Kibana and Filebeat on your system. Now, let's do a few basic configurations required to run analytics on application events and logs.

Elasticsearch. Open the elasticsearch.yml file in the [ELASTICSEARCH_INSTALLATION_DIR]/config folder and add these properties to it:

```yaml
cluster.name: gauravbytes-event-analyzer
node.name: node-1
```

The cluster name is used by an Elasticsearch node to form a cluster. Node names within a cluster need to be unique. We are running only a single instance of Elasticsearch on our local machine, but in a production-grade setup there will be master nodes, data nodes a
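The post goes on to configure the rest of the pipeline; the Logstash side of such a setup typically looks like the following minimal sketch (a logstash.conf fragment). The index name is an illustrative assumption; 5044 and 9200 are the default Beats and Elasticsearch ports.

```
# Sketch: minimal Logstash pipeline receiving events from Filebeat
# and indexing them into the local Elasticsearch node.
input {
  beats {
    port => 5044
  }
}
output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "app-events-%{+YYYY.MM.dd}"
  }
}
```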