
Posts

Showing posts from January, 2017

Apache Avro - RPC framework

This post continues my previous posts on Apache Avro - Introduction , Apache Avro - Generating classes from Schema and Apache Avro - Serialization . In this post, we share insights on using Apache Avro as an RPC framework. To use Avro for RPC, we first need to define a protocol. Before going deeper into this topic, let's discuss what a protocol is. Avro protocols describe RPC interfaces. They are defined in JSON, similar to a Schema . A protocol has the following attributes. protocol : a string defining the name of the protocol. namespace : an optional string that qualifies the name. types : an optional list of definitions of named types (like records, enums, fixed types and errors). messages : an optional JSON object whose keys are the method names of the protocol and whose values are objects whose attributes are described below. No two messages may have the same name. Further, a message has the following attributes. request : a list of named, typed parameter schemas. respon
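A minimal protocol declaration tying these attributes together might look like the following (a sketch; the service and message names here are illustrative, not from the original post):

```json
{
  "protocol": "EmployeeService",
  "namespace": "com.gauravbytes.avro",
  "types": [
    {
      "type": "record",
      "name": "Employee",
      "fields": [
        {"name": "firstName", "type": "string"},
        {"name": "lastName", "type": "string"}
      ]
    }
  ],
  "messages": {
    "findEmployee": {
      "request": [{"name": "firstName", "type": "string"}],
      "response": "Employee"
    }
  }
}
```

By convention such a declaration lives in a `.avpr` file; the `types` section defines the named types that `request` and `response` refer to.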

Java 8 - Lambda expressions

In this post, we will cover the following topics. What are Lambda expressions? Syntax of a Lambda expression. How to define a no-parameter Lambda expression? How to define single-/multi-parameter Lambda expressions? How to return a value from a Lambda expression. Accessing local variables in a Lambda expression. Target typing of Lambda expressions. What are Lambda expressions? Lambda expressions are Java's first step towards functional programming. Lambda expressions enable us to treat functionality as a method argument and to express instances of single-method classes more compactly. Syntax of a Lambda expression: a Lambda has three parts: a comma-separated list of formal parameters enclosed in parentheses; the arrow token -> ; and the body of the expression (which may or may not return a value). (param) -> { System.out.println(param); } Lambda expressions can only be used where their target type is a functional interface . How to define a no-parameter Lambda expression? If the la
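The three syntactic variants mentioned above can be sketched in a few lines (the interface choices are mine, picked from java.util.function to keep the example self-contained):

```java
import java.util.function.BinaryOperator;
import java.util.function.Supplier;
import java.util.function.UnaryOperator;

public class LambdaSyntaxDemo {
    public static void main(String[] args) {
        // no-parameter lambda: empty parentheses are required
        Supplier<String> greeter = () -> "hello";

        // single-parameter lambda: parentheses around the parameter are optional
        UnaryOperator<Integer> square = x -> x * x;

        // multi-parameter lambda with a block body and an explicit return
        BinaryOperator<Integer> sum = (a, b) -> { return a + b; };

        System.out.println(greeter.get());
        System.out.println(square.apply(5));
        System.out.println(sum.apply(2, 3));
    }
}
```

Note that each lambda is assigned to a functional-interface type; that target type is what tells the compiler which single abstract method the lambda implements.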

Java 8 - What are functional interfaces in Java?

Java 8 reincarnated SAM (single abstract method) interfaces and termed them functional interfaces. A functional interface has a single abstract method and is eligible to be represented with a Lambda expression . The @FunctionalInterface annotation was introduced in Java 8 to mark an interface as functional. It ensures at compile time that the interface has only a single abstract method; otherwise, compilation fails. Let's define a functional interface. @FunctionalInterface public interface Spec<T> { boolean isSatisfiedBy(T t); } Functional interfaces can have default and static methods and still remain functional interfaces. @FunctionalInterface public interface Spec<T> { boolean isSatisfiedBy(T t); default Spec<T> not() { return (t) -> !isSatisfiedBy(t); } default Spec<T> and(Spec<T> other) { return (t) -> isSatisfiedBy(t) && other.isSatisfiedBy(t); } default Spec<T> or(Spec<T> other) { return (t) -> isSa
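A small usage sketch of the Spec interface above, with lambdas as the implementations (the or() default method is completed here under the obvious assumption that it mirrors and()):

```java
public class SpecDemo {
    @FunctionalInterface
    interface Spec<T> {
        boolean isSatisfiedBy(T t);

        // default methods do not count as abstract, so Spec stays functional
        default Spec<T> not() { return t -> !isSatisfiedBy(t); }
        default Spec<T> and(Spec<T> other) { return t -> isSatisfiedBy(t) && other.isSatisfiedBy(t); }
        default Spec<T> or(Spec<T> other) { return t -> isSatisfiedBy(t) || other.isSatisfiedBy(t); }
    }

    public static void main(String[] args) {
        Spec<Integer> positive = n -> n > 0;
        Spec<Integer> even = n -> n % 2 == 0;

        System.out.println(positive.and(even).isSatisfiedBy(4)); // true
        System.out.println(positive.and(even).isSatisfiedBy(3)); // false
        System.out.println(positive.not().isSatisfiedBy(-1));    // true
    }
}
```

Because each default method returns another lambda, specifications compose freely: positive.and(even).not() is itself a Spec.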

Comparing Java Default Serialization with Apache Avro serialization

I did a comparison of Java default serialization and Apache Avro serialization of the same data, and the results were striking. You can read my older posts on the Java serialization process and Apache Avro Serialization . Apache Avro consumed 15-20 times less memory to store the serialized data. I created a class with three fields (two String and one enum) and serialized it with both Avro and Java. The memory used by Avro was 14 bytes, while Java used 231 bytes (the length of the resulting byte[] ). Why does Avro generate fewer bytes? Java Serialization: the default serialization mechanism for an object writes the class of the object, the class signature, and the values of all non-transient and non-static fields. References to other objects (except in transient or static fields) cause those objects to be written as well. Multiple references to a single object are encoded using a reference-sharing mechanism so that graphs of objects can be restored to the same shape as when the original was written. Apac
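The Java-side byte count is easy to reproduce with a sketch like the one below (the class is my stand-in for the three-field class described in the post; the exact size will vary with class and field names, so don't expect the 231-byte figure verbatim):

```java
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.ObjectOutputStream;
import java.io.Serializable;

public class SerializedSizeDemo {
    enum Sex { MALE, FEMALE }

    // a simple three-field class mirroring the one described in the post
    static class Employee implements Serializable {
        private static final long serialVersionUID = 1L;
        String firstName;
        String lastName;
        Sex sex;

        Employee(String firstName, String lastName, Sex sex) {
            this.firstName = firstName;
            this.lastName = lastName;
            this.sex = sex;
        }
    }

    static byte[] serialize(Employee e) throws IOException {
        try (ByteArrayOutputStream baos = new ByteArrayOutputStream();
             ObjectOutputStream oos = new ObjectOutputStream(baos)) {
            oos.writeObject(e);
            oos.flush();
            return baos.toByteArray();
        }
    }

    public static void main(String[] args) throws IOException {
        byte[] data = serialize(new Employee("Gaurav", "Mazra", Sex.MALE));
        // the stream carries class metadata, hence the large size
        System.out.println("Java serialized size: " + data.length + " bytes");
    }
}
```

Avro, by contrast, writes only the field values in schema order, because the reader is expected to already have the schema; that is where the 15-20x difference comes from.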

Apache Avro - Serialization and Deserialization

This post is in continuation with my earlier posts on Apache Avro - Introduction and Apache Avro - Generating classes from Schema . In this post, we will discuss reading (deserialization) and writing (serialization) of Avro-generated classes. "Apache Avro™ is a data serialization system." We use DatumReader<T> and DatumWriter<T> for deserialization and serialization of data, respectively. Apache Avro formats: Apache Avro supports two formats, JSON and Binary . Let's move to an example using the JSON format. Employee employee = Employee.newBuilder().setFirstName("Gaurav").setLastName("Mazra").setSex(SEX.MALE).build(); DatumWriter<Employee> employeeWriter = new SpecificDatumWriter<>(Employee.class); byte[] data; try (ByteArrayOutputStream baos = new ByteArrayOutputStream()) { Encoder jsonEncoder = EncoderFactory.get().jsonEncoder(Employee.getClassSchema(), baos
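The reading side mirrors the writing side shown above. The sketch below assumes the Avro library and the generated Employee class are on the classpath, and that `data` holds the bytes produced by the jsonEncoder:

```java
// Sketch only: requires Apache Avro and the generated Employee class.
DatumReader<Employee> employeeReader = new SpecificDatumReader<>(Employee.class);
Decoder jsonDecoder = DecoderFactory.get()
        .jsonDecoder(Employee.getClassSchema(), new ByteArrayInputStream(data));
Employee decoded = employeeReader.read(null, jsonDecoder);

// For the compact binary format, swap in the binary encoder/decoder pair:
// Encoder binaryEncoder = EncoderFactory.get().binaryEncoder(baos, null);
// Decoder binaryDecoder = DecoderFactory.get().binaryDecoder(data, null);
```

The `null` first argument to read() tells Avro to allocate a fresh object rather than reuse an existing one.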

Java 8 - Iterating collections and maps with #forEach, Consumer and BiConsumer

Iterating the Collections API: Java 8 introduced a new way of iterating the Collections API. It was retrofitted to support the #forEach method, which accepts a Consumer in the case of Collection and a BiConsumer in the case of Map . Consumer: Java 8 introduced the new package java.util.function , which includes the Consumer interface. It represents an operation that accepts one argument and returns no result. Before Java 8, you would have used a for loop, an enhanced for loop and/or an Iterator to iterate over a Collection . List<Employee> employees = EmployeeStub.getEmployees(); Iterator<Employee> employeeItr = employees.iterator(); Employee employee; while (employeeItr.hasNext()) { employee = employeeItr.next(); System.out.println(employee); } In Java 8, you can write a Consumer and pass the reference to the #forEach method to perform an operation on every item of the Collection . // fetch employees from Stub List<Employee> employees = EmployeeStub.getEmplo
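A self-contained sketch of both forEach variants (plain Strings stand in for the post's EmployeeStub, which is not shown in the excerpt):

```java
import java.util.Arrays;
import java.util.HashMap;
import java.util.List;
import java.util.Map;
import java.util.function.BiConsumer;
import java.util.function.Consumer;

public class ForEachDemo {
    public static void main(String[] args) {
        List<String> names = Arrays.asList("Gaurav", "John");

        // Collection#forEach calls the Consumer once per element
        Consumer<String> printer = name -> System.out.println(name);
        names.forEach(printer);

        Map<String, Integer> ages = new HashMap<>();
        ages.put("Gaurav", 30);

        // Map#forEach calls the BiConsumer once per (key, value) pair
        BiConsumer<String, Integer> entryPrinter =
                (name, age) -> System.out.println(name + " -> " + age);
        ages.forEach(entryPrinter);
    }
}
```

Method references work anywhere a Consumer is expected, so names.forEach(System.out::println) is an even shorter equivalent of the first loop.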

Java 8 - Filtering with Predicates

In this post, we will cover the following items. What is java.util.function.Predicate? How to filter data with Predicates. Predicate chaining. Java 8 introduced many new features like the Streams API, Lambdas , functional interfaces , default methods in interfaces and many more. Today, we will discuss the Predicate interface added in the java.util.function package and its use for filtering in-memory data. What is java.util.function.Predicate? A Predicate is like a condition checker: it accepts one argument of type T and returns a boolean value. It's a functional interface with the functional method test(Object) , where the argument is typed. @FunctionalInterface interface Predicate<T> { boolean test(T t); } How can we filter data with Predicates? Consider we have a Collection of employees and we want to filter them by age, sex, salary and/or any other combination. We can do that with a Predicate . Let's understand this with one short
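Filtering and chaining can be shown compactly with the Streams API (plain Integer salaries stand in for the employee objects, which are not shown in the excerpt):

```java
import java.util.Arrays;
import java.util.List;
import java.util.function.Predicate;
import java.util.stream.Collectors;

public class PredicateDemo {
    public static void main(String[] args) {
        List<Integer> salaries = Arrays.asList(30_000, 55_000, 80_000);

        Predicate<Integer> highPaid = s -> s > 50_000;
        Predicate<Integer> under75k = s -> s < 75_000;

        // filtering with a single predicate
        List<Integer> high = salaries.stream()
                .filter(highPaid)
                .collect(Collectors.toList());

        // predicate chaining with the default methods and/or/negate
        List<Integer> mid = salaries.stream()
                .filter(highPaid.and(under75k))
                .collect(Collectors.toList());

        System.out.println(high); // [55000, 80000]
        System.out.println(mid);  // [55000]
    }
}
```

The and(), or() and negate() default methods on Predicate return new Predicates, so arbitrary combinations can be built without writing any new classes.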

java.util.function package

Java 8 introduced the new java.util.function package containing many functional interfaces. They can be divided into four categories: Predicate , Consumer , Function and Supplier . Predicate: it represents a boolean-valued function of one argument. It is a functional interface with the method test(T) , where T is typed. You can see its usage here . Consumer: it represents an operation that accepts argument(s) and returns void, operating via side effects. Java 8 introduced many variants of Consumer: Consumer , BiConsumer , DoubleConsumer , ObjIntConsumer . You can see the usage of Consumer here .
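One lambda per category makes the four shapes easy to compare side by side (a minimal sketch; the example values are mine):

```java
import java.util.function.Consumer;
import java.util.function.Function;
import java.util.function.Predicate;
import java.util.function.Supplier;

public class FunctionPackageDemo {
    public static void main(String[] args) {
        Predicate<String> nonEmpty = s -> !s.isEmpty();      // T -> boolean
        Consumer<String> print = s -> System.out.println(s); // T -> void (side effect)
        Function<String, Integer> length = s -> s.length();  // T -> R
        Supplier<String> hello = () -> "hello";              // () -> T

        System.out.println(nonEmpty.test("abc")); // true
        System.out.println(length.apply("abc"));  // 3
        print.accept(hello.get());                // hello
    }
}
```

The specialized variants (BiConsumer, DoubleConsumer, ObjIntConsumer and so on) follow the same four shapes, adjusted for arity or for primitive arguments.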

Spring 4.3 - New @RequestMapping annotation

Spring 4.3 - @GetMapping, @PostMapping, @PutMapping and @DeleteMapping There are some new improvements in Spring Boot 1.4 and Spring 4.3 which lead to better readability and better use of annotations, particularly for HTTP request methods. We usually map the GET, PUT, POST and DELETE HTTP methods in a REST controller in the following way. @RestController @RequestMapping("/api/employees") public class EmployeeController { @RequestMapping public ResponseEntity<List<Employee>> getAll() { return ResponseEntity.ok(Collections.emptyList()); } @RequestMapping("/{employeeId}") public ResponseEntity<Employee> findById(@PathVariable Long employeeId) { return ResponseEntity.ok(EmployeeStub.findById(employeeId)); } @RequestMapping(method = RequestMethod.POST) public ResponseEntity<Employee> addEmployee(@RequestBody Employee employee) { return Respons
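With the Spring 4.3 composed annotations, the same controller reads more directly (a sketch; it assumes the same EmployeeStub helper as above, requires Spring Web on the classpath, and the addEmployee body is illustrative since the original is cut off):

```java
@RestController
@RequestMapping("/api/employees")
public class EmployeeController {
    @GetMapping
    public ResponseEntity<List<Employee>> getAll() {
        return ResponseEntity.ok(Collections.emptyList());
    }

    @GetMapping("/{employeeId}")
    public ResponseEntity<Employee> findById(@PathVariable Long employeeId) {
        return ResponseEntity.ok(EmployeeStub.findById(employeeId));
    }

    // @PostMapping is shorthand for @RequestMapping(method = RequestMethod.POST)
    @PostMapping
    public ResponseEntity<Employee> addEmployee(@RequestBody Employee employee) {
        return ResponseEntity.ok(employee);
    }
}
```

Each composed annotation is itself meta-annotated with @RequestMapping for the corresponding method, so behavior is identical; only the declaration is shorter.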

Single Responsibility Principle - Improved Example

This post is in continuation to my older post on the Single Responsibility Principle . At that time, I provided a solution where we refactored FileParser , moved the validation logic to FileValidationUtils and composed a Parser interface with various implementations, viz. CSVFileParser , XMLFileParser and JsonFileParser (a sort of Strategy design pattern ). You can get hold of the old code on Github . That was roughly 2 years ago :). I thought of improving this code further. We can completely remove FileValidationUtils by making the following change in the Parser interface. public interface Parser { void parse(File file); FileType getFileType(); default boolean canParse(File file) { return Objects.nonNull(file) && file.getName().endsWith(getFileType().getExtension()); } } public class FileParser { private Parser parser; public Fi
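The default canParse method can be exercised without any of the surrounding project (the FileType enum below is a hypothetical stand-in for the one in the post, which is not shown in the excerpt):

```java
import java.io.File;
import java.util.Objects;

public class CanParseDemo {
    // hypothetical stand-in for the post's FileType enum
    enum FileType {
        CSV(".csv"), XML(".xml");
        private final String extension;
        FileType(String extension) { this.extension = extension; }
        String getExtension() { return extension; }
    }

    interface Parser {
        void parse(File file);
        FileType getFileType();
        // the default method that makes FileValidationUtils unnecessary
        default boolean canParse(File file) {
            return Objects.nonNull(file)
                    && file.getName().endsWith(getFileType().getExtension());
        }
    }

    public static void main(String[] args) {
        Parser csvParser = new Parser() {
            public void parse(File file) { /* parsing elided */ }
            public FileType getFileType() { return FileType.CSV; }
        };

        System.out.println(csvParser.canParse(new File("employees.csv"))); // true
        System.out.println(csvParser.canParse(new File("employees.xml"))); // false
        System.out.println(csvParser.canParse(null));                      // false
    }
}
```

Each concrete parser now carries its own acceptance check via its FileType, so the responsibility for validation lives next to the code that depends on it.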

Apache Avro - Generating classes from Schema

This post is in continuation to my previous post on Apache Avro - Introduction . In this post, we will discuss generating classes from a schema. How to create an Apache Avro schema? There are two ways to generate Avro classes from a schema: programmatically generating the schema, or using the Avro Maven plugin. Consider we have the following schema in " src/main/avro " { "type": "record", "name": "Employee", "namespace": "com.gauravbytes.avro", "doc": "Schema to hold employee object", "fields": [{ "name": "firstName", "type": "string" }, { "name": "lastName", "type": "string" }, { "name": "sex", "type": { "name": "SEX", "type": "enum", "symbols"
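For the Maven route, a typical avro-maven-plugin configuration looks roughly like this (the version number is illustrative for the Avro 1.8.x line; check Maven Central for the current release):

```xml
<plugin>
  <groupId>org.apache.avro</groupId>
  <artifactId>avro-maven-plugin</artifactId>
  <version>1.8.1</version>
  <executions>
    <execution>
      <phase>generate-sources</phase>
      <goals>
        <goal>schema</goal>
      </goals>
      <configuration>
        <sourceDirectory>${project.basedir}/src/main/avro/</sourceDirectory>
        <outputDirectory>${project.basedir}/src/main/java/</outputDirectory>
      </configuration>
    </execution>
  </executions>
</plugin>
```

With this in the build, every `.avsc` file under src/main/avro is turned into a Java class during the generate-sources phase.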

Apache Avro - Introduction

In this post, we will discuss the following items. What is Apache Avro? What is an Avro schema and how do you define it? Serialization in Apache Avro. What is Apache Avro ? " Apache Avro is a data serialization library. " That's it, huh? This is what you will see when you open their official page. Apache Avro is: a schema-based data serialization library; an RPC framework (support); rich data structures (primitive types include null, string, number and boolean; complex types include record, array, map etc.); a compact, fast, binary data format. What is an Avro schema and how do you define it? Apache Avro's serialization concept is based on schemas. When you write data, the schema is written along with it. When you read data, the schema will always be present. The schema along with the data makes it fully self-describing. A schema is a representation of an Avro datum (record). Schemas are of two types: Primitive and Complex . Primitive types: these are the basic types supported by Avro. They include null, int, long, byte
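A small schema mixing primitive and complex types might look like the following (a sketch modeled on the Employee record used in the follow-up posts; the nullable field is my addition to show a union type):

```json
{
  "type": "record",
  "name": "Employee",
  "namespace": "com.gauravbytes.avro",
  "fields": [
    {"name": "firstName", "type": "string"},
    {"name": "lastName", "type": "string"},
    {"name": "age", "type": "int"},
    {"name": "nickname", "type": ["null", "string"], "default": null}
  ]
}
```

The record itself is a complex type, each field uses a primitive type, and the `["null", "string"]` union is how Avro expresses an optional value.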