Friday, November 30, 2018

Elasticsearch - How to retrieve Spring Boot Logstash logs using the Elasticsearch API

Introduction

Elasticsearch is a highly scalable open-source full-text search and analytics engine. It allows you to store, search, and analyze big volumes of data quickly and in near real time. It is generally used as the underlying engine/technology that powers applications that have complex search features and requirements.


Basic Concepts

  • Cluster


A cluster is a collection of one or more nodes (servers) that together hold your entire data and provide federated indexing and search capabilities across all nodes. A cluster is identified by a unique name which by default is "elasticsearch". This name is important because a node can only be part of a cluster if the node is set up to join the cluster by its name.

  • Node


A node is a single server that is part of your cluster, stores your data, and participates in the cluster’s indexing and search capabilities. Just like a cluster, a node is identified by a name which by default is a random Universally Unique IDentifier (UUID) that is assigned to the node at startup. You can define any node name you want if you do not want the default. This name is important for administration purposes where you want to identify which servers in your network correspond to which nodes in your Elasticsearch cluster.
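Both names can be set in elasticsearch.yml instead of relying on the defaults. A minimal fragment (the file path and the node name below are illustrative, not taken from this post):

```
# /etc/elasticsearch/elasticsearch.yml (path varies by installation)
cluster.name: elasticsearch   # a node joins the cluster with this name
node.name: node-1             # optional; defaults to a random UUID
```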

  • Index


An index is a collection of documents that have somewhat similar characteristics. For example, you can have an index for customer data, another index for a product catalog, and yet another index for order data. An index is identified by a name (that must be all lowercase) and this name is used to refer to the index when performing indexing, search, update, and delete operations against the documents in it.

Exploring Your Cluster

  • Cluster health
 curl -X GET "xx.xx.xx.xx:9200/_cat/health?v"
 curl -X GET "xx.xx.xx.xx:9200/_cat/nodes?v"
 curl -X GET "xx.xx.xx.xx:9200/_cat/indices?v"
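The same _cat endpoints can be called from any HTTP client. The small sketch below only builds the request URLs used in the curl examples (host and port are placeholders; actually sending the request of course requires a reachable cluster):

```java
// Builds _cat endpoint URLs like the curl examples above.
public class CatUrls {
    // host/port are placeholders for your cluster's REST address
    public static String catUrl(String host, int port, String endpoint) {
        return "http://" + host + ":" + port + "/_cat/" + endpoint + "?v";
    }

    public static void main(String[] args) {
        System.out.println(catUrl("localhost", 9200, "health"));
        System.out.println(catUrl("localhost", 9200, "nodes"));
        System.out.println(catUrl("localhost", 9200, "indices"));
    }
}
```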


  • Search
Generic search
curl -X GET "xx.xx.x1:9200/logstash-2018.11.26/_search" -H 'Content-Type: application/json' -d'
{
  "query": { "match_all": {} },
  "sort": [
    { "@timestamp": "desc" }
  ]
}
'

Specific search
curl -X GET "xx.xx:9200/logstash-2018.11.26/_search" -H 'Content-Type: application/json' -d'
{
  "query": {"match": { "provisioning_id": 122167316 } },
  "sort": [
    { "@timestamp": "desc" }
  ]
}
'
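The request bodies above are plain JSON strings, so they are easy to assemble programmatically. A minimal sketch, using the field and value from the example (the value is passed as a string here; a match query accepts either form):

```java
// Builds a match query body like the "specific search" example above.
public class QueryBodies {
    public static String matchQuery(String field, String value) {
        return "{\n"
             + "  \"query\": { \"match\": { \"" + field + "\": \"" + value + "\" } },\n"
             + "  \"sort\": [ { \"@timestamp\": \"desc\" } ]\n"
             + "}";
    }

    public static void main(String[] args) {
        System.out.println(matchQuery("provisioning_id", "122167316"));
    }
}
```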

Sample response (truncated):
{
  "took" : 3,
  "timed_out" : false,
  "_shards" : {
    "total" : 5,
    "successful" : 5,
    "failed" : 0
  },
  "hits" : {
    "total" : 62408,
    "max_score" : 1.0,
    "hits" : [
      {
        "_index" : "logstash-2018.11.26",
        "_type" : "logs",
        "_id" : "AWdNUNFKTDBldbR7TNTf",
        "_score" : 1.0,
        "_source" : {
          "hostname" : "3d63948ce82e",
          "host_ip" : "xx.xx.xx.xx",
          "@timestamp" : "2018-11-26T00:00:00.001Z",
          "application" : "stats",
          "level" : "INFO",
          "port" : 47166,
          "thread_name" : "ThreadPoolTaskScheduler1",
          "@version" : 1,
          "host" : "xx.xx.xx.xx",
          "logger_name" : "ci.stats.cron.Scheduler",
          "message" : "Populating dashboard caches from 2018-11-24T00:00Z[UTC] to 2018-11-25T23:59:59.999Z[UTC] by HOUR",
          "user" : "root"
        }
      },

As for the response, we see the following parts:
  • took – time in milliseconds for Elasticsearch to execute the search
  • timed_out – tells us if the search timed out or not
  • _shards – tells us how many shards were searched, as well as a count of the successful/failed searched shards
  • hits – search results
  • hits.total – total number of documents matching our search criteria
  • hits.hits – actual array of search results (defaults to first 10 documents)
  • hits.sort – sort key for results (missing if sorting by score)
  • hits._score and max_score – ignore these fields for now


  • REST API

To query the same logstash indices from Spring Boot with Spring Data Elasticsearch, first configure a TransportClient:

package ci.account.logging;

import org.elasticsearch.client.Client;
import org.elasticsearch.client.transport.TransportClient;
import org.elasticsearch.common.settings.Settings;
import org.elasticsearch.common.transport.InetSocketTransportAddress;
import org.elasticsearch.transport.client.PreBuiltTransportClient;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.ComponentScan;
import org.springframework.data.elasticsearch.core.ElasticsearchOperations;
import org.springframework.data.elasticsearch.core.ElasticsearchTemplate;
import org.springframework.data.elasticsearch.repository.config.EnableElasticsearchRepositories;
import org.springframework.context.annotation.Configuration;

import java.net.InetAddress;
import java.net.UnknownHostException;


@Configuration
@EnableElasticsearchRepositories(basePackages = "ci.account")
public class ElasticSearchConfig {

  @Value("${elasticsearch.home:/var/lib/elasticsearch}")
  private String elasticsearchHome;

  @Value("${elasticsearch.cluster.name:elasticsearch}")
  private String clusterName;

  @Bean
  public Client client() throws UnknownHostException {
    Settings elasticsearchSettings = Settings.builder()
        .put("cluster.name", clusterName).build();
    TransportClient client = new PreBuiltTransportClient(elasticsearchSettings);
    // Transport protocol uses port 9300, not the 9200 REST port
    client.addTransportAddress(new InetSocketTransportAddress(InetAddress.getByName("myhost"), 9300));

    return client;
  }

  @Bean
  public ElasticsearchOperations elasticsearchTemplate() throws UnknownHostException {
    return new ElasticsearchTemplate(client());
  }
}


package ci.account.logging;

import com.fasterxml.jackson.annotation.JsonFormat;
import com.fasterxml.jackson.annotation.JsonInclude;
import com.fasterxml.jackson.annotation.JsonProperty;
import com.fasterxml.jackson.databind.annotation.JsonPOJOBuilder;
import org.springframework.data.elasticsearch.annotations.DateFormat;
import org.springframework.data.elasticsearch.annotations.Document;

import org.springframework.data.annotation.Id;
import org.springframework.data.elasticsearch.annotations.Field;
import org.springframework.data.elasticsearch.annotations.FieldType;

import java.util.Date;

@lombok.Value
@lombok.Builder(toBuilder = true, builderClassName = "Builder")
@lombok.NoArgsConstructor(force = true, access = lombok.AccessLevel.PRIVATE)
@lombok.AllArgsConstructor(access = lombok.AccessLevel.PRIVATE)

@JsonInclude(JsonInclude.Include.NON_EMPTY)
@Document(indexName = "logstash*", type = "logs")

public class Log {
  @Id
  String _id;
  String provisioning_id;
  String message;
  String application;
  String level;
  String thread_name;
  String logger_name;
  String _type;
  String _index;

  @JsonFormat(shape = JsonFormat.Shape.STRING, pattern ="yyyy-MM-dd'T'HH:mm:ss.SSSZZ")
  @Field(type = FieldType.Date, format = DateFormat.custom, pattern = "yyyy-MM-dd'T'HH:mm:ss.SSSZZ")

  @JsonProperty(value = "@timestamp")
  Date timestamp;
  String host;

  @JsonPOJOBuilder(withPrefix = "")
  public static class Builder {
  }
}
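Logstash writes @timestamp in ISO-8601 UTC (e.g. the 2018-11-26T00:00:00.001Z value in the sample hit above), which matches the pattern on the Log entity. As a quick sanity check, java.time can parse that format directly (shown here as a standalone sketch alongside the java.util.Date field the entity actually uses):

```java
import java.time.Instant;

// Parses the logstash @timestamp format used in the sample document above.
public class TimestampCheck {
    public static Instant parse(String timestamp) {
        return Instant.parse(timestamp); // ISO-8601 with trailing 'Z' (UTC)
    }

    public static void main(String[] args) {
        Instant t = parse("2018-11-26T00:00:00.001Z");
        System.out.println(t.toEpochMilli());
    }
}
```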


package ci.account.persistence;


import ci.account.logging.Log;
import org.springframework.data.elasticsearch.annotations.Query;
import org.springframework.data.elasticsearch.repository.ElasticsearchRepository;

import javax.inject.Named;
import java.util.List;


@Named
public interface LogRepository extends ElasticsearchRepository<Log, String> {
  @Query("{\"bool\": {\"must\": [{\"match\": {\"provisioning_id\": \"?0\"}}]}}")
  List<Log> findByProvisioning_id(String id);
}

Format the log to include ANSI color codes
package ci.account.service;

import ci.account.logging.Log;
import ci.account.persistence.LogRepository;
import ci.account.model.Service;
import ci.account.model.exceptions.NotFoundException;
import ci.account.model.registry.ServiceEvent;
import ci.account.persistence.registry.ServiceEventDao;
import ci.security.Principal;

import javax.inject.Inject;
import javax.inject.Named;
import java.util.ArrayList;
import java.util.List;
import java.util.Optional;


@Named
public class EventService {

  private final ServiceEventDao eventRepo;
  private final RegistryService registryService;
  private final LogRepository logRepository;
  private String color;
  public static final String ANSI_RESET = "\u001B[0m";
  public static final String ANSI_BLACK = "\u001B[30m";
  public static final String ANSI_RED = "\u001B[31m";
  public static final String ANSI_GREEN = "\u001B[32m";
  public static final String ANSI_YELLOW = "\u001B[33m";
  public static final String ANSI_BLUE = "\u001B[34m";
  public static final String ANSI_PURPLE = "\u001B[35m";
  public static final String ANSI_CYAN = "\u001B[36m";
  public static final String ANSI_WHITE = "\u001B[37m";

  @Inject
  public EventService(ServiceEventDao eventRepo, RegistryService registryService, LogRepository logRepository) {
    this.eventRepo = eventRepo;
    this.registryService = registryService;
    this.logRepository = logRepository;
  }


  public List<ServiceEvent> list(Principal principal, Long customerId, Long teamId, String type, String location, Long serviceId) {
    Service service = registryService.get(principal, customerId, teamId, type, location, serviceId);
    List<ServiceEvent> serviceEvents = eventRepo.findByObjectIdOrderByIdAsc(serviceId);
    List<ServiceEvent> logEvents = new ArrayList<>();

    serviceEvents.forEach(event -> {
      List<Log> logs = logRepository.findByProvisioning_id(event.getId().toString());
      String formattedLogs = formatElasticSearchLog(logs);

      ServiceEvent e = event.copyBuilder().withLog(formattedLogs).build();
      logEvents.add(e);
    });
    return logEvents;
  }

  private String formatElasticSearchLog(List<Log> logs) {
    StringBuilder sb = new StringBuilder();
    logs.forEach(log -> {
      // Note: "color" is a shared instance field, so this method is not thread-safe.
      switch (log.getLevel()) {
        case "FATAL":
        case "ERROR":
          color = ANSI_RED;
          break;
        case "WARN":
          color = ANSI_YELLOW;
          break;
        case "INFO":
          color = ANSI_GREEN;
          break;
        case "DEBUG":
          color = ANSI_CYAN;
          break;
        case "TRACE":
          color = ANSI_BLACK;
          break;
        default:
          color = ANSI_RESET;
      }

      sb.append(log.getTimestamp()).append(' ');
      sb.append(ANSI_PURPLE);
      sb.append(log.getThread_name()).append(' ');
      sb.append(color);
      sb.append(log.getLevel()).append(' ');
      sb.append(log.getLogger_name()).append(" - ");
      sb.append(log.getProvisioning_id()).append(" - ");
      sb.append(log.getMessage()).append(' ').append(ANSI_RESET);
      sb.append(System.lineSeparator());
    });
    return sb.toString();
  }

  public ServiceEvent getEvent(Principal principal, Long customerId, Long teamId, String type, String location, Long serviceId, Long eventId) {
    registryService.get(principal, customerId, teamId, type, location, serviceId);
    Optional<ServiceEvent> optEvent = eventRepo.findById(eventId);
    if(!optEvent.isPresent()) {
      throw new NotFoundException("Event [" + eventId + "] not found");
    }
    return optEvent.get();
  }
}
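The ANSI escape sequences themselves are independent of Elasticsearch and Spring. A minimal standalone sketch of the level-to-color mapping applied in formatElasticSearchLog above (same constants, reduced to a few levels):

```java
// Standalone demo of the ANSI coloring applied in formatElasticSearchLog above.
public class AnsiDemo {
    public static final String ANSI_RESET = "\u001B[0m";
    public static final String ANSI_RED = "\u001B[31m";
    public static final String ANSI_GREEN = "\u001B[32m";
    public static final String ANSI_YELLOW = "\u001B[33m";

    public static String colorize(String level, String message) {
        String color;
        switch (level) {
            case "FATAL":
            case "ERROR":
                color = ANSI_RED;
                break;
            case "WARN":
                color = ANSI_YELLOW;
                break;
            case "INFO":
                color = ANSI_GREEN;
                break;
            default:
                color = ANSI_RESET;
        }
        return color + level + " " + message + ANSI_RESET;
    }

    public static void main(String[] args) {
        // In an ANSI-capable terminal these print in red and green respectively
        System.out.println(colorize("ERROR", "something failed"));
        System.out.println(colorize("INFO", "all good"));
    }
}
```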


Display the color-coded log in React using ansi-to-react

import Ansi from "ansi-to-react";
import * as React from "react";
import CSSModules from "react-css-modules";
import ReactTable from "react-table";

import "react-table/react-table.css";
import {
  Button,
  Confirm,
  Container,
  Dimmer,
  Header,
  Icon,
  Loader,
  Menu,
  Pagination
} from "semantic-ui-react";
import { servicesConstants } from "../../_constants";
import { history } from "../../_helpers";
import "../../_style/custom-semantic.css";
import { TableMenu } from "../TableMenu";
import { WrappedButton } from "../WrappedButton";

const styles = require("./ServiceLogsTable.less");
const options = {
  allowMultiple: true
};

@CSSModules(styles, options)
export class ServiceLogsTable extends React.Component<any, any> {
  public columns = [
    {
      Header: "Started By",
      accessor: "modifier",
      className: "column-center"
    },
    {
      Header: "Status",
      accessor: "status",
      className: "column-center"
    },
    {
      Header: "Started",
      accessor: "created",
      className: "column-center"
    },
    {
      Header: "Completed",
      accessor: "completed",
      className: "column-center"
    }
  ];

  public subColumns = [
    {
      id: "log",
      Header: "Log",
      Cell: props => (
        <Ansi children={props.original.log} /> // magic for ANSI color happens here!
      ),
      className: "column-left",
      style: { whiteSpace: "pre-wrap" } // allow words to wrap inside only this cell
    }
  ];

  public render() {
    const { events } = this.props;
    if (!events) {
      return null;
    }

    if (events.loading) {
      return (
        <Dimmer active>
          <Loader>Loading</Loader>
        </Dimmer>
      );
    }

    if (events.error) {
      return <div>Error! {events.error.message}</div>;
    }

    if (events.serviceEvents.length === 0) {
      return <Header as="h1">No Data Found</Header>;
    }

    return (
      <ReactTable
        width={100}
        filterable
        defaultPageSize={3}
        // defaultExpanded ={{0:true,1:false}}
        data={events.serviceEvents}
        columns={this.columns}
        className="-striped -highlight"
        minRows={1}
        SubComponent={row => {
          return (
            <ReactTable
              width={100}
              data={[events.serviceEvents[row.index]]}
              columns={this.subColumns}
              defaultPageSize={3}
              showPagination={false}
              style={{
                height: "500px" // This will force the table body to overflow and scroll, since there is not enough room
              }}
            />
          );
        }}
      />
    );
  }
}


