Install Elasticsearch on CentOS

Download and install the public signing key:

rpm --import https://artifacts.elastic.co/GPG-KEY-elasticsearch

Installing from the RPM repository

Create a file called elasticsearch.repo in the /etc/yum.repos.d/ directory for RedHat-based distributions, or in the /etc/zypp/repos.d/ directory for OpenSuSE-based distributions, containing:

[elasticsearch-6.x]
name=Elasticsearch repository for 6.x packages
baseurl=https://artifacts.elastic.co/packages/6.x/yum
gpgcheck=1
gpgkey=https://artifacts.elastic.co/GPG-KEY-elasticsearch
enabled=1
autorefresh=1
type=rpm-md

To install 5.x instead, use this repository definition:

[elasticsearch-5.x]
name=Elasticsearch repository for 5.x packages
baseurl=https://artifacts.elastic.co/packages/5.x/yum
gpgcheck=1
gpgkey=https://artifacts.elastic.co/GPG-KEY-elasticsearch
enabled=1
autorefresh=1
type=rpm-md


And your repository is ready for use. You can now install Elasticsearch with one of the following commands:

sudo yum install elasticsearch 


After the installation output completes, start Elasticsearch and test its status.
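On a systemd-based CentOS 7 host, the start-and-check step looks roughly like this (a sketch; on SysV init the equivalent is `service elasticsearch start`):

```shell
# Register and start the service installed by the RPM package
sudo systemctl daemon-reload
sudo systemctl enable elasticsearch.service
sudo systemctl start elasticsearch.service

# Elasticsearch answers on port 9200 by default; a quick request
# returns the node name, cluster name, and version.
curl http://localhost:9200/
```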

Confluent Platform quick start

https://docs.confluent.io/3.1.1/quickstart.html#quickstart

Using the Oracle Developer VM to start with (saves time on CentOS setup, etc.)


Fixes during the steps

1. Change the Schema Registry port in the properties file:

/etc/schema-registry/schema-registry.properties

(if port 8081 is already in use)
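For example, the relevant line in schema-registry.properties could look like this (a sketch; depending on the Schema Registry version the key is `listeners` or the older `port`, and 8087 is the replacement port used in these notes):

```
# pick a free port if 8081 is already taken; these notes use 8087
listeners=http://0.0.0.0:8087
# older releases use instead:
# port=8087
```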

2. To start the Avro producer, specify the schema registry URL, since the default port 8081 was changed:

./usr/bin/kafka-avro-console-producer --broker-list localhost:9092 --topic test --property value.schema='{"type":"record","name":"myrecord","fields":[{"name":"f1","type":"string"}]}' --property schema.registry.url=http://localhost:8087
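Note that pasting this command from a browser often turns the quotes into smart quotes; the schema must be plain-ASCII JSON or the producer will reject it. A quick sanity check (a sketch, using python3 as the JSON parser):

```shell
# Straight quotes only: smart quotes make this invalid JSON.
SCHEMA='{"type":"record","name":"myrecord","fields":[{"name":"f1","type":"string"}]}'
RESULT=$(printf '%s' "$SCHEMA" | python3 -c 'import json,sys; json.load(sys.stdin); print("valid")')
echo "$RESULT"
```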

Change the path accordingly for starting ZooKeeper, Kafka, and the Schema Registry:

 ./bin/kafka-server-start ./etc/kafka/server.properties

to

 ./usr/bin/kafka-server-start ./etc/kafka/server.properties
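Putting the path fix together, the start order would look roughly like this (a sketch; each command runs in its own terminal, and the .properties paths assume the default Confluent layout):

```shell
# 1. ZooKeeper first
./usr/bin/zookeeper-server-start ./etc/kafka/zookeeper.properties
# 2. then the Kafka broker
./usr/bin/kafka-server-start ./etc/kafka/server.properties
# 3. then the Schema Registry (with the port change described above)
./usr/bin/schema-registry-start ./etc/schema-registry/schema-registry.properties
```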

Curl post command issue

Error:

1)

curl -H 'Content-Type:  application/x-ndjson' -XPOST 'localhost:9200/bank/account/_bulk?pretty' --data-binary @accounts.json

Working (remove the extra space after "Content-Type:"):

curl -H 'Content-Type: application/x-ndjson' -XPOST 'localhost:9200/bank/account/_bulk?pretty' --data-binary @accounts.json

Correct syntax for Windows: change the single quotes to double quotes (the single-quoted form fails with an "http not supported" error).

Wrong:

curl -H 'Content-Type:application/x-ndjson' -XPOST 'http://localhost:9200/bank/account/_bulk?pretty' --data-binary @C:\BDS\KibanaData\accounts.json

Right:

curl -H "Content-Type:application/x-ndjson" -XPOST "http://localhost:9200/bank/account/_bulk?pretty" --data-binary @C:\BDS\KibanaData\accounts.json

and it works!
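For reference, the _bulk endpoint expects newline-delimited JSON: one action line, then one source line per document, with a trailing newline. A minimal accounts.json in that shape (a sketch with illustrative sample records, not the full tutorial data):

```shell
# Two documents for the bank/account bulk load: action line + source line each.
cat > accounts.json <<'EOF'
{"index":{"_id":"1"}}
{"account_number":1,"balance":39225,"firstname":"Amber"}
{"index":{"_id":"2"}}
{"account_number":2,"balance":28838,"firstname":"Alicia"}
EOF
# 2 documents -> 4 lines
wc -l < accounts.json
```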


How to run Confluent from Windows

  1. Start the ZooKeeper server. From the confluent-3.0.1\bin\windows directory, run: zookeeper-server-start.bat ..\..\etc\kafka\zookeeper.properties
  2. Start the Kafka server: kafka-server-start.bat ..\..\etc\kafka\server.properties
  3. Start the Schema Registry: schema-registry-start.bat ..\..\etc\schema-registry\schema-registry.properties
  4. Start a producer: kafka-console-producer.bat --broker-list localhost:9092 --topic text.test
  5. Start a consumer: kafka-console-consumer.bat --topic text.test --from-beginning --zookeeper localhost:2181
  6. Then start pumping messages from the producer and you should receive them on the consumer.