build elasticsearch from source and debug it in Eclipse


git clone https://github.com/elasticsearch/elasticsearch.git

// start eclipse

./eclipse

// import elasticsearch into Eclipse

File->Import->Maven->Existing Maven Projects

// and browse to the elasticsearch source folder where pom.xml is located

// run org.elasticsearch.bootstrap.Bootstrap

// to activate the log settings, update logging.yml in elasticsearch/config

file: ${path.logs}/${cluster.name}.log -> file: /tmp/elasticsearch.log

es.logger.level: INFO -> es.logger.level: DEBUG
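
With those two changes, the relevant parts of logging.yml look roughly like this (a sketch; the remaining appender settings stay as shipped):

es.logger.level: DEBUG
rootLogger: ${es.logger.level}, console, file

appender:
  file:
    type: dailyRollingFile
    file: /tmp/elasticsearch.log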

//see logs

tail -f /tmp/elasticsearch.log

// an example

curl -XPUT "http://localhost:9200/movies/movie/1" -d'
{
"title": "The Godfather",
"director": "Francis Ford Coppola",
"year": 1972
}'
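
// to verify the document was indexed and watch the matching DEBUG entries in the log, fetch it back

curl -XGET "http://localhost:9200/movies/movie/1"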


a sample one-to-many association model in sails.js


sails generate new MySample --no-front-end
cd MySample
npm install
sails generate api pet
sails generate api user

api/models/Pet.js

module.exports = {
  attributes: {
    name: 'string',
    color: 'string',
    owner: {
      model: 'user'
    }
  }
};

api/models/User.js

module.exports = {
  attributes: {
    name: 'string',
    age: 'integer',
    pets: {
      collection: 'pet',
      via: 'owner'
    }
  }
};

sails lift

curl -X POST "http://localhost:1337/user?name=farshad&age=35"
curl -X POST "http://localhost:1337/pet?name=papi&color=black&owner=1"

curl -X GET "http://localhost:1337/user"
[
  {
    "pets": [
      {
        "name": "papi",
        "color": "black",
        "owner": 1,
        "createdAt": "2014-11-20T13:19:16.612Z",
        "updatedAt": "2014-11-20T13:19:16.612Z",
        "id": 1
      }
    ],
    "name": "farshad",
    "age": 35,
    "createdAt": "2014-11-20T13:16:29.250Z",
    "updatedAt": "2014-11-20T13:16:29.250Z",
    "id": 1
  }
]

curl -X GET "http://localhost:1337/pet/1"
{
  "owner": {
    "name": "farshad",
    "age": 35,
    "createdAt": "2014-11-20T13:16:29.250Z",
    "updatedAt": "2014-11-20T13:16:29.250Z",
    "id": 1
  },
  "name": "papi",
  "color": "black",
  "createdAt": "2014-11-20T13:19:16.612Z",
  "updatedAt": "2014-11-20T13:19:16.612Z",
  "id": 1
}
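
The blueprint routes should also expose the association directly; assuming the default blueprint settings, the user's pets can be fetched on their own:

curl -X GET "http://localhost:1337/user/1/pets"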

Asterisk Call Out Error

create a text file

root@local:/var/spool/asterisk/outgoing# emacs -nw /tmp/hello-world.call

put the following call data in it


Channel: Zap/g0/09173371425
Application: Playback
Data: hello-world

copy it into the /var/spool/asterisk/outgoing folder, then check the log file at /var/log/asterisk/full

I got an error


[Nov 18 13:36:16] NOTICE[20501] channel.c: Unable to request channel Zap/g0/09173371425
[Nov 18 13:36:16] NOTICE[20501] pbx_spool.c: Call failed to go through, reason (8) Congestion (circuits busy)

Then I restarted Asterisk and checked the log file again.

I found the following lines in the startup log.


[Nov 18 13:40:35] VERBOSE[20562] logger.c: == Primary D-Channel on span 2 up
[Nov 18 13:40:45] VERBOSE[20562] logger.c: -- B-channel 0/1 successfully restarted on span 2

so I changed my call file to


Channel: Zap/g1/09173371425
Application: Playback
Data: hello-world

because the log shows that Asterisk only brings up span 2. After that, the call goes out and the hello-world sound is played.
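
A call file can also carry retry options in case the call fails again; a sketch of the same file with the usual keys added (the values are only examples). Note that call files are best moved (mv) into the outgoing folder rather than copied, so Asterisk does not pick up a half-written file.

Channel: Zap/g1/09173371425
MaxRetries: 2
RetryTime: 60
WaitTime: 30
Application: Playback
Data: hello-world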

install cassandra in Windows

after installing Cassandra 2.1.2 on Windows 8 and starting it, I got the following error:

ERROR 20:20:43 Exception encountered during startup
java.lang.RuntimeException: Incompatible SSTable found. Current version ka is unable to read file: \var\lib\cassandra\data\system\schema_keyspaces\system-schema_keyspaces-ic-1. Please run upgradesstables.
	at org.apache.cassandra.db.ColumnFamilyStore.createColumnFamilyStore(ColumnFamilyStore.java:427) ~[apache-cassandra-2.1.2.jar:2.1.2]
	at org.apache.cassandra.db.ColumnFamilyStore.createColumnFamilyStore(ColumnFamilyStore.java:404) ~[apache-cassandra-2.1.2.jar:2.1.2]
	at org.apache.cassandra.db.Keyspace.initCf(Keyspace.java:327) ~[apache-cassandra-2.1.2.jar:2.1.2]
	at org.apache.cassandra.db.Keyspace.<init>(Keyspace.java:280) ~[apache-cassandra-2.1.2.jar:2.1.2]
	at org.apache.cassandra.db.Keyspace.open(Keyspace.java:122) ~[apache-cassandra-2.1.2.jar:2.1.2]
	at org.apache.cassandra.db.Keyspace.open(Keyspace.java:99) ~[apache-cassandra-2.1.2.jar:2.1.2]
	at org.apache.cassandra.db.SystemKeyspace.checkHealth(SystemKeyspace.java:558) ~[apache-cassandra-2.1.2.jar:2.1.2]
	at org.apache.cassandra.service.CassandraDaemon.setup(CassandraDaemon.java:214) [apache-cassandra-2.1.2.jar:2.1.2]
	at org.apache.cassandra.service.CassandraDaemon.activate(CassandraDaemon.java:448) [apache-cassandra-2.1.2.jar:2.1.2]
	at org.apache.cassandra.service.CassandraDaemon.main(CassandraDaemon.java:537) [apache-cassandra-2.1.2.jar:2.1.2]

It can be solved by pointing the data and commitlog directories at real Windows paths.

In

E:\DataStax Community\apache-cassandra\conf\cassandra.yaml

change data_file_directories (which defaults to $CASSANDRA_HOME/data/data) to

data_file_directories:
- E:\DataStax Community\apache-cassandra\data

and commitlog_directory (which defaults to $CASSANDRA_HOME/data/commitlog) to

commitlog_directory: E:\DataStax Community\apache-cassandra\log

I also got the following error when loading a trigger with nodetool -h localhost reloadtriggers:

Trigger directory doesn't exist, please create it and try again.

To solve it, in

E:\DataStax Community\apache-cassandra\bin\cassandra.bat

add the triggers directory to the startup options:

-Dlogback.configurationFile=logback.xml^
-Dcassandra.triggers_dir="E:\DataStax Community\apache-cassandra\conf\triggers"
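
The triggers folder itself still has to exist; it can be created from a command prompt (same install path as above):

mkdir "E:\DataStax Community\apache-cassandra\conf\triggers"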

nodejs-express4-mysql in 2 minutes

a sample application that shows how a RESTful API service can be built with Node.js + Express 4 + MySQL.

just clone the following project and follow its readme.

git clone https://github.com/pjanaya/nodejs-express4-mysql

and test with:

curl --data "username=irman&password=123456&press=%20OK%20" http://localhost:8080/api/users/

{"message":"User created!"}

curl -XGET "http://localhost:8080/api/users/4"

{"id":4,"username":"irman","password":"7c4a8d09ca3762af61e59520943dc26494f8941b"}
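
Assuming the same routes from the project's readme, the whole collection can be listed as well:

curl -XGET "http://localhost:8080/api/users"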

create a Cassandra Trigger with User Data Type and Set Type

create schema


CREATE KEYSPACE testkeystore WITH replication = {'class':'SimpleStrategy', 'replication_factor':3};

CREATE TYPE testkeystore.phone (
  name text,
  number int
);

CREATE TYPE testkeystore.addressdetail (
  number int,
  street text,
  code int,
  city text
);

CREATE TYPE testkeystore.address (
  name text,
  howlongyear int,
  addressdetail frozen<addressdetail>
);

CREATE TYPE testkeystore.contact (
  email set<text>
);

CREATE TABLE testkeystore.person (
  person_id text PRIMARY KEY,
  firstname text,
  lastname text,
  age int,
  jobs set<text>,
  address set<frozen<address>>,
  contact frozen<contact>
);
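
To double-check the types and the table before wiring up the trigger, cqlsh can echo the whole schema back:

DESCRIBE KEYSPACE testkeystore;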

trigger source

package com.cassandra.trigger;

import java.nio.ByteBuffer;
import java.util.Collection;
import java.util.HashMap;
import java.util.LinkedHashSet;
import java.util.Map;
import java.util.Set;

import org.apache.cassandra.config.CFMetaData;
import org.apache.cassandra.db.Cell;
import org.apache.cassandra.db.ColumnFamily;
import org.apache.cassandra.db.Mutation;
import org.apache.cassandra.db.marshal.Int32Type;
import org.apache.cassandra.db.marshal.SetType;
import org.apache.cassandra.db.marshal.UTF8Type;
import org.apache.cassandra.db.marshal.UserType;
import org.apache.cassandra.triggers.ITrigger;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

/**
 * Builds a JSON representation of every updated row and writes it to the
 * Cassandra system log. The trigger adds no extra mutations of its own.
 */
public class ElasticsearchTrigger implements ITrigger {

    private static final Logger logger = LoggerFactory
            .getLogger(ElasticsearchTrigger.class);

    public Collection<Mutation> augment(ByteBuffer key, ColumnFamily update) {
        CFMetaData cfm = update.metadata();
        String localKey = cfm.getKeyValidator().getString(key);
        logger.info("key={}.", localKey);

        // set-typed columns are accumulated here, simple and UDT columns there
        Map<String, String> setMap = new HashMap<String, String>();
        Map<String, String> simpleMap = new HashMap<String, String>();
        StringBuffer json = new StringBuffer();
        json.append("{" + wrappedDoubleQuotes("id") + ":" + localKey + ",");

        for (Cell cell : update) {
            if (cell.name().isCollectionCell()
                    && cfm.getValueValidator(cell.name()) instanceof SetType) {
                // one cell per set element; collect them under the column name
                String name = wrappedDoubleQuotes(cell.name()
                        .cql3ColumnName(cfm).toString());

                SetType setType = (SetType) cfm.getValueValidator(cell.name());
                String value = getSetTypeValue(cell.name().collectionElement(),
                        setType);
                if (setType.elements instanceof UserType) {
                    StringBuffer utsb = getUserTypeValue(cell.name()
                            .collectionElement(), (UserType) setType.elements);
                    value = utsb.toString();
                } else {
                    if (setType.elements instanceof Int32Type) {
                        value = setType.elements.getString(cell.name()
                                .collectionElement());
                    } else if (setType.elements instanceof UTF8Type) {
                        value = wrappedDoubleQuotes(setType.elements
                                .getString(cell.name().collectionElement()));
                    }
                }

                if (value != null && !value.isEmpty()) {
                    if (setMap.get(name) == null)
                        setMap.put(name, value);
                    else
                        setMap.put(name, setMap.get(name) + "," + value);
                }
            } else {
                // simple column or a frozen user-defined type
                String name = wrappedDoubleQuotes(cfm.comparator.getString(cell
                        .name()));
                String value = cfm.getValueValidator(cell.name()).getString(
                        cell.value());
                if (cfm.getValueValidator(cell.name()) instanceof UTF8Type)
                    simpleMap.put(name, wrappedDoubleQuotes(value));
                else if (cfm.getValueValidator(cell.name()) instanceof Int32Type)
                    simpleMap.put(name, value);
                else if (cfm.getValueValidator(cell.name()) instanceof UserType)
                    simpleMap.put(
                            name,
                            getUserTypeValue(
                                    cell.value(),
                                    (UserType) cfm.getValueValidator(cell
                                            .name())).toString());
            }
        }

        // emit simple columns first, then set columns as JSON arrays
        boolean flag = false;
        for (String name : simpleMap.keySet()) {
            if (!name.isEmpty()) {
                if (flag)
                    json.append(",");
                flag = true;
                json.append(name);
                json.append(":");
                json.append(simpleMap.get(name));
            }
        }

        for (String name : setMap.keySet()) {
            if (flag)
                json.append(",");
            flag = true; // keep separating any following entries with commas
            json.append(name);
            json.append(":");
            json.append("[");
            json.append(setMap.get(name));
            json.append("]");
        }

        json.append("}");
        logger.info("json={}.", json.toString());
        return null;
    }

    // renders a frozen user-defined type value as a JSON object
    private StringBuffer getUserTypeValue(ByteBuffer values, UserType userType) {
        StringBuffer utsb = new StringBuffer();
        utsb.append("{");
        boolean flag = false;
        for (int i = 0; i < userType.size(); i++) {
            ByteBuffer[] byteBuffers = userType.split(values);
            String userTypeValue = userType.fieldType(i).getString(
                    byteBuffers[i]);
            if (flag)
                utsb.append(",");
            flag = true;
            utsb.append(wrappedDoubleQuotes(UTF8Type.instance
                    .getString(userType.fieldName(i))));
            utsb.append(":");
            if (userType.fieldType(i) instanceof UTF8Type) {
                utsb.append(wrappedDoubleQuotes(userTypeValue));
            } else {
                if (userType.fieldType(i) instanceof Int32Type) {
                    utsb.append(userTypeValue);
                } else if (userType.fieldType(i) instanceof UserType) {
                    utsb.append(getUserTypeValue(byteBuffers[i],
                            (UserType) userType.fieldType(i)));
                } else if (userType.fieldType(i) instanceof SetType) {
                    utsb.append(getSetTypeValue2(byteBuffers[i],
                            (SetType) userType.fieldType(i)));
                }
            }
        }
        utsb.append("}");
        return utsb;
    }

    // renders a single element of a top-level set column
    private String getSetTypeValue(ByteBuffer bytevalue, SetType setType) {
        String value = "";
        if (setType.elements instanceof UserType) {
            value = getUserTypeValue(bytevalue, (UserType) setType.elements)
                    .toString();
        } else {
            if (setType.elements instanceof UTF8Type)
                value = "'" + setType.elements.getString(bytevalue) + "'";
            else if (setType.elements instanceof Int32Type)
                value = setType.elements.getString(bytevalue);
        }
        return value;
    }

    // renders a whole set that is nested inside a user-defined type
    private String getSetTypeValue2(ByteBuffer bytevalue, SetType setType) {
        StringBuffer value = new StringBuffer();
        value.append("[");
        Set s = (Set) setType.getSerializer()
                .deserializeForNativeProtocol(bytevalue, 3);
        ByteBuffer input = bytevalue.duplicate();
        int n = org.apache.cassandra.serializers.CollectionSerializer
                .readCollectionSize(input, 3);
        Set setbb = new LinkedHashSet(n);
        for (int i = 0; i < n; i++) {
            ByteBuffer databb = org.apache.cassandra.serializers.CollectionSerializer
                    .readValue(input, 3);
            setbb.add(databb);
        }
        boolean flag = false;
        for (Object o : s) {
            if (flag)
                value.append(",");
            flag = true;
            if (setType.elements instanceof UserType) {
                value.append(getUserTypeValue(bytevalue,
                        (UserType) setType.elements).toString());
            } else {
                if (setType.elements instanceof UTF8Type)
                    value.append(wrappedDoubleQuotes((String) o));
                else if (setType.elements instanceof Int32Type)
                    value.append(setType.elements.getString(bytevalue));
            }
        }
        value.append("]");
        return value.toString();
    }

    private String wrappedDoubleQuotes(String value) {
        return "\"" + value + "\"";
    }
}
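
For reference, the ITrigger contract this class implements boils down to a single callback in Cassandra 2.1 (a sketch of the interface from the org.apache.cassandra.triggers package; returning null, as above, simply means no extra mutations are applied):

package org.apache.cassandra.triggers;

import java.nio.ByteBuffer;
import java.util.Collection;

import org.apache.cassandra.db.ColumnFamily;
import org.apache.cassandra.db.Mutation;

public interface ITrigger
{
    public Collection<Mutation> augment(ByteBuffer key, ColumnFamily update);
}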

deploy trigger


cd ELASTICSEARCH_CASSANDRA_TRIGGER_HOME
mvn clean install
sudo cp target/elasticsearch-cassandra-trigger-0.0.1-SNAPSHOT.jar /etc/cassandra/triggers
nodetool -h localhost reloadtriggers
cqlsh
CREATE TRIGGER IF NOT EXISTS elasticsearchtrigger ON testkeystore.person USING 'com.cassandra.trigger.ElasticsearchTrigger';
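
If the CREATE TRIGGER statement went through, the trigger is recorded in the system schema; a quick way to confirm from cqlsh (assuming Cassandra 2.1's system tables):

SELECT * FROM system.schema_triggers;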

execute update


INSERT INTO testkeystore.person(person_id,firstname,lastname,age,jobs,address,contact) VALUES('123456','samplefirstname','samplelastname',18,{'job1','job2'},{{name:'home',howlongyear:10,addressdetail:{number:123,street:'samplestreet',code:4569,city:'samplecity'}},{name:'work',howlongyear:2,addressdetail:{number:456,street:'samplestreet2',code:1234,city:'samplecity2'}}},{ email:{'email1@example.com','email2@example.com'} });

check log with


sudo tail -f /var/log/cassandra/system.log

INFO [SharedPool-Worker-1] 2014-11-11 10:26:35,838 ElasticsearchTrigger.java:111 - json={"id":123456,"firstname":"samplefirstname","age":18,"lastname":"samplelastname","contact":{"email":["email1@example.com","email2@example.com"]},"jobs":["job1","job2"],"address":[{"name":"home","howlongyear":10,"addressdetail":{"number":123,"street":"samplestreet","code":4569,"city":"samplecity"}},{"name":"work","howlongyear":2,"addressdetail":{"number":456,"street":"samplestreet2","code":1234,"city":"samplecity2"}}]}.
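
The logged JSON is already shaped so it could be pushed into Elasticsearch by hand to try out a mapping; a sketch (the index and type names are only illustrative, they are not produced by the trigger):

curl -XPUT "http://localhost:9200/testkeystore/person/123456" -d '{"id":123456,"firstname":"samplefirstname","lastname":"samplelastname","age":18}'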

cassandra trigger

to run the Cassandra 2.1.1 InvertedIndex trigger sample from
https://github.com/apache/cassandra/tree/trunk/examples/triggers

git clone https://github.com/apache/cassandra.git
cd cassandra
ant
cd examples/triggers/
ant jar
sudo cp build/trigger-example.jar /etc/cassandra/triggers
sudo cp conf/* /etc/cassandra/
nodetool -h localhost reloadtriggers
cqlsh
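
The conf files copied above include InvertedIndex.properties, which tells the sample trigger which keyspace and table to write the index into; its contents are essentially the following (as shipped in the example):

keyspace=Keyspace1
columnfamily=InvertedIndex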

cqlsh>CREATE KEYSPACE "Keyspace1" WITH replication = {'class':'SimpleStrategy', 'replication_factor':3};

cqlsh>CREATE TABLE IF NOT EXISTS "Keyspace1"."InvertedIndex"(
user_id text PRIMARY KEY,
user_name text
);

cqlsh>CREATE TABLE IF NOT EXISTS "Keyspace1"."Standard1"(
user_id text PRIMARY KEY,
user_name text
);

cqlsh>CREATE TRIGGER test1 ON "Keyspace1"."Standard1" USING 'org.apache.cassandra.triggers.InvertedIndex';

cqlsh>INSERT INTO "Keyspace1"."Standard1"(user_id,user_name) VALUES('1','Bob');

cqlsh> select  * from "Keyspace1"."Standard1";

 user_id | user_name
---------+-----------
       1 |       Bob

(1 rows)
cqlsh> select  * from "Keyspace1"."InvertedIndex";

 user_id | user_name
---------+-----------
     Bob |         1