cluster.common package

Submodules

cluster.common.common_node module

class cluster.common.common_node.WorkFlowCommonNode[source]

Bases: object

Creates empty method stubs for loading data for WDNN.

check_next()[source]

Check whether all next nodes have already been searched.

check_prev()[source]

Check whether all previous nodes have already been searched.

decode_pad(input_list, max_len=0, pad_char='#', start_char='@')[source]

Decode a padded list of the form [pad_char] * pad_len + input back to the original tokens.

encode_pad(input_list, max_len=0, pad_char='#')[source]

Left-pad input_list with pad_char up to max_len, producing [pad_char] * pad_len + input.
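The pad helpers are documented only by their signatures; a minimal sketch of one plausible semantics (the exact behavior of encode_pad/decode_pad, and the role of start_char, are assumptions, not confirmed by the source):

```python
def encode_pad(input_list, max_len=0, pad_char='#'):
    """Left-pad a token list with pad_char up to max_len (hypothetical sketch)."""
    pad_len = max(0, max_len - len(input_list))
    return [pad_char] * pad_len + list(input_list)

def decode_pad(input_list, max_len=0, pad_char='#', start_char='@'):
    """Strip pad tokens and a leading start marker, if present (hypothetical sketch)."""
    out = [tok for tok in input_list if tok != pad_char]
    if out and out[0] == start_char:
        out = out[1:]
    return out
```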
find_next_node(node_name, node_list)[source]

Find the next node in node_list after node_name and return its name.

find_prev_node(node_name, node_list)[source]

Find the previous node in node_list before node_name and return its name.

get_cluster_exec_class(node_id)[source]

Return the execution class path for the given node_id.

get_linked_next_node_with_grp(grp)[source]

Return the linked next nodes that match the given group (grp).

get_linked_next_node_with_type(type)[source]

Return the linked next nodes that match the given type.

get_linked_prev_node_with_cond(val, cond='has_value')[source]

Walk backward through linked previous nodes until a node satisfying the condition is found (e.g. cond='has_value' for val).

get_linked_prev_node_with_grp(grp)[source]

Return the linked previous nodes that match the given group (grp).

get_linked_prev_node_with_type(type)[source]

Return the linked previous nodes that match the given type.

get_net_id()[source]

Return the network id.

get_net_node_id()[source]

Return the network node id.

get_net_ver()[source]

Return the network version.

get_next_node(grp=None, type=None)[source]

Return the next node(s), optionally filtered by grp and type.

get_next_node_as_dict()[source]

Return the next nodes as a dict.

get_node_def()[source]

Return the node definition string.

get_node_grp()[source]

Return the node group string.

get_node_name()[source]

Return the node name string.

get_node_type()[source]

Return the node type string.

get_onehot_vector(sent)[source]

Convert a sentence to a list of one-hot vectors.

get_onehot_word(vec_list)[source]

Convert a list of one-hot vectors back to words.
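The one-hot pair above can be sketched as a simple round trip; the fixed character vocabulary here is a made-up example, since the real class builds its vocabulary from training data:

```python
import numpy as np

# hypothetical character vocabulary; the real class derives this from data
vocab = {'h': 0, 'e': 1, 'l': 2, 'o': 3}

def get_onehot_vector(sent):
    """Convert a sentence (string) to a list of one-hot vectors (sketch)."""
    vecs = []
    for ch in sent:
        v = np.zeros(len(vocab))
        v[vocab[ch]] = 1.0  # set the dimension for this character
        vecs.append(v)
    return vecs

def get_onehot_word(vec_list):
    """Inverse mapping: one-hot vectors back to characters (sketch)."""
    inv = {i: ch for ch, i in vocab.items()}
    return [inv[int(np.argmax(v))] for v in vec_list]
```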

get_prev_node(grp=None, type=None)[source]

Return the previous node(s), optionally filtered by grp and type.

get_prev_node_as_dict()[source]

Return the previous nodes as a dict.

get_search_flag()[source]

Return the flag used for tree search.

load_class(class_path, class_name)[source]

Dynamically load and return the class class_name from the module at class_path.
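Dynamic class loading of this kind is typically a thin wrapper over importlib; a minimal sketch (the real method may add error handling or path rewriting not shown in the docs):

```python
import importlib

def load_class(class_path, class_name):
    """Import the module at class_path and return the attribute class_name (sketch)."""
    module = importlib.import_module(class_path)
    return getattr(module, class_name)

# usage: fetch a class by its dotted module path and name
cls = load_class('collections', 'OrderedDict')
```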

load_data(node_id, parm='all')[source]
run(conf_data)[source]
set_net_id(net_id)[source]

Set the network id.

set_net_node_id(node_id)[source]

Set the network node id.

set_net_ver(net_ver)[source]

Set the network version.

set_next_node(key, node_cls)[source]

Register node_cls as a next node under key.

set_node_def(name)[source]

Set the node definition string.

set_node_grp(node_grp)[source]

Set the node group string.

set_node_name(node_name)[source]

Set the node name string.

set_node_type(node_type)[source]

Set the node type string.

set_prev_node(key, node_cls)[source]

Register node_cls as a previous node under key.

set_search_flag()[source]

Set the flag used for tree search.

cluster.common.neural_common_bilismcrf module

class cluster.common.neural_common_bilismcrf.BiLstmCommon[source]

Bases: object

Common utility functions for the BiLSTM-CRF model.

class CoNLLDataset(filename, processing_word=None, processing_tag=None, max_iter=None, all_line=True)[source]

Bases: object

Class that iterates over a CoNLL dataset.

BiLstmCommon.NONE = 'O'
BiLstmCommon.NUM = '$NUM$'
BiLstmCommon.UNK = '$UNK$'
BiLstmCommon.export_trimmed_glove_vectors(vocab, model, trimmed_filename)[source]

Saves GloVe vectors as a numpy array.

Args:
vocab: dictionary vocab[word] = index
model: the loaded embedding model providing the vectors
trimmed_filename: path where to store the matrix in npy format
BiLstmCommon.get_char_vocab(dataset, chars=None)[source]
Args:
dataset: an iterator yielding (sentence, tags) tuples
Returns:
a set of all the characters in the dataset
BiLstmCommon.get_chunk_type(tok, idx_to_tag)[source]
BiLstmCommon.get_chunks(seq, tags)[source]
Args:
seq: [4, 4, 0, 0, ...] a sequence of label ids
tags: dict mapping tag name to id, e.g. tags["O"] = 0
Returns:
list of (chunk_type, chunk_start, chunk_end)
Example:
seq = [4, 5, 0, 3], tags = {"B-PER": 4, "I-PER": 5, "B-LOC": 3}
result = [("PER", 0, 2), ("LOC", 3, 4)]
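The example above can be reproduced with a sketch of a standard BIO chunking routine; this is an assumed implementation (the tags dict is assumed to also contain the "O" label), not the library's verified code:

```python
def get_chunk_type(tok, idx_to_tag):
    """Split a tag id like 4 -> ("B", "PER") using the inverse tag map."""
    tag_name = idx_to_tag[tok]
    return tag_name.split('-')[0], tag_name.split('-')[-1]

def get_chunks(seq, tags, none_label='O'):
    """Collect (chunk_type, chunk_start, chunk_end) spans from a BIO label sequence."""
    default = tags[none_label]
    idx_to_tag = {idx: tag for tag, idx in tags.items()}
    chunks = []
    chunk_type, chunk_start = None, None
    for i, tok in enumerate(seq):
        if tok == default and chunk_type is not None:
            # an "O" label closes any open chunk
            chunks.append((chunk_type, chunk_start, i))
            chunk_type, chunk_start = None, None
        elif tok != default:
            tok_class, tok_type = get_chunk_type(tok, idx_to_tag)
            if chunk_type is None:
                chunk_type, chunk_start = tok_type, i
            elif tok_type != chunk_type or tok_class == 'B':
                # a type change or a fresh B- tag starts a new chunk
                chunks.append((chunk_type, chunk_start, i))
                chunk_type, chunk_start = tok_type, i
    if chunk_type is not None:
        chunks.append((chunk_type, chunk_start, len(seq)))
    return chunks
```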
BiLstmCommon.get_processing_word(vocab_words=None, vocab_chars=None, lowercase=False, chars=False)[source]
Args:
vocab_words: dict[word] = idx
vocab_chars: dict[char] = idx
Returns:
f("cat") = ([12, 4, 32], 12345)
= (list of char ids, word id)
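The returned function is naturally a closure over the vocabularies; a sketch under the assumption that chars=True yields (char ids, word id) and that unknown characters are simply skipped:

```python
def get_processing_word(vocab_words=None, vocab_chars=None,
                        lowercase=False, chars=False):
    """Build f(word) -> word id, or ([char ids], word id) when chars=True (sketch)."""
    def f(word):
        char_ids = []
        if chars and vocab_chars is not None:
            # char ids are taken from the original (non-lowercased) word
            char_ids = [vocab_chars[c] for c in word if c in vocab_chars]
        if lowercase:
            word = word.lower()
        word_id = vocab_words[word] if vocab_words is not None else word
        if chars and vocab_chars is not None:
            return char_ids, word_id
        return word_id
    return f
```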
BiLstmCommon.get_trimmed_glove_vectors(filename)[source]
Args:
filename: path to the npz file
Returns:
matrix of embeddings (np array)
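The export/load pair amounts to a numpy save/load round trip. This sketch simplifies the export side to take a ready-made matrix (the documented method instead takes vocab and model and builds the matrix first); the 'embeddings' key is an assumption:

```python
import os
import tempfile
import numpy as np

def export_trimmed_vectors(embeddings, trimmed_filename):
    """Save an embedding matrix as a compressed .npz file (sketch)."""
    np.savez_compressed(trimmed_filename, embeddings=embeddings)

def get_trimmed_glove_vectors(filename):
    """Load the embedding matrix back from the .npz file (sketch)."""
    return np.load(filename)['embeddings']
```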
BiLstmCommon.get_vocabs(datasets, vocab=None, tags=None)[source]
Args:
datasets: a list of dataset objects
Returns:
a set of all the words in the datasets
BiLstmCommon.load_vocab(filename)[source]
Args:
filename: file with a word per line
Returns:
d: dict[word] = index
BiLstmCommon.minibatches(data, minibatch_size)[source]
Args:
data: generator of (sentence, tags) tuples
minibatch_size: (int) maximum batch size
Returns:
generator of (sentence_batch, tag_batch) tuples
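Batching a (sentence, tags) generator this way can be sketched as follows; the trailing partial batch is assumed to be yielded as well:

```python
def minibatches(data, minibatch_size):
    """Yield (sentence_batch, tag_batch) pairs of at most minibatch_size items (sketch)."""
    x_batch, y_batch = [], []
    for x, y in data:
        if len(x_batch) == minibatch_size:
            yield x_batch, y_batch
            x_batch, y_batch = [], []
        x_batch.append(x)
        y_batch.append(y)
    if x_batch:
        # emit the final, possibly smaller, batch
        yield x_batch, y_batch
```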
BiLstmCommon.pad_sequences(sequences, pad_tok, nlevels=1)[source]
Args:
sequences: a generator of lists or tuples
pad_tok: the token to pad with
nlevels: padding depth (1 for word-level, 2 for char-level)
Returns:
a list of lists where each sublist has the same length
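For the nlevels=1 case, the padding step can be sketched as below; returning the original lengths alongside the padded batch is an assumption (it is the usual pattern for feeding sequence lengths to an RNN):

```python
def pad_sequences(sequences, pad_tok):
    """Pad every sequence to the longest length in the batch (nlevels=1 sketch)."""
    max_length = max(len(seq) for seq in sequences)
    sequence_padded, sequence_length = [], []
    for seq in sequences:
        seq = list(seq)
        # truncate to max_length, then right-pad with pad_tok
        padded = seq[:max_length] + [pad_tok] * max(max_length - len(seq), 0)
        sequence_padded.append(padded)
        sequence_length.append(min(len(seq), max_length))
    return sequence_padded, sequence_length
```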
BiLstmCommon.write_char_embedding(vocab, trimmed_filename)[source]

Writes character embeddings for a vocab to a file.

Args:
vocab: iterable that yields words
trimmed_filename: path where to store the embeddings
BiLstmCommon.write_vocab(vocab, filename)[source]

Writes a vocab to a file

Args:
vocab: iterable that yields words
filename: path to the vocab file
Returns:
None; writes one word per line
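write_vocab and load_vocab form a word-per-line round trip; a sketch assuming line order defines the index:

```python
import os
import tempfile

def write_vocab(vocab, filename):
    """Write one word per line (sketch)."""
    with open(filename, 'w') as f:
        f.write('\n'.join(vocab))

def load_vocab(filename):
    """Read a word-per-line file back into dict[word] = line index (sketch)."""
    with open(filename) as f:
        return {word.strip(): idx for idx, word in enumerate(f)}
```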

cluster.common.neural_common_monitor module

class cluster.common.neural_common_monitor.NeuralCommonMonitor[source]

Bases: object

cluster.common.neural_common_wdnn module

class cluster.common.neural_common_wdnn.NeuralCommonWdnn[source]

Bases: object

df_validation(df, dataconf)[source]
input_fn(df, nnid, dataconf)[source]

Wide & Deep Network input tensor maker V1.0 16.11.04 Initial

:param df : dataframe from hbase :param df, nnid :return: tensor sparse, constraint
wdnn_build(model_type, nodeid, hidden_layers, activation_fn, dataconf, model_dir, train=True, dememsion_auto_flag=False)[source]

Wide & Deep network builder.

:param model_type: type of model to build
:param nodeid: node id
:param model_dir: directory of the WDNN model checkpoint
:param train: True for training, False for prediction
:return: TensorFlow network model

cluster.common.train_summary_accloss_info module

class cluster.common.train_summary_accloss_info.TrainSummaryAccLossInfo(conf=None)[source]

Bases: object

get_acc_info()[source]
get_loss_info()[source]
get_nn_batch_ver_id()[source]
get_nn_id()[source]
get_nn_wf_ver_id()[source]
set_acc_info(acc)[source]
set_loss_info(loss)[source]
set_nn_batch_ver_id(nn_batch_ver_id)[source]
set_nn_id(nn_id)[source]
set_nn_wf_ver_id(nn_wf_ver_id)[source]

cluster.common.train_summary_info module

class cluster.common.train_summary_info.TrainSummaryInfo(conf=None, type=None)[source]

Bases: object

get_accuracy()[source]

Return the test accuracy as a float.

get_nn_batch_ver_id()[source]
get_nn_id()[source]
get_nn_wf_ver_id()[source]
get_result_info()[source]
save_result_info(result)[source]
set_nn_batch_ver_id(nn_batch_ver_id)[source]
set_nn_id(nn_id)[source]
set_nn_wf_ver_id(nn_wf_ver_id)[source]
set_result_data_format(config)[source]

Set the config parameters; setting the result format is required before use.

set_result_info(label, predict, input=None, acc=None, coord_x=None, coord_y=None)[source]

Module contents