Previous articles:
Using the ELK Stack 1: Introduction
Using the ELK Stack 2: Automated log collection
Using the ELK Stack 3: Automated log collection, part 2
Using the ELK Stack 4: Loading data into ES (Logstash)
Using the ELK Stack 5: Loading data into ES (Bulk API)
First, let me start with that topic. The nodes of the ES cluster are VMs on VMware, and VMware enables hyper-threading by default. The 30-minute figure in the previous article for ingesting one day of logs was measured in that state. When I disabled hyper-threading, ingestion finished in about 10 minutes. Apparently, for workloads that max out the CPU, such as indexing via the Bulk API, having the feature enabled works against you.
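If you want to confirm from inside a Linux guest whether each core is exposing two hardware threads (i.e. whether hyper-threading is visible to the VM), one quick check is the following. This is a sketch of my own, not something from the original setup:

```shell
# Count how many logical CPUs share each physical core; more than one
# means hyper-threading (SMT) is exposed to this machine.
lscpu -p=CPU,CORE | grep -v '^#' | awk -F, '
    { seen[$2]++ }
    END {
        max = 1
        for (c in seen) if (seen[c] > max) max = seen[c]
        if (max > 1) print "HT on"; else print "HT off"
    }'
```

On the 4-CPU nodes described here, "HT off" would be the expected result after the change.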
Before ingesting the half year of data, the test environment's configuration changed. The 3-node layout on VMware stayed, but each node now runs in a Docker container, with the whole thing managed by Kubernetes. Each node has 4 CPUs, 8GB of memory, a 140GB SSD, and a 16TB HDD; the OS is CentOS 7. And the ES version went up to 7. Hyper-threading is, of course, off.
One more thing. Up to now I had written several scripts for the pre-indexing work: creating indices, massaging the data, and so on. With about 8 days of data their run time didn't matter, but with half a year of data the situation is different, so I went back over the scripts. First, dropping the use of arrays inside a shell script cut a run that had taken 2 minutes down to 1 minute 30 seconds. Next, scripts that had run separately were merged into one so everything is handled in a single loop. Finally, the csv-to-json conversion had been a Python script, and it was eating quite a lot of time, so I rewrote it in C++ as below. That cut the time Python had been taking to one third.
/*-----------------------------------------------
# chg_csv2json.cpp
# Convert csv format files to json.
-----------------------------------------------*/
#include <iostream>
#include <string>
#include <fstream>
#include <sstream>
#include <vector>
using namespace std;

/*-------- split --------*/
vector<string> split(const string& input, char delimiter)
{
    istringstream stream(input);
    string field;
    vector<string> result;
    while (getline(stream, field, delimiter)) {
        result.push_back(field);
    }
    return result;
}

/*-------- Main Routine --------*/
int main(int argc, char** argv)
{
    ifstream ifs;
    ofstream ofs;
    string csvfile, jsonfile, indexname;
    string indexstr, line, buf;

    // Get csv filename and indexname
    if (argc != 3) {
        cout << "Usage: " << argv[0] << " <filename> <indexname>\n";
        return 1;
    }
    csvfile = jsonfile = argv[1];
    indexname = argv[2];

    // Build the json filename by replacing the ".csv" suffix
    size_t pos = jsonfile.rfind(".csv");
    if (pos == string::npos) {
        cout << "Error: Not a .csv file: [" << csvfile << "]\n";
        return 1;
    }
    jsonfile.replace(pos, 4, ".json");

    // Bulk API action line carrying the index name; it must sit on its
    // own line in the output (NDJSON), hence the trailing '\n'
    indexstr = "{\"index\":{\"_index\":\"" + indexname + "\"}}\n";

    // Open csvfile for input
    ifs.open(csvfile.c_str());
    if (!ifs) {
        cout << "Error: Cannot open file: [" << csvfile << "]\n";
        return 1;
    }
    // Open jsonfile for output
    ofs.open(jsonfile.c_str());

    // Get the first line (header) and split it into the field names
    getline(ifs, line);
    vector<string> f_array = split(line, ',');

    // From the second line on, emit one action line plus one document
    // line of the form {"field1": "data1", "field2": "data2", ...}
    while (getline(ifs, line)) {
        vector<string> d_array = split(line, ',');
        d_array.resize(f_array.size());   // pad short rows with ""
        buf = indexstr + "{\"" + f_array[0] + "\": \"" + d_array[0] + "\"";
        for (size_t i = 1; i < f_array.size(); i++) {
            buf += ", \"" + f_array[i] + "\": \"" + d_array[i] + "\"";
        }
        buf += "}\n";
        ofs << buf;
    }

    // Close files
    ifs.close();
    ofs.close();
    return 0;
}
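As a quick way to sanity-check what the converter should emit, here is a self-contained sketch of the same transformation in awk; the index name "myindex" and the sample rows are made up. Note that the Bulk API expects each {"index":...} action line and its document on separate lines (NDJSON):

```shell
# Create a tiny sample csv (header + two rows).
cat > /tmp/sample.csv <<'EOF'
time,host,status
12:00:01,web01,200
12:00:02,web02,404
EOF
# Emit one action line plus one document line per data row.
awk -F, 'NR==1 { n = split($0, h, ","); next }
         { printf "{\"index\":{\"_index\":\"myindex\"}}\n{"
           for (i = 1; i <= n; i++)
               printf "%s\"%s\": \"%s\"", (i > 1 ? ", " : ""), h[i], $i
           print "}" }' /tmp/sample.csv
# First two output lines:
# {"index":{"_index":"myindex"}}
# {"time": "12:00:01", "host": "web01", "status": "200"}
```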
These tasks make up the pre-ingestion stage; the times in parentheses are what each step took for one day of data. Altogether they now finish in under 10 minutes.
- Convert the raw data files to csv and extract the field names and their values (4 min 10 s)
- Generate the index-creation script (45 s)
- Convert csv to json format (2 min 30 s)
- Post-process the json files for ES ingestion (2 min 10 s)
With the preparation done, I moved on to indexing the half year of data (4/1 to 9/30). The ES ingestion is run from a script whose core is

while read file; do
  echo $file
  curl -XPOST 'http://log.xxx.yyy.jp:9200/_bulk' -s -H 'Content-Type: application/x-ndjson' --data-binary "@$file" > $file.log
done < all.list

with all the json filenames to be ingested listed in all.list.
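Each response the loop saves to $file.log carries a top-level "errors" flag in the Bulk API's JSON reply, so checking results can start from a grep rather than reading every log. A sketch with abbreviated, made-up responses:

```shell
# Fake per-file Bulk API response logs for illustration.
cat > /tmp/bulk-0401.log <<'EOF'
{"took":8123,"errors":true,"items":[{"index":{"_index":"log-0401","status":400}}]}
EOF
cat > /tmp/bulk-0402.log <<'EOF'
{"took":7011,"errors":false,"items":[{"index":{"_index":"log-0402","status":201}}]}
EOF
# List only the files whose responses contain failures.
grep -l '"errors":true' /tmp/bulk-*.log
```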
The actual ingestion proceeded a little at a time while I watched how it went. A fair number of errors came up, so my daily routine became: ingest overnight, check the result logs the next morning, deal with the errors. For the 4/1 to 4/30 range, indexing one day of logs took 19 minutes on average. By then most of the errors had been stamped out and ingestion was going smoothly; in about two weeks, indexing of 4/1 to 8/27 was done. Then ingesting the 8/28 data produced this error:
root_cause":[{"type":"illegal_argument_exception","reason":"Validation Failed: 1: this action would add [2] total shards, but this cluster currently has [15300]/[15300] maximum shards open;""
It was a shard-limit error. A shard is the unit by which a cluster distributes data; here each index (one per day per file type) is one shard, so as the number of files grows, so does the shard count. The limit can be changed with curl; for the half year of data, setting it to 6900 in the end did the trick.
$ curl -XPUT 'log.xxx.yyy.jp:9200/_cluster/settings' -H 'Content-type: application/json' --data-binary $'{"transient":{"cluster.max_shards_per_node":6900}}'
{"acknowledged":true,"persistent":{},"transient":{"cluster":{"max_shards_per_node":"5400"}}}
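One caveat about the command above: a "transient" cluster setting is lost when the whole cluster restarts. To make the limit survive restarts, the same value can be put under "persistent" instead. This is a sketch using the same hostname, not a command from my own runs:

```shell
# Persistent settings are kept across full cluster restarts.
curl -XPUT 'log.xxx.yyy.jp:9200/_cluster/settings' \
     -H 'Content-Type: application/json' \
     --data-binary '{"persistent":{"cluster.max_shards_per_node":6900}}'
```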
With that, the 8/28 to 9/30 log data went in, and the half year of data was fully loaded. Given the larger scale I had expected more of a struggle, but it went surprisingly smoothly. The current shard count can be checked as follows; this output is from later, after still more data had been ingested.
$ curl -XGET 'http://log.xxx.yyy.jp:9200/_cat/health?v'
epoch      timestamp cluster status node.total node.data shards pri   relo init unassign pending_tasks max_task_wait_time active_shards_percent
1592965086 02:18:06  datalog green  3          3         25618  12809 0    0    0        0             -                  100.0%
The shards column shows the shard count for the whole cluster; with 3 nodes in use, that works out to 8539 shards per node. This command is actually ES's health check: status=green and active_shards_percent=100% mean ES is functioning without problems.
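The per-node figure is just the cluster-wide total spread (roughly evenly) over the data nodes:

```shell
# 25618 shards across 3 data nodes, integer division:
echo $((25618 / 3))   # -> 8539
```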
Incidentally, the half year of data had gone onto SSD, and that half year used up about 90% of the SSD capacity, so the data on SSD was moved to the large-capacity HDDs and ingestion from then on went directly to HDD. I then loaded one more month on top, succeeding after tweaking things like the Java heap size and shard counts. Indexing had been taking a little under 20 minutes per day until then, but HDDs are slow after all: ingestion time grew by 1.5x.
Going for yet another month of data is where trouble hit: the health check came back "yellow".
$ curl -XGET 'http://log.xxx.yyy.jp:9200/_cat/health?v'
epoch      timestamp cluster status node.total node.data shards pri   relo init unassign pending_tasks max_task_wait_time active_shards_percent
1594951163 01:59:23  datalog yellow 3          3         26311  14989 0    2    3665     0             -                  87.8%
Examining each ES node showed that every one of them was using about 90% of its memory, and moreover that there were far too many shards, each of them too small. The one-index-per-day-per-file-type scheme had backfired, and the oldest indices were eating memory too. As for shard size, Elasticsearch's recommendation is reportedly 20 to 40GB, but in our environment the largest shard was around 600MB. That apparently makes the overhead quite expensive and hurts system performance.
So the design had to be reconsidered. The policy changed to one index per month rather than per day. The script changes would not have been large, just the index creation and the index names used at ingestion time, but this trial ended here: the conversation had shifted to what to do with the data already in ES, and my real job got busy enough that I could no longer keep this work going. Stopping halfway was a pity, but since this wasn't my main job, it can't be helped; I'll count having learned a lot about ES as a win.
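For what it's worth, the script side of that daily-to-monthly change really is small: only the index name needs to be derived differently. A sketch in shell, where the file-naming scheme and the dates are made up for illustration, not taken from the actual scripts:

```shell
# Derive a monthly index name from a per-day csv filename (hypothetical
# naming scheme: <type>-<YYYYMMDD>.csv).
fname="access-20190828.csv"
day=${fname%.csv}       # strip the extension  -> access-20190828
day=${day##*-}          # keep the date part   -> 20190828
month=${day%??}         # drop the day digits  -> 201908
echo "log-access-${month}"   # -> log-access-201908
```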
2024/12/18 (Wed) Blog 21