I'm Nakamura (po3rin), a software engineer on the AI & Machine Learning Team in the M3 Engineering Group. My favorite language is Go, and information retrieval is my favorite topic. This post covers diversity-aware reranking methods and my experience actually trying an implementation in Python.
The Problem
AskDoctors, an M3 service, lets users ask doctors questions directly. Besides asking their own questions, users can also search for existing questions close to their symptoms and concerns.
One problem with AskDoctors search is that rows of near-identical titles appear. User-submitted question titles are often vague, like "About ~~", so the concrete content of the question cannot be told from the title alone.
For example, if you search AskDoctors for 「コロナ ワクチン」 ("corona vaccine"), there are so many questions that similar-looking titles fill the top results, and you cannot tell from the result list which question you actually want to read. Below are the AskDoctors search results for 「コロナ ワクチン」 on one particular day.
The coronavirus vaccine
Coronavirus vaccine
The coronavirus vaccine
Coronavirus vaccine
Corona vaccine
The corona vaccine
Corona vaccine
Corona vaccine
About the corona vaccine
Corona vaccine
About the coronavirus vaccine
About the coronavirus vaccine.
About the coronavirus vaccine
About the coronavirus vaccine
About the coronavirus vaccine
About the coronavirus vaccine
About the coronavirus vaccine
About the coronavirus vaccine
About the coronavirus vaccine
Coronavirus vaccine and medication
The reason for this lies in the characteristics of the BM25 algorithm. For simplicity, consider the case where only titles are searched: in the BM25 computation, short titles in which the search terms appear get high scores. So for a user-generated-content service with many documents and many similar titles, the top results end up with near-identical titles, as above. In our case in particular, the title field is a basis for the score, so results skew toward similar titles even more.
Especially on a specialized Q&A site like AskDoctors, questions often cannot be summed up in a single phrase, so titles tend to be abstract and end up resembling one another.
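To make BM25's preference for short, term-dense titles concrete, here is a toy, hand-rolled BM25 (Lucene-style idf) on a made-up three-title corpus; this is only a sketch, not the AskDoctors setup:

```python
import math

def bm25_score(query_terms, doc, corpus, k1=1.2, b=0.75):
    """Toy BM25: score a document (a list of terms) against query terms."""
    n = len(corpus)
    avgdl = sum(len(d) for d in corpus) / n
    score = 0.0
    for t in query_terms:
        df = sum(1 for d in corpus if t in d)            # document frequency of t
        idf = math.log(1 + (n - df + 0.5) / (df + 0.5))  # Lucene-style BM25 idf
        tf = doc.count(t)
        # tf saturation plus length normalization: short docs get a smaller denominator
        score += idf * tf * (k1 + 1) / (tf + k1 * (1 - b + b * len(doc) / avgdl))
    return score

corpus = [
    ["corona", "vaccine"],                                                  # short title
    ["corona", "vaccine", "second", "dose", "side", "effect", "question"],  # longer title
    ["flu", "shot"],
]
query = ["corona", "vaccine"]
short = bm25_score(query, corpus[0], corpus)
longer = bm25_score(query, corpus[1], corpus)
print(short > longer)  # True: the short title wins even though both match the same terms
```

Both documents match both query terms, but length normalization gives the two-word title the higher score, which is exactly why terse, generic titles crowd the top of the results.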
As an aside, if you want per-field score weighting, the BM25F algorithm, which seems to have recently landed in Elasticsearch as well, looks promising. BM25F is a natural extension of BM25 that takes document structure such as title and body into account: it adjusts the term frequency (tf) by multiplying it by a per-field weight.
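As a rough sketch of that idea (this simplifies real BM25F, which also applies per-field length normalization; the function and weights below are illustrative, not Elasticsearch's API):

```python
def bm25f_weighted_tf(term, doc_fields, field_weights):
    """Field-weighted pseudo term frequency:
    tf~(t, d) = sum over fields f of w_f * tf(t, f)."""
    return sum(
        field_weights[field] * terms.count(term)
        for field, terms in doc_fields.items()
    )

doc = {
    "title": ["corona", "vaccine"],
    "body": ["question", "about", "the", "corona", "vaccine"],
}
weights = {"title": 2.0, "body": 1.0}  # made-up field weights
print(bm25f_weighted_tf("corona", doc, weights))  # 2.0 * 1 + 1.0 * 1 = 3.0
```

The combined pseudo-frequency is then fed into the usual BM25 saturation formula, so a title match can be worth more than a body match without double-counting the document.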
For explanations of BM25 and BM25F, the book below is a good reference: it covers the derivation of BM25 and BM25F from the binary independence model in considerable detail.
This time, with the goal of raising the CTR of search results, I tried techniques for increasing the diversity of the result titles.
Reranking the Search Results
The idea is to rerank the result list produced by the search engine so as to increase diversity while maintaining relevance to the query.
The simplest approach is greedy reranking. The algorithm below is a paraphrase of the one given in paper [1]:

$$\begin{aligned}
&S \leftarrow \emptyset \\
&\textbf{while } |S| < n: \\
&\qquad d^{*} \leftarrow \arg\max_{d \in R \setminus S} \mathrm{obj}(d) \\
&\qquad S \leftarrow S \cup \{d^{*}\} \\
&\textbf{return } S
\end{aligned}$$

In short, it repeatedly takes the item from the search result list that is least similar to the reranked result list and appends it to that list. $R$ is the candidate set returned by the search engine, and $S$ is the reranked result list that is finally returned. This algorithm is used not only for search reranking but also in the context of recommendation diversity [2][3].
There are various implementations of the objective function. An early work, "The use of MMR, diversity-based reranking for reordering documents and producing summaries" [4], proposed a method called MMR (Maximal Marginal Relevance), which takes the original retrieval relevance and subtracts the maximum similarity to the items already added to the reranked result list:

$$\mathrm{MMR} = \arg\max_{d_i \in R \setminus S}\Big[\, \lambda\, \mathrm{sim}_1(d_i, q) - (1-\lambda)\max_{d_j \in S} \mathrm{sim}_2(d_i, d_j) \Big]$$

Here $\mathrm{sim}_1(d_i, q)$ is the item's original score on the search engine (in our case the BM25 value), and $\mathrm{sim}_2(d_i, d_j)$ is the similarity between items $d_i$ and $d_j$. $\lambda$ is a tuning parameter that controls how much weight relevance and diversity each get.

Paper [1], shown earlier, and others also introduce an objective that uses the average distance to the items already contained in the reranked result list:

$$\mathrm{obj}(d_i) = \lambda\, \mathrm{sim}_1(d_i, q) + (1-\lambda)\, \frac{1}{|S|}\sum_{d_j \in S} \mathrm{dist}(d_i, d_j)$$

$\mathrm{dist}(d_i, d_j)$ is the distance between items $d_i$ and $d_j$. Note that the two formulas use different similarity functions: $\mathrm{sim}$ gets larger the more similar two items are, while $\mathrm{dist}$ gets smaller.
The second objective function can take the whole reranked result list into account, but it tends to underweight the presence of a single extremely similar item.
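A tiny numeric illustration of that weakness (the similarity values are made up): a candidate that is nearly identical to one already-selected item but unrelated to the other two still gets a decent average-distance score, while an MMR-style max-similarity penalty catches it:

```python
# similarity of one candidate to each of the three items already in S (made-up values)
sims_to_selected = [0.95, 0.10, 0.10]

# average-distance diversity term: (1/|S|) * sum(1 - sim)
avg_dist = sum(1 - s for s in sims_to_selected) / len(sims_to_selected)

# MMR-style penalty looks only at the single most similar item
max_sim = max(sims_to_selected)

print(round(avg_dist, 3))  # 0.617: the near-duplicate barely dents the average
print(max_sim)             # 0.95: MMR penalizes this candidate hard
```

So with the average-distance objective a near-duplicate can still slip into the results as long as it is far from most other selected items.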
If you want to learn about other objective functions, I recommend paper [1], which surveys diversity methods.
Reranking Implementation
This time I adopt the objective function introduced above that uses the average distance to already-selected items, and implement it in Python.
Architecture
I try a two-stage setup in which the Elasticsearch results are reranked MMR-style before being returned. The reranking of the result list retrieved by Elasticsearch is implemented in Python: we fetch 100 hits from Elasticsearch and extract a diversified 20 of them to return.
Preparation
The implementation uses the modules below, with Python 3.9.0.
```python
from abc import ABC, abstractmethod
import pandas as pd
import requests
from requests.auth import HTTPBasicAuth
from sudachipy import tokenizer
from sudachipy import dictionary
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity
```
We request documents from Elasticsearch in Python. The ranking algorithm used there is BM25.
```python
data = {"from": 0, "size": 100, "query": {"match": {"title": "コロナ ワクチン"}}}
res = requests.get(
    "http://localhost:9200/questions/_search",  # the index name "questions" is a placeholder
    timeout=10,
    json=data,
    auth=HTTPBasicAuth("user", "pass"))
```
From the response we build max_score, the map of titles, and the list of ids for later use.
```python
hits = res.json()['hits']
max_score = hits['max_score']  # top BM25 score, used later for normalization
docs = {
    d['_id']: {
        'title': d['_source']['title'],
        'body': d['_source']['body'],
        'score': d['_score']
    }
    for d in hits['hits']
}
ids = list(docs.keys())
```
Now we are ready to rerank the results returned from Elasticsearch.
TF-IDF vector
Next we prepare classes that compute document similarity. First, let's try similarity based on TF-IDF vectors. To obtain morphemes for TF-IDF, we use the morphological analyzer Sudachi. Since we will try a different similarity measure later, we implement an abstract base class.
```python
# Sudachi tokenizer setup (SplitMode.B: medium-grained segmentation)
mode = tokenizer.Tokenizer.SplitMode.B
tokenizer_obj = dictionary.Dictionary().create()

def tokenize(text: str):
    return [m.surface() for m in tokenizer_obj.tokenize(text, mode)]

class Similarity(ABC):
    @abstractmethod
    def calcurate(self, doc1: str, doc2: str):
        pass

class TFIDFSimilarity(Similarity):
    def __init__(self, corpus: list):
        self.vectorizer = TfidfVectorizer(smooth_idf=True)
        noun_corpus = [' '.join(tokenize(c)) for c in corpus]
        self.vectorizer.fit_transform(noun_corpus)

    def calcurate(self, doc1: str, doc2: str):
        # tokenize both docs, vectorize, and take the off-diagonal cosine similarity
        targets = [' '.join(tokenize(d)) for d in [doc1, doc2]]
        vecs = self.vectorizer.transform(targets)
        return cosine_similarity(vecs, vecs)[0][1]
```
Let's give it a try.
```python
corpus = get_random_docs()  # random sample of 10,000 documents for fitting the TF-IDF vectorizer
sim = TFIDFSimilarity(corpus=corpus)
print(sim.calcurate('病気になる', '病気になる'))          # 1.0
print(sim.calcurate('病気になる', '病気を治す'))          # 0.79786
print(sim.calcurate('病気になる', 'がん検診を受けました'))  # 0.0
```
Pairs that really are similar do get high values.
Next, we implement the Reranker class, which owns the reranking responsibility.
```python
class Reranker():
    def __init__(self, docs, max_score, similarity: Similarity):
        self.docs = docs
        self.max_score = max_score
        self.similarity = similarity

    def es_score(self, id) -> float:
        # BM25 score normalized by the top score so it lies in [0, 1]
        doc = self.docs[id]
        return doc['score'] / self.max_score

    def dist(self, i: str, j: str) -> float:
        text1 = self.docs[i]['title']
        text2 = self.docs[j]['title']
        return 1 - self.similarity.calcurate(text1, text2)

    def sum_dist(self, item: str, results: list) -> float:
        return sum(self.dist(item, result) for result in results)

    def select_item_id(self, candidate_items: list, results: list, alpha=0.5) -> str:
        max_score = -1
        max_score_item_id = -1
        for i in candidate_items:
            if len(results) == 0:
                # nothing selected yet: take the top-ranked candidate as-is
                return i
            score = alpha * self.es_score(i) + (1 - alpha) * self.sum_dist(
                i, results) / len(results)
            if score > max_score:
                max_score = score
                max_score_item_id = i
        return max_score_item_id

    def greedy_reranking(self, candidate_items: list, n: int, alpha=0.5):
        candidate_items = list(candidate_items)  # copy so the caller's id list is not mutated
        results = []
        while len(results) < n:
            doc_id = self.select_item_id(candidate_items, results, alpha)
            results.append(doc_id)
            candidate_items.remove(doc_id)
        return results
```
We initialize the Reranker class with the Similarity implementation from earlier.
```python
corpus = get_random_docs()  # random sample of 10,000 documents for fitting the TF-IDF vectorizer
sim = TFIDFSimilarity(corpus=corpus)
reranker = Reranker(docs, max_score, sim)
```
Everything is ready; let's run it.
```python
# before
print(pd.DataFrame([docs[i]['title'] for i in ids[:20]]))

results = reranker.greedy_reranking(ids, n=20)

# after
print(pd.DataFrame([docs[i]['title'] for i in results]))
```
The "before" list is the one already shown in the problem section. The reranked results are below.
Coronavirus vaccine
Coronavirus and influenza vaccines
Corona vaccine: until when?
The corona vaccine...
Breastfeeding and the coronavirus vaccine
Merits of the coronavirus vaccine
Could it be the corona vaccine?
The coronavirus vaccine
Coronavirus vaccine and influenza vaccine
… and the corona vaccine
About the corona vaccine
Coronavirus vaccine
Corona vaccine and rubella vaccine
Coronavirus vaccination
Baby vaccinations and the coronavirus
Corona, corona vaccine, slight fever
Coronavirus vaccine
Coronavirus vaccine
Coronavirus vaccination
Influenza vaccine and corona
You can see the titles are more diverse than before reranking. Showing these results on the search page should make it easier for users to pick out the question they want to click.
The results look good, but this run of greedy_reranking took 25.3 s in a VSCode Jupyter environment. That is a problem, because performance matters for reranking at search time. Caching the vectors once computed and reusing them should speed this up considerably. Alternatively, we could adopt a much lighter-weight similarity computation.
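As a sketch of the caching idea (CachedTFIDFSimilarity and the sample titles are hypothetical, not the code used above): since only the fixed candidate list is ever compared with itself, you can vectorize it once and precompute the full pairwise cosine matrix, turning each similarity call into an array lookup:

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

class CachedTFIDFSimilarity:
    """Precompute all pairwise cosine similarities for a fixed candidate list."""

    def __init__(self, vectorizer: TfidfVectorizer, titles: list):
        vecs = vectorizer.transform(titles)        # vectorize the candidates once
        self.sim_matrix = cosine_similarity(vecs)  # NxN matrix, computed once up front
        self.index = {t: i for i, t in enumerate(titles)}

    def calcurate(self, doc1: str, doc2: str):
        # O(1) lookup instead of tokenizing and vectorizing on every call
        return self.sim_matrix[self.index[doc1]][self.index[doc2]]

titles = ["corona vaccine", "corona vaccine question", "flu shot"]  # stand-ins for the 100 candidates
vectorizer = TfidfVectorizer().fit(titles)
sim = CachedTFIDFSimilarity(vectorizer, titles)
print(sim.calcurate("corona vaccine", "flu shot"))
```

In practice you would key the cache by document id rather than title text, since user-submitted titles can repeat.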
Jaccard Coefficient
To compute similarity more cheaply, we consider the Jaccard coefficient. There are various features we could use as elements, such as article tags or words, but this time I adopt character n-grams, because they let us handle titles directly without morphological analysis. The n-gram Jaccard coefficient is computed as:

$$J(d_1, d_2) = \frac{|G(d_1) \cap G(d_2)|}{|G(d_1) \cup G(d_2)|}$$

where $G(d)$ denotes the set of n-grams obtained from document $d$.
```python
class NgramJaccardSimilarity(Similarity):
    def n_gram(self, txt, n):
        # all character n-grams of txt
        return [txt[idx:idx + n] for idx in range(len(txt) - n + 1)]

    def jaccard_similarity(self, a, b):
        # |A ∩ B| / |A ∪ B| over the two n-gram sets
        a = set(a)
        b = set(b)
        return 1.0 * len(a & b) / len(a | b)

    def calcurate(self, doc1: str, doc2: str):
        # character bigrams: no morphological analysis needed
        return self.jaccard_similarity(self.n_gram(doc1, 2), self.n_gram(doc2, 2))
```
Running the reranking with this implementation gives the following results.
Coronavirus vaccine
Corona vaccine and influenza vaccine
It's about the corona vaccine
The corona vaccine and the mumps vaccine
About the coronavirus vaccine
About the corona vaccine
Coronavirus vaccine and …, etc.
Vaccines and the corona vaccine
About the coronavirus vaccine and the influenza vaccine
Corona vaccine and influenza vaccine
The coronavirus vaccine
What is the "truth" about the corona vaccine?
The corona vaccine
About the coronavirus vaccine
Corona, eczema, and vaccines
Coronavirus vaccine
Corona vaccine and flu vaccine
The coronavirus vaccine
Corona vaccine and influenza vaccine
About the coronavirus vaccine
Not bad. And the execution time is 200 ms. Reducing the 100 candidates fetched from the search engine would make it a bit faster still. At this speed, embedding the reranking implementation in an API is just about feasible.
Issues with This Implementation
The problem with this implementation is that it cannot paginate. With a setup like this one, where we fetch 100 candidates from the search engine and narrow them down to 20, the search engine only emits candidates, so we cannot rely on the search engine's own pagination.
If you do want pagination, you need to run a search that excludes the documents displayed on previous pages and fetch 100 candidates again.
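With Elasticsearch, for example, one way to do this exclusion is a bool query whose must_not clause filters out the ids already shown (the ids below are placeholder values):

```python
# ids of documents already displayed on previous pages (placeholders)
shown_ids = ["id1", "id2"]

data = {
    "from": 0,
    "size": 100,
    "query": {
        "bool": {
            "must": [{"match": {"title": "コロナ ワクチン"}}],
            "must_not": [{"ids": {"values": shown_ids}}],  # exclude already-shown docs
        }
    },
}
```

The client would then need to carry the set of already-shown ids across page requests, for example in the page token.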
Summary
This post introduced algorithms for improving search result diversity. Before adopting one, you should check requirements such as performance and pagination. At our company, diversity improvement is currently low-ROI, so we have not put this into production yet, but when the time comes to improve diversity in a search engine or recommender system, greedy reranking looks like a handy weapon to have.
We're hiring !!!
At M3, we are hiring engineers who want to advance healthcare through developing and improving our search and recommendation platforms! Internally, lively discussions about search and recommendation take place every day.
If you'd like to have a casual chat with us, reach out here! jobs.m3.com