
Q: Can my code be parallelized with Python multiprocessing?


    import re
    from multiprocessing import Manager, Pool

    # input_file, target, source, and LogCatcher are defined elsewhere in my program

    def concatMessage(LogCatcher, content):
        for key in LogCatcher.dic_map:
            regex = re.compile(key)
            for j in range(len(content)):
                for m in re.finditer(regex, content[j]):
                    content[j] += LogCatcher.index + LogCatcher.dic_map[key]

    def transferConcat(args):
        # unpack the (LogCatcher, content) tuple for pool.map
        return concatMessage(*args)

    if __name__ == "__main__":
        manager = Manager()
        obj_grab = []
        content = input_file(target).split("\n")
        content = manager.list(content)  # shared proxy list
        for files in source:
            obj_grab.append((LogCatcher(files), content))
        pool = Pool()
        pool.map(transferConcat, obj_grab)
        pool.close()
        pool.join()

When I use pool.map(), this code takes 82 seconds to finish.

When I run the same work with a normal for loop instead, it takes 11 seconds.

Something must be wrong with my use of pool.map(), since both methods produce the correct output.

Can my code be parallelized?

obj_grab is a list containing LogCatcher instances for different input files. content is the list I want to concatenate onto, and I use Manager() so that the worker processes all operate on the same list.
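For context, here is a minimal, self-contained sketch of the sharing pattern described above: a Manager().list() proxy passed to Pool workers, so that writes made in one process are visible to all of them. The tag_line worker is hypothetical and stands in for concatMessage; note that every indexed read and write on the proxy goes through the manager process.

```python
from multiprocessing import Manager, Pool

def tag_line(args):
    # Hypothetical worker standing in for concatMessage:
    # append a marker to one slot of the shared proxy list.
    shared, i = args
    shared[i] = shared[i] + "!"  # each index access talks to the manager process

if __name__ == "__main__":
    manager = Manager()
    shared = manager.list(["a", "b", "c"])  # proxy, shared across workers
    with Pool(2) as pool:
        pool.map(tag_line, [(shared, i) for i in range(len(shared))])
    print(list(shared))  # -> ['a!', 'b!', 'c!']
```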

