Python Multi-threaded Command Executor

Ever need to run several commands at once, say a suite of nmap scans? The answer is here, my friends. Here's a simple multi-threaded Python script that executes the commands listed in a file, using a specified number of threads:

$> cat commands.txt
nmap -sS -sV -T4 192.168.0.1/24 -p- -oX tcp_full.xml
nmap -sU -T4 192.168.0.1/24 -oX udp.xml

The script is then run with the desired number of threads (two in this case):

$> ./multithreader.py commands.txt 2 
Starting Thread: 0 
nmap -sS -sV -T4 192.168.0.1/24 -p- -oX tcp_full.xml by: Thread 0 
Starting Thread: 1 
nmap -sU -T4 192.168.0.1/24 -oX udp.xml by: Thread 1 

Starting Nmap 7.01 ( https://nmap.org ) at 2016-10-31 00:00 GMT
Starting Nmap 7.01 ( https://nmap.org ) at 2016-10-31 00:00 GMT

This should help with some workflow automation and, at worst, provide some skeleton code for working with threads in Python:

#!/usr/bin/env python3
import threading
import sys
import queue
import os

def startup():
    # check for the expected command-line arguments
    if len(sys.argv) != 3:
        print("Usage: multithreader <file> <thread_count>")
        sys.exit(1)

def worker(name, q):
    # each worker pulls commands off the shared queue until the
    # main thread exits (workers are daemons, so they die with it)
    print("Starting Thread:", name)
    while True:
        # block until a command is available
        item = q.get()
        print(item, "by: Thread", name)
        # run the command in a subshell
        os.system(item)
        q.task_done()

def main():
    startup()
    q = queue.Queue()
    # populate the queue from the command file
    with open(sys.argv[1]) as f:
        for line in f:
            q.put(line.strip())
    # create and start the daemon workers
    for i in range(int(sys.argv[2])):
        t = threading.Thread(target=worker, args=(i, q), daemon=True)
        t.start()
    # block until every queued command has been processed
    q.join()

if __name__ == "__main__":
    main()
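If you're on a recent Python, the same pattern can be had with less plumbing from the standard library's concurrent.futures module. Here's a minimal sketch of that approach (the function names run_command and run_all are my own, not part of the original script), using subprocess.run so each command's exit code comes back instead of being thrown away by os.system:

```python
import subprocess
from concurrent.futures import ThreadPoolExecutor

def run_command(cmd):
    # shell=True mirrors the os.system behaviour of the original script;
    # the return value is the command's exit code
    return subprocess.run(cmd, shell=True).returncode

def run_all(commands, thread_count):
    # the pool caps concurrency at thread_count; map() yields the
    # exit codes in the same order as the input commands
    with ThreadPoolExecutor(max_workers=thread_count) as pool:
        return list(pool.map(run_command, commands))

if __name__ == "__main__":
    print(run_all(["echo one", "echo two"], 2))  # → [0, 0]
```

The executor handles the queue, the worker loop, and the join for you, and collecting exit codes makes it easy to spot a scan that failed mid-run.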


Happy threading!
