[Config] Add configuration file #359

Merged
yaoyaoding merged 13 commits into hidet-org:main from Aalanli:config-file
Sep 27, 2023

Conversation

@Aalanli (Contributor) commented Sep 26, 2023

This PR auto-generates the following config file at ~/.config/hidet/hidet.toml, which the user can then edit to customize the config.

Setting options via hidet.option.... in a script does not change this file.

# The (warmup, number, repeat) parameters for benchmarking. The benchmarking will run warmup + number * repeat times.
bench_config = [3, 10, 3]

# The search space level.
#   choices: [0, 1, 2]
search_space = 0

# Whether to enable operator cache on disk.
#   choices: [True, False]
cache_operator = true

# The directory to store the cache.
cache_dir = "/home/allan/Programs/hidet_repo/hidet/.hidet_cache"

# Whether to build operators in parallel.
#   choices: [True, False]
parallel_build = false

# The pair (max_parallel_jobs, mem_gb_per_job) that describe the maximum number of parallel jobs and memory reserved for each job
parallel_tune = [-1, 1.5]

# Whether to save the IR when lower an IRModule to the operator cache.
#   choices: [True, False]
save_lower_ir = false

# Whether to cache the generated kernels during tuning.
#   choices: [True, False]
debug_cache_tuning = false

# Whether to show the variable id in the IR.
#   choices: [True, False]
debug_show_var_id = false

# Whether to check shapes of compiled graph and tasks during execution.
#   choices: [True, False]
runtime_check = true

# Whether to show the verbose flow graph.
#   choices: [True, False]
debug_show_verbose_flow_graph = false

# The address of the compile server. Can be an IP address or a domain name.
"compile_server.addr" = "localhost"

# The port of the compile server.
"compile_server.port" = 8329

# Whether to enable the compile server.
#   choices: [True, False]
"compile_server.enabled" = false

# The user name to access the compile server.
"compile_server.username" = "admin"

# The password to access the compile server.
"compile_server.password" = "admin_password"

# The URL of the repository that the remote server will use.
"compile_server.repo_url" = "https://github.com/hidet-org/hidet"

# The version (e.g., branch, commit, or tag) that the remote server will use.
"compile_server.repo_version" = "main"

# The CUDA architecture to compile the kernels for (e.g., "sm_70"). "auto" for auto-detect.
"cuda.arch" = "auto"

@yaoyaoding (Member) left a comment


Thanks Allan, it looks good to me overall.

Left some suggestions on the code organization.

Comment on lines +55 to +68
doc.add(tomlkit.comment(description))
if choices is not None:
    doc.add(tomlkit.comment(f'  choices: {choices}'))
if isinstance(default_value, (bool, int, float, str)):
    doc.add(name, default_value)
elif isinstance(default_value, Tuple):
    # represent tuples as TOML arrays; do not allow Python lists as default values, to avoid ambiguity
    val = list(default_value)
    arr = tomlkit.array()
    arr.extend(val)
    doc.add(name, arr)
else:
    raise ValueError(f'Invalid type of default value for option {name}: {type(default_value)}')
doc.add(tomlkit.nl())
@yaoyaoding (Member) commented:

Consider moving this logic into a standalone function like _create_toml_doc().

@Aalanli (Contributor, Author) commented Sep 26, 2023

Now it looks like this with the table feature:

# The (warmup, number, repeat) parameters for benchmarking. The benchmarking will run warmup + number * repeat times.
bench_config = [3, 10, 3]

# The search space level.
#   choices: [0, 1, 2]
search_space = 0

# Whether to enable operator cache on disk.
#   choices: [True, False]
cache_operator = true

# The directory to store the cache.
cache_dir = "/home/allan/Programs/hidet_repo/hidet/.hidet_cache"

# Whether to build operators in parallel.
#   choices: [True, False]
parallel_build = true

# The pair (max_parallel_jobs, mem_gb_per_job) that describe the maximum number of parallel jobs and memory reserved for each job
parallel_tune = [-1, 1.5]

# Whether to save the IR when lower an IRModule to the operator cache.
#   choices: [True, False]
save_lower_ir = false

# Whether to cache the generated kernels during tuning.
#   choices: [True, False]
debug_cache_tuning = false

# Whether to show the variable id in the IR.
#   choices: [True, False]
debug_show_var_id = false

# Whether to check shapes of compiled graph and tasks during execution.
#   choices: [True, False]
runtime_check = true

# Whether to show the verbose flow graph.
#   choices: [True, False]
debug_show_verbose_flow_graph = false

[compile_server]
# The address of the compile server. Can be an IP address or a domain name.
addr = "localhost"

# The port of the compile server.
port = 8329

# Whether to enable the compile server.
#   choices: [True, False]
enabled = false

# The user name to access the compile server.
username = "admin"

# The password to access the compile server.
password = "admin_password"

# The URL of the repository that the remote server will use.
repo_url = "https://github.com/hidet-org/hidet"

# The version (e.g., branch, commit, or tag) that the remote server will use.
repo_version = "main"

[cuda]
# The CUDA architecture to compile the kernels for (e.g., "sm_70"). "auto" for auto-detect.
arch = "auto"

@yaoyaoding (Member) commented:

Thanks @Aalanli!

@yaoyaoding yaoyaoding merged commit 7c12631 into hidet-org:main Sep 27, 2023
@Aalanli Aalanli deleted the config-file branch September 27, 2023 18:09
vadiklyutiy pushed a commit that referenced this pull request Jul 22, 2024
vadiklyutiy pushed a commit that referenced this pull request Jul 23, 2024
vadiklyutiy pushed a commit that referenced this pull request Dec 26, 2024